# Demo MCP Basic

Demo of an MCP server with HTTP SSE and a client.
This project demonstrates a fundamental client-server interaction using the Model Context Protocol (MCP). MCP allows AI models, like those accessed via the AI SDK (e.g., Google Gemini/Vertex AI), to securely discover and utilize external tools or resources provided by a separate server process.
In this example:
- The Server (`src/server/`) acts as an MCP provider, offering simple calculation tools (addition, subtraction, etc.).
- The Client (`src/client/`) uses the AI SDK to interact with a Google AI model and connects to the MCP server to make the server's tools available to the AI during generation.
This setup illustrates how you can extend the capabilities of AI models by giving them access to custom functionalities hosted on an MCP server.
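For illustration only, here is a minimal sketch of how such a server could expose a calculation tool over SSE using the TypeScript MCP SDK and Express. The tool name, port, and endpoint paths are assumptions; the project's actual `src/server/` code may differ.

```typescript
import express from "express";
import { z } from "zod";
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { SSEServerTransport } from "@modelcontextprotocol/sdk/server/sse.js";

const server = new McpServer({ name: "calculator", version: "1.0.0" });

// Register a simple addition tool that the AI model can call by name.
server.tool("add", { a: z.number(), b: z.number() }, async ({ a, b }) => ({
  content: [{ type: "text", text: String(a + b) }],
}));

const app = express();
let transport: SSEServerTransport | undefined;

// The client opens an SSE stream here (kept to a single connection for brevity)...
app.get("/sse", async (_req, res) => {
  transport = new SSEServerTransport("/messages", res);
  await server.connect(transport);
});

// ...and posts its JSON-RPC messages back on this endpoint.
app.post("/messages", async (req, res) => {
  await transport?.handlePostMessage(req, res);
});

app.listen(3001);
```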
## Prerequisites

- Node.js (version >=23.0.0, as specified in `package.json`)
- npm (comes with Node.js)
- A Google AI (Gemini) API key. You can obtain one from Google AI Studio. (Required for using Gemini models.)
## Setup

1. Clone the repository:

   ```bash
   git clone git@github.com:bertrandgressier/demo-ts-mcp-client-server.git
   cd demo-mcp-basic
   ```

2. Install dependencies:

   ```bash
   npm install
   ```

3. Create the environment file. Copy the example environment file `.env.example` to a new file named `.env`:

   ```bash
   cp .env.example .env
   ```

   Then edit the `.env` file to add your actual API keys and configuration. The required and optional variables are:

   ```
   # Required for Google Studio models (Gemini)
   GOOGLE_API_KEY=YOUR_GOOGLE_API_KEY

   # Required for Google Vertex AI models
   VERTEX_PROJECT_ID=YOUR_VERTEX_PROJECT_ID

   # Optional: defaults to 'us-central1' if not set
   # VERTEX_LOCATION=your-vertex-location

   # Optional: path to the Vertex AI service account key file.
   # Defaults to 'vertex-key.json' in the project root if not set.
   # VERTEX_KEY_FILE=path/to/your/vertex-key.json
   ```

   Replace `YOUR_GOOGLE_API_KEY` and `YOUR_VERTEX_PROJECT_ID` with your actual credentials. If you use a Vertex AI key file, ensure it is placed correctly (e.g., in the project root as `vertex-key.json`) or provide the correct path in `VERTEX_KEY_FILE`.

   - Important: The `.gitignore` file is configured to prevent `.env` and `vertex-key.json` from being committed to Git.
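As a rough sketch of how these variables might be consumed, the snippet below chooses between the Gemini API key and Vertex AI. The helper name, model id, and default values are assumptions; the project's real logic lives in `src/model.ts` and may differ.

```typescript
import "dotenv/config";
import { createGoogleGenerativeAI } from "@ai-sdk/google";
import { createVertex } from "@ai-sdk/google-vertex";

// Hypothetical helper: pick a provider based on which variables are set.
export function getModel() {
  if (process.env.GOOGLE_API_KEY) {
    // Google AI Studio (Gemini) via API key.
    const google = createGoogleGenerativeAI({ apiKey: process.env.GOOGLE_API_KEY });
    return google("gemini-1.5-flash");
  }

  // Otherwise fall back to Vertex AI with a service account key file.
  const vertex = createVertex({
    project: process.env.VERTEX_PROJECT_ID,
    location: process.env.VERTEX_LOCATION ?? "us-central1",
    googleAuthOptions: { keyFile: process.env.VERTEX_KEY_FILE ?? "vertex-key.json" },
  });
  return vertex("gemini-1.5-flash");
}
```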
## Available Scripts

- Build TypeScript:

  ```bash
  npm run build
  ```

  Compiles TypeScript code from `src` to JavaScript in `dist`.

- Start Production Server:

  ```bash
  npm run start
  # or specifically
  npm run start:server
  ```

  Builds the project (if not already built) and runs the compiled server from `dist/server/server.js`.

- Start Production Client:

  ```bash
  npm run start:client
  ```

  Runs the compiled client from `dist/client/client.js`.

- Run Server in Development Mode:

  ```bash
  npm run dev:server
  ```

  Runs the server directly using `ts-node` (or similar via `--experimental-transform-types`) without needing a separate build step.

- Run Server in Development Mode with Watch:

  ```bash
  npm run dev:server:watch
  ```

  Runs the server using `nodemon`, automatically restarting it when changes are detected in the `src/server` directory.

- Run Client in Development Mode:

  ```bash
  npm run dev:client
  ```

  Runs the client directly using `ts-node` (or similar).
## Example Client (`src/client/client.ts`)

The primary example client script demonstrates the following:

- Connects to the MCP Server: Establishes a connection to the locally running MCP server (expected at `http://localhost:3001/sse`).
- Fetches Tools: Retrieves the list of tools made available by the connected server.
- Configures AI Model: Uses the AI SDK (`generateText`) configured with a Google AI model (Gemini via API key or Vertex AI, depending on the environment variables set in `.env` and potentially `src/models/google-ai-provider.ts`).
- Executes Prompt: Sends a prompt (`6 + 12`) along with a system message instructing the AI to use the fetched tools to solve the calculation.
- Outputs Result: Prints the final text response generated by the AI model after potentially using the server-provided tools.
This client serves as a basic illustration of how an application can interact with an MCP server to leverage its tools within an AI generation flow.
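For orientation, a condensed sketch of that flow using the AI SDK's experimental MCP client is shown below. The model id, `maxSteps` value, and system message are assumptions; `src/client/client.ts` is the authoritative version.

```typescript
import { experimental_createMCPClient, generateText } from "ai";
import { google } from "@ai-sdk/google";

async function main() {
  // 1. Connect to the locally running MCP server over SSE.
  const mcpClient = await experimental_createMCPClient({
    transport: { type: "sse", url: "http://localhost:3001/sse" },
  });

  try {
    // 2. Fetch the tool definitions the server exposes (add, subtract, ...).
    const tools = await mcpClient.tools();

    // 3. Let the model answer the prompt, calling the server's tools as needed.
    const { text } = await generateText({
      model: google("gemini-1.5-flash"),
      tools,
      maxSteps: 5, // allow a tool call followed by a final answer
      system: "Use the available calculator tools to solve the user's request.",
      prompt: "6 + 12",
    });

    // 4. Print the final response.
    console.log(text);
  } finally {
    await mcpClient.close();
  }
}

main().catch(console.error);
```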
Project Structure
src/
: Contains the TypeScript source code.client/
: Code for the MCP client application.server/
: Code for the MCP server application.model.ts
: Handles AI model initialization and environment variable loading.
dist/
: Contains the compiled JavaScript code (after runningnpm run build
)..env
: Stores environment variables (API keys, etc.) - Do not commit this file.package.json
: Project metadata and dependencies.tsconfig.json
: TypeScript compiler configuration.