
# Chris MCP

A context provider for how I program. Basically the AI version of me for AI utilities like Claude and Cline, enabling them to search and access my coding guidelines, rules, and context for JavaScript, TypeScript, React, and various development frameworks.
## 1. Install

The following sections describe several ways to install this MCP. Make sure you are using Node version 22.
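You can confirm the active Node version before installing:

```bash
# Print the active Node version; it should report v22.x for this project.
node --version
```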
### 1.1. Option 1: Using NPX

Run the following commands in the same folder as your other MCP servers.

```bash
$ mkdir chris-mcp
$ cd chris-mcp
$ npx -y chris-mcp fetch --output ./data
$ npx -y chris-mcp verify --output ./data
$ pwd
```

Copy the response from `pwd` and edit your MCP server configuration by following one of the options below.
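The configurations below use `[pwd]` as a placeholder for the path you just copied. Rather than pasting it by hand, you can substitute it into a config template with `sed`; a minimal sketch, where `config.template.json` is a hypothetical template file:

```bash
# A hypothetical template where [pwd] stands in for the absolute install path.
cat > config.template.json <<'EOF'
{
  "command": "npx",
  "args": ["-y", "chris-mcp", "serve", "--input", "[pwd]/data"]
}
EOF

# Fill in every [pwd] placeholder with the current directory.
sed "s|\[pwd\]|$(pwd)|g" config.template.json > config.json
cat config.json
```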
#### 1.1.1. Using NPX With Claude Desktop

Add the following configuration to your `claude_desktop_config.json`, where `[pwd]` is the response from the `pwd` command earlier.

```json
{
  "name": "github.com/cblanquera/mcp",
  "command": "npx",
  "args": [
    "-y",
    "chris-mcp",
    "serve",
    "--input",
    "[pwd]/data"
  ]
}
```
#### 1.1.2. Using NPX With Cline

Add the following configuration to your `cline_mcp_settings.json`, where `[pwd]` is the response from the `pwd` command earlier.

```json
{
  "mcpServers": {
    "github.com/cblanquera/mcp": {
      "command": "npx",
      "args": [
        "-y",
        "chris-mcp",
        "serve",
        "--input",
        "[pwd]/data"
      ]
    }
  }
}
```
### 1.2. Option 2: Direct From the Repository

Run the following commands in the same folder as your other MCP servers.

```bash
$ git clone https://github.com/cblanquera/mcp.git chris-mcp
$ cd chris-mcp
$ npm i
$ npm run build
$ npm run fetch -- --output ./data
$ npm run verify -- --output ./data
$ pwd
```

Copy the response from `pwd` and edit your MCP server configuration by following one of the options below.
#### 1.2.1. From the Repository With Claude Desktop

Add the following configuration to your `claude_desktop_config.json`, where `[pwd]` is the response from the `pwd` command earlier.

```json
{
  "name": "github.com/cblanquera/mcp",
  "command": "node",
  "args": [
    "[pwd]/dist/scripts/serve.js",
    "--input",
    "[pwd]/data"
  ]
}
```
#### 1.2.2. From the Repository With Cline

Add the following configuration to your `cline_mcp_settings.json`, where `[pwd]` is the response from the `pwd` command earlier.

```json
{
  "mcpServers": {
    "github.com/cblanquera/mcp": {
      "command": "node",
      "args": [
        "[pwd]/dist/scripts/serve.js",
        "--input",
        "[pwd]/data"
      ]
    }
  }
}
```
### 1.3. Option 3: From Prompt

Copy and paste the following prompt, then paste this README in after it.

```
Set up the MCP server from https://github.com/cblanquera/mcp while adhering to these MCP server installation rules:
- Start by loading the MCP documentation.
- Use "github.com/cblanquera/mcp" as the server name in cline_mcp_settings.json.
- Create the directory for the new MCP server before starting installation.
- Read the user's existing cline_mcp_settings.json file before adding this new MCP, so that no existing servers are overwritten.
- Use commands aligned with the user's shell and operating system best practices.
- Once installed, demonstrate the server's capabilities by using one of its tools.
Here is the project's README to help you get started:
```
## 2. Usage

You can start the server with npm like the following.

```bash
$ npm start -- --input [pwd]/data
```

Or you can run the serve script directly like the following.

```bash
$ node [pwd]/dist/scripts/serve.js --input [pwd]/data
```

If you installed via `npx`, you can start the server like the following.

```bash
$ npx chris-mcp serve --input [pwd]/data
```
### 2.1. Fetching Updated Context

You can fetch and verify the context with npm like the following.

```bash
$ npm run fetch -- --output [pwd]/data
$ npm run verify -- --output [pwd]/data
```

Or you can run the scripts directly like the following.

```bash
$ node [pwd]/dist/scripts/fetch.js --output [pwd]/data
$ node [pwd]/dist/scripts/verify.js --output [pwd]/data
```

If you installed via `npx`, you can fetch and verify the context like the following.

```bash
$ npx chris-mcp fetch --output [pwd]/data
$ npx chris-mcp verify --output [pwd]/data
```
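When automating the refresh (for example from cron), it helps to chain the two steps so a failure in either one is surfaced. A minimal sketch of that shell pattern, using stand-in functions in place of the real commands and assuming `verify` exits non-zero on a bad data directory:

```bash
# Stand-ins so the pattern is runnable anywhere; replace with the real commands:
#   fetch  -> npx chris-mcp fetch --output [pwd]/data
#   verify -> npx chris-mcp verify --output [pwd]/data
fetch() { echo "fetched"; }
verify() { echo "verified"; }

# Only report success when both steps pass; surface failures on stderr.
if fetch && verify; then
  echo "context refresh OK"
else
  echo "context refresh failed" >&2
fi
```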
### 2.2. Upgrading the Search Model

The MCP uses `Xenova/all-MiniLM-L6-v2` locally to determine the best search query term for the MCP. Think of it as random prompt → correct query → ask MCP. You can upgrade this to use your OpenAI key by adding the `OPENAI_HOST`, `OPENAI_KEY`, and `EMBEDDING_MODEL` environment variables in your MCP settings like the following.

```json
{
  "name": "chris-context",
  "command": "npx",
  "args": [
    "-y",
    "chris-mcp",
    "serve",
    "--input",
    "[pwd]/data"
  ],
  "env": {
    "OPENAI_HOST": "https://api.openai.com/v1",
    "OPENAI_KEY": "sk-xxx",
    "EMBEDDING_MODEL": "text-embedding-3-small"
  }
}
```

**WARNING:** OpenRouter does not support the `/embeddings` API endpoint, which is called when providing an OpenAI-compatible host.
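The `env` block is passed by the MCP client to the spawned server process, where a Node server reads it from `process.env`. A quick simulation of that handoff (the values are the placeholders from the config above):

```bash
# Environment variables set on the command line become the child process's
# environment, just as the "env" block does when the MCP client spawns the server.
OPENAI_HOST="https://api.openai.com/v1" \
OPENAI_KEY="sk-xxx" \
EMBEDDING_MODEL="text-embedding-3-small" \
node -e 'console.log(process.env.EMBEDDING_MODEL)'
```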
## 3. Maximizing Your Knowledge Base

Create a rule (markdown file) called `Chris-MCP-Rule.md` in your knowledge folder (e.g. `.clinerules`) with the following content.

```markdown
# Rule: Using the Chris MCP

- If the user mentions "chris" and asks about code formatting, coding styles, coding standards, documentation styles, or testing styles, use the MCP tool `chris-context.search_context`.
- If the user asks for a compact summary of rules for code formatting, writing documentation, or writing tests, use the MCP tool `chris-context.build_brief`.
- Always prefer these MCP tools over answering from memory.
```