Model Context Protocol (MCP) Server

This project facilitates invoking AI models from providers such as Anthropic, OpenAI, and Groq, enabling users to manage and configure large language model interactions.

Author: hideya
Category: Cloud Platforms
Language: Python

README

MCP Client Using LangChain / Python (License: MIT)

This simple Model Context Protocol (MCP) client demonstrates the use of MCP server tools by a LangChain ReAct agent.

It leverages the utility function convert_mcp_to_langchain_tools() from langchain_mcp_tools.
This function initializes the specified MCP servers in parallel and converts their available tools into a list of LangChain-compatible tools (List[BaseTool]).

LLMs from Anthropic, OpenAI and Groq are currently supported.

A TypeScript version of this MCP client is available here.
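
The snippet below is a minimal sketch (not code from this repository) of how such a client can wire MCP server tools into a LangChain ReAct agent. The server entry, the model name, and the assumption that convert_mcp_to_langchain_tools() accepts a dict of server definitions and returns the converted tools together with an async cleanup callback are all illustrative; check the langchain_mcp_tools documentation for the exact API.

    # Hypothetical sketch -- not this repository's code. The return signature of
    # convert_mcp_to_langchain_tools() (tools plus an async cleanup callback) is
    # an assumption; see the langchain_mcp_tools documentation for the real API.
    import asyncio

    from langchain.chat_models import init_chat_model
    from langchain_mcp_tools import convert_mcp_to_langchain_tools
    from langgraph.prebuilt import create_react_agent


    async def main() -> None:
        # Any MCP server definition in the Claude-for-Desktop style should work
        # here; this "fetch" entry is just an illustrative example.
        mcp_servers = {
            "fetch": {"command": "uvx", "args": ["mcp-server-fetch"]},
        }

        # Start the MCP servers in parallel and wrap their tools as LangChain BaseTools.
        tools, cleanup = await convert_mcp_to_langchain_tools(mcp_servers)
        try:
            # Requires ANTHROPIC_API_KEY in the environment (e.g. loaded from .env).
            llm = init_chat_model("claude-3-5-haiku-latest", model_provider="anthropic")
            agent = create_react_agent(llm, tools)
            result = await agent.ainvoke(
                {"messages": [("user", "Fetch https://example.com and summarize it")]}
            )
            print(result["messages"][-1].content)
        finally:
            await cleanup()  # assumed async shutdown hook for the MCP server sessions


    if __name__ == "__main__":
        asyncio.run(main())

Using init_chat_model() keeps the same code path for Anthropic, OpenAI, and Groq models; only the model_provider argument changes.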

Prerequisites

  • Python 3.11+
  • [optional] uv (uvx) installed to run Python package-based MCP servers
  • [optional] npm 7+ (npx) to run Node.js package-based MCP servers
  • API keys from Anthropic, OpenAI, and/or Groq as needed

Setup

  1. Install dependencies:

    make install
    
  2. Set up API keys:

    cp .env.template .env
    
    • Update .env as needed.
    • .gitignore is configured to ignore .env to prevent accidental commits of credentials.
  3. Configure the LLM and MCP server settings in llm_mcp_config.json5 as needed (a hypothetical example is sketched after this list).

    • The configuration file format for MCP servers follows the same structure as Claude for Desktop, with one difference: the key name mcpServers has been changed to mcp_servers to follow the snake_case convention commonly used in JSON configuration files.
    • The file format is JSON5, where comments and trailing commas are allowed.
    • The format is further extended to replace ${...} notations with the values of corresponding environment variables.
    • Keep all credentials and other private information in the .env file and reference them with the ${...} notation as needed.
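
For illustration, a hypothetical llm_mcp_config.json5 might look like the following. Only the mcp_servers structure (Claude-for-Desktop style, snake_case key) is described by this README; the other key names, the model name, and the server entries below are assumptions, so consult the template config shipped with the repository for the exact schema.

    // Hypothetical example -- only the "mcp_servers" structure is described in this
    // README; the other key names, model name, and server entries are assumptions.
    {
      "llm": {
        "model_provider": "anthropic",       // or "openai", "groq"
        "model": "claude-3-5-haiku-latest",
      },

      // Queries offered when you just press Enter at the prompt (key name assumed).
      "example_queries": [
        "Summarize the content of https://example.com",
      ],

      "mcp_servers": {
        // Same structure as Claude for Desktop, but under the snake_case key.
        "fetch": {
          "command": "uvx",
          "args": ["mcp-server-fetch"],
        },
        "brave-search": {
          "command": "npx",
          "args": ["-y", "@modelcontextprotocol/server-brave-search"],
          "env": {
            // Expanded from the .env file via the ${...} extension.
            "BRAVE_API_KEY": "${BRAVE_API_KEY}",
          },
        },
      },
    }

In this sketch, ${BRAVE_API_KEY} is replaced at load time by the value defined in .env, so the credential itself never appears in the config file.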

Usage

Run the app:

make start

It takes a while on the first run.

Run in verbose mode:

make start-v

See command-line options:

make start-h

At the prompt, you can simply press Enter to use example queries that perform MCP server tool invocations.

Example queries can be configured in llm_mcp_config.json5.

Recommended Servers

Tavily MCP Server (Featured, Python)

Provides AI-powered web search capabilities using Tavily's search API, enabling LLMs to perform sophisticated web searches, get direct answers to questions, and search recent news articles.

contentful-mcp (Featured, TypeScript)

Update, create, and delete content, content models, and assets in your Contentful Space.

YouTube Transcript MCP Server (Featured, Python)

This server retrieves transcripts for given YouTube video URLs, enabling integration with Goose CLI or Goose Desktop for transcript extraction and processing.

Supabase MCP Server (Featured, JavaScript)

A Model Context Protocol (MCP) server that provides programmatic access to the Supabase Management API. This server allows AI models and other clients to manage Supabase projects and organizations through a standardized interface.

DuckDuckGo MCP Server (Featured, Python)

A Model Context Protocol (MCP) server that provides web search capabilities through DuckDuckGo, with additional features for content fetching and parsing.

Brev (Official, Local, Python)

Run, build, train, and deploy ML models on the cloud.

Azure MCP Server (Official, Local, TypeScript)

Enables natural language interaction with Azure services through Claude Desktop, supporting resource management, subscription handling, and tenant selection with secure authentication.

SettleMint (Official, Local, TypeScript)

Leverage SettleMint's Model Context Protocol server to seamlessly interact with enterprise blockchain infrastructure. Build, deploy, and manage smart contracts through AI-powered assistants, streamlining your blockchain development workflow for maximum efficiency.

ScrapeGraph MCP Server (Official, Python)

A production-ready Model Context Protocol server that enables language models to leverage AI-powered web scraping capabilities, offering tools for transforming webpages to markdown, extracting structured data, and executing AI-powered web searches.

Nefino MCP Server (Official, Python)

Provides large language models with access to news and information about renewable energy projects in Germany, allowing filtering by location, topic (solar, wind, hydrogen), and date range.