
Grok-MCP

An MCP server for xAI's Grok API, providing access to capabilities including image understanding, image generation, live web search, and reasoning models.

<a href="https://glama.ai/mcp/servers/@merterbak/Grok-MCP"> <img width="380" height="200" src="https://glama.ai/mcp/servers/@merterbak/Grok-MCP/badge" /> </a>

🚀 Features

  • Multiple Grok Models: Access to Grok-4, Grok-4-Fast, Grok-3-Mini, and more
  • Image Generation: Create images using Grok's image generation models
  • Vision Capabilities: Analyze images with Grok's vision models
  • Live Web Search: Real-time web search with citations from news, web, X, and RSS feeds
  • Reasoning Models: Advanced reasoning with extended thinking models (Grok-3-Mini, Grok-4)
  • Stateful Conversations: Use this newly released feature to maintain conversation context across multiple requests via a response ID
  • Conversation History: Built-in support for multi-turn conversations

📋 Prerequisites

  • Python 3.11 or higher
  • xAI API key (Get one here)
  • uv package manager

🛠️ Installation

  1. Clone the repository:
git clone https://github.com/merterbak/Grok-MCP.git
cd Grok-MCP
  2. Install dependencies using uv:
uv sync

🔧 Configuration

Claude Desktop Integration

Add this to your Claude Desktop configuration file:

{
  "mcpServers": {
    "grok": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/Grok-MCP",
        "run",
        "python",
        "main.py"
      ],
      "env": {
        "XAI_API_KEY": "your_api_key_here"
      }
    }
  }
}

Usage

For stdio:

uv run python main.py

📚 Available Tools

1. list_models

List all available Grok models with creation dates and ownership information.

2. chat

Standard chat completion with extensive customization options.

Parameters:

  • prompt (required): Your message
  • model: Model to use (default: "grok-4-fast")
  • system_prompt: Optional system instruction
  • use_conversation_history: Enable multi-turn conversations
  • temperature, max_tokens, top_p: Generation parameters
  • presence_penalty, frequency_penalty, stop: Advanced control
  • reasoning_effort: For reasoning models ("low" or "high")

3. chat_with_reasoning

Get detailed reasoning along with the response.

Parameters:

  • prompt (required): Your question or task
  • model: "grok-4", "grok-3-mini", or "grok-3-mini-fast"
  • reasoning_effort: "low" or "high" (not for grok-4)
  • system_prompt, temperature, max_tokens, top_p

Returns: Content, reasoning content, and usage statistics

4. chat_with_vision

Analyze images with natural language queries.

Parameters:

  • prompt (required): Your question about the image(s)
  • image_paths: List of local image file paths
  • image_urls: List of image URLs
  • detail: "auto", "low", or "high"
  • model: Vision-capable model (default: "grok-4-0709")

Supported formats: JPG, JPEG, PNG
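Local images are typically sent inline as base64 data URLs in an OpenAI-style multimodal message. A minimal sketch of that structure, assuming the standard content-part schema (the helper names here are illustrative, not this server's actual functions):

```python
import base64

def image_part(data: bytes, mime: str = "image/png", detail: str = "auto"):
    """Wrap raw image bytes as a data-URL image content part."""
    b64 = base64.b64encode(data).decode("ascii")
    return {"type": "image_url",
            "image_url": {"url": f"data:{mime};base64,{b64}",
                          "detail": detail}}

def vision_messages(prompt: str, parts):
    """One user message mixing the text prompt with image parts."""
    return [{"role": "user",
             "content": [{"type": "text", "text": prompt}, *parts]}]

msgs = vision_messages("What is in this image?",
                       [image_part(b"\x89PNG")])  # placeholder bytes
```

Remote images can skip the encoding step and pass their URL directly in the `image_url` field.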

5. generate_image

Create images from text descriptions.

Parameters:

  • prompt (required): Image description
  • n: Number of images to generate (default: 1)
  • response_format: "url" or "b64_json"
  • model: Image generation model (default: "grok-2-image-1212")

Returns: Generated images and revised prompt
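Image generation goes through xAI's OpenAI-compatible images endpoint. A sketch of the request body the tool's parameters would produce (the helper is hypothetical, shown only to make the field mapping concrete):

```python
def build_image_request(prompt, n=1, response_format="url",
                        model="grok-2-image-1212"):
    """Request body for an OpenAI-compatible image generation call."""
    if response_format not in ("url", "b64_json"):
        raise ValueError("response_format must be 'url' or 'b64_json'")
    return {"model": model, "prompt": prompt, "n": n,
            "response_format": response_format}
```

With `"b64_json"` the images come back inline as base64 strings instead of short-lived URLs, which suits saving them locally.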

6. live_search

Search the web in real-time with source citations.

Parameters:

  • prompt (required): Your search query
  • model: Model to use (default: "grok-4")
  • mode: "on" or "off"
  • return_citations: Include source citations (default: true)
  • from_date, to_date: Date range (YYYY-MM-DD)
  • max_search_results: Max results to fetch (default: 20)
  • country: Country code for localized search
  • rss_links: List of RSS feed URLs to search
  • sources: Custom source configuration

Returns: Content, citations, usage stats, and number of sources used
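Live search works by attaching a `search_parameters` object to an otherwise normal chat request. A sketch of how the tool's arguments could assemble that object, assuming xAI's documented field names (the helper itself is illustrative):

```python
def build_search_parameters(mode="on", return_citations=True,
                            from_date=None, to_date=None,
                            max_search_results=20, country=None,
                            rss_links=None):
    """Assemble a `search_parameters` object for an xAI live-search request."""
    params = {"mode": mode,
              "return_citations": return_citations,
              "max_search_results": max_search_results}
    if from_date:
        params["from_date"] = from_date  # YYYY-MM-DD
    if to_date:
        params["to_date"] = to_date
    sources = []
    if country:
        sources.append({"type": "web", "country": country})
    if rss_links:
        sources.append({"type": "rss", "links": rss_links})
    if sources:
        params["sources"] = sources
    return params
```

Omitting `sources` lets the API search its default mix of web, news, and X content.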

7. stateful_chat

Maintain conversation state across multiple requests on xAI servers.

Parameters:

  • prompt (required): Your message
  • response_id: Previous response ID to continue conversation
  • model: Model to use (default: "grok-4")
  • system_prompt: System instruction (only for new conversations)
  • include_reasoning: Include reasoning summary
  • temperature, max_tokens

Returns: Response with ID for continuing the conversation (stored for 30 days)
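The chaining pattern resembles OpenAI's Responses API, where each reply carries an ID that the next request references. A sketch under that assumption (field names like `previous_response_id` are assumptions from that schema, and the helper is hypothetical):

```python
def build_stateful_request(prompt, response_id=None, model="grok-4",
                           system_prompt=None):
    """Body for a stateful request; pass the previous response ID
    to continue a server-side conversation."""
    body = {"model": model, "input": prompt}
    if response_id:
        body["previous_response_id"] = response_id
    elif system_prompt:
        # System instructions only apply when starting a new conversation.
        body["instructions"] = system_prompt
    return body

first = build_stateful_request("Hi", system_prompt="Be brief")
follow_up = build_stateful_request("Tell me more", response_id="resp_123")
```

Because the history lives on xAI's servers, each follow-up sends only the new message plus the ID, not the full transcript.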

8. retrieve_stateful_response

Retrieve a previously stored conversation response.

Parameters:

  • response_id (required): The response ID to retrieve

9. delete_stateful_response

Delete a stored conversation from xAI servers.

Parameters:

  • response_id (required): The response ID to delete

Roadmap

  • Add Docker support
  • Fix the chat vision model tool

📄 License

This project is open source and available under the MIT License.
