# watsonx MCP Server
MCP server for IBM watsonx.ai integration with Claude Code. Enables Claude to delegate tasks to IBM's foundation models (Granite, Llama, Mistral, etc.).
## Features
- Text Generation - Generate text using watsonx.ai foundation models
- Chat - Have conversations with watsonx.ai chat models
- Embeddings - Generate text embeddings
- Model Listing - List all available foundation models
## Available Tools

| Tool | Description |
|---|---|
| `watsonx_generate` | Generate text using watsonx.ai models |
| `watsonx_chat` | Chat with watsonx.ai models |
| `watsonx_embeddings` | Generate text embeddings |
| `watsonx_list_models` | List available models |
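For reference, an MCP tool definition for `watsonx_generate` could look roughly like the sketch below; the actual schema and parameter names in this server's `index.js` may differ and are shown here only for illustration.

```js
// Illustrative MCP tool definition; the real schema in index.js may differ.
const watsonxGenerateTool = {
  name: "watsonx_generate",
  description: "Generate text with a watsonx.ai foundation model",
  inputSchema: {
    type: "object",
    properties: {
      prompt: { type: "string", description: "Prompt to send to the model" },
      model_id: {
        type: "string",
        description: "Foundation model ID, e.g. ibm/granite-3-3-8b-instruct",
      },
      max_new_tokens: { type: "number", description: "Maximum tokens to generate" },
    },
    required: ["prompt"],
  },
};
```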
## Setup

### 1. Install Dependencies

```bash
cd ~/watsonx-mcp-server
npm install
```
### 2. Configure Environment
Set these environment variables:
```bash
WATSONX_API_KEY=your-ibm-cloud-api-key
WATSONX_URL=https://us-south.ml.cloud.ibm.com
WATSONX_SPACE_ID=your-deployment-space-id   # Recommended: deployment space
WATSONX_PROJECT_ID=your-project-id          # Alternative: project ID
```
Note: Either WATSONX_SPACE_ID or WATSONX_PROJECT_ID is required for text generation, embeddings, and chat. Deployment spaces are recommended as they have Watson Machine Learning (WML) pre-configured.
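Behind the scenes, the API key is typically exchanged for an IBM Cloud IAM bearer token before any watsonx.ai call. A minimal Node.js sketch of that exchange (the helper name `getIamToken` is illustrative, not part of this server's code):

```js
// Exchange an IBM Cloud API key for an IAM access token.
// The endpoint and grant type are IBM Cloud's documented values;
// how index.js actually handles authentication may differ.
async function getIamToken(apiKey) {
  const res = await fetch("https://iam.cloud.ibm.com/identity/token", {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: new URLSearchParams({
      grant_type: "urn:ibm:params:oauth:grant-type:apikey",
      apikey: apiKey,
    }),
  });
  if (!res.ok) throw new Error(`IAM token request failed: ${res.status}`);
  const { access_token } = await res.json();
  return access_token;
}
```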
### 3. Add to Claude Code

Add the server entry to `~/.claude.json` (adjust the `args` path and credentials for your machine):
```json
{
  "mcpServers": {
    "watsonx": {
      "type": "stdio",
      "command": "node",
      "args": ["/Users/matthewkarsten/watsonx-mcp-server/index.js"],
      "env": {
        "WATSONX_API_KEY": "your-api-key",
        "WATSONX_URL": "https://us-south.ml.cloud.ibm.com",
        "WATSONX_SPACE_ID": "your-deployment-space-id"
      }
    }
  }
}
```
## Usage
Once configured, Claude can use watsonx.ai tools:
```
User: Use watsonx to generate a haiku about coding

Claude: [Uses watsonx_generate tool]

Result: Code flows like water
        Bugs arise, then disappear
        Programs come alive
```
## Available Models

Some notable models available:

- `ibm/granite-3-3-8b-instruct` - IBM Granite 3.3 8B (recommended)
- `ibm/granite-13b-chat-v2` - IBM Granite chat model
- `ibm/granite-3-8b-instruct` - Granite 3 instruct model
- `meta-llama/llama-3-70b-instruct` - Meta's Llama 3 70B
- `mistralai/mistral-large` - Mistral AI large model
- `ibm/slate-125m-english-rtrvr-v2` - Embedding model
Use `watsonx_list_models` to see all available models.
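Under the hood, `watsonx_generate` presumably maps to watsonx.ai's text generation REST endpoint. A sketch of what such a request could look like, assuming the IAM token from the setup section; the exact payload this server builds may differ:

```js
// Illustrative call to the watsonx.ai text generation endpoint;
// the parameters index.js actually sends may differ.
const res = await fetch(
  `${process.env.WATSONX_URL}/ml/v1/text/generation?version=2023-05-29`,
  {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${iamToken}`, // from the IAM exchange above
    },
    body: JSON.stringify({
      model_id: "ibm/granite-3-3-8b-instruct",
      input: "Write a haiku about coding.",
      parameters: { decoding_method: "greedy", max_new_tokens: 100 },
      space_id: process.env.WATSONX_SPACE_ID,
    }),
  }
);
const data = await res.json();
console.log(data.results?.[0]?.generated_text);
```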
## Architecture

```
Claude Code (Opus 4.5)
  │
  └──▶ watsonx MCP Server
         │
         └──▶ IBM watsonx.ai API
                │
                ├── Granite Models
                ├── Llama Models
                ├── Mistral Models
                └── Embedding Models
```
## Two-Agent System
This enables a two-agent architecture where:
- Claude (Opus 4.5) - Primary reasoning agent, handles complex tasks
- watsonx.ai - Secondary agent for specific workloads
Claude can delegate tasks to watsonx.ai when:
- IBM-specific model capabilities are needed
- Running batch inference on enterprise data
- Using specialized Granite models
- Generating embeddings for RAG pipelines
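For the embeddings case in particular, a request to watsonx.ai's embeddings endpoint could look roughly like this (the endpoint shape follows IBM's public API; the exact call `watsonx_embeddings` makes may differ):

```js
// Illustrative embeddings request; the version date and payload are examples,
// not necessarily what index.js sends.
const res = await fetch(
  `${process.env.WATSONX_URL}/ml/v1/text/embeddings?version=2024-05-01`,
  {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${iamToken}`,
    },
    body: JSON.stringify({
      model_id: "ibm/slate-125m-english-rtrvr-v2",
      inputs: ["IBM Cloud infrastructure overview"],
      space_id: process.env.WATSONX_SPACE_ID,
    }),
  }
);
const { results } = await res.json();
console.log(results[0].embedding.length); // 768-dimensional vector
```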
## IBM Cloud Resources
This MCP server uses:
- Service: watsonx.ai Studio (data-science-experience)
- Plan: Lite (free tier)
- Region: us-south
Create your own watsonx.ai project and deployment space in IBM Cloud.
## Integration with IBM Z MCP Server
This watsonx MCP server works alongside the IBM Z MCP server:
```
Claude Code (Opus 4.5)
  │
  ├──▶ watsonx MCP Server
  │      └── Text generation, embeddings, chat
  │
  └──▶ ibmz MCP Server
         └── Key Protect HSM, z/OS Connect
```

Demo scripts in the ibmz-mcp-server:

- `demo-full-stack.js` - Full 5-service pipeline
- `demo-rag.js` - RAG with watsonx embeddings + Granite
## Document Analyzer

The document analyzer (`document-analyzer.js`) provides tools for analyzing your external drive data using watsonx.ai:

### Commands
```bash
# View document catalog (9,168 documents)
node document-analyzer.js catalog

# Summarize a document
node document-analyzer.js summarize 1002519.txt

# Analyze document type, topics, entities
node document-analyzer.js analyze 1002519.txt

# Ask questions about a document
node document-analyzer.js question 1002519.txt 'What AWS credentials are needed?'

# Generate embeddings for documents
node document-analyzer.js embed

# Semantic search across documents
node document-analyzer.js search 'IBM Cloud infrastructure'
```
### Features
- Summarization: Generate concise summaries of any document
- Analysis: Extract document type, topics, entities, and sentiment
- Q&A: Ask natural language questions about document content
- Embeddings: Generate 768-dimensional vectors for semantic search
- Semantic Search: Find similar documents using vector similarity
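The semantic-search step comes down to cosine similarity over the 768-dimensional Slate embeddings. A minimal sketch of how ranking could work (function names are illustrative, not taken from `document-analyzer.js`):

```js
// Cosine similarity between two embedding vectors of equal length.
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank indexed documents against a query embedding, highest score first.
function rankBySimilarity(queryEmbedding, index) {
  return index
    .map((doc) => ({ ...doc, score: cosineSimilarity(queryEmbedding, doc.embedding) }))
    .sort((a, b) => b.score - a.score);
}
```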
### Demo

Run the full demo:

```bash
./demo-external-drive.sh
```
## Embedding Index & RAG

The `embedding-index.js` tool provides semantic search and RAG (Retrieval Augmented Generation):
```bash
# Build an embedding index (50 documents)
node embedding-index.js build 50

# Semantic search
node embedding-index.js search 'cloud infrastructure'

# RAG query - retrieves relevant docs and generates answer
node embedding-index.js rag 'How do I set up AWS for Satellite?'

# Show index statistics
node embedding-index.js stats
```
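Conceptually, the `rag` command chains the pieces above: embed the query, pull the most similar documents from the index, then ask a Granite model to answer using those documents as context. A rough sketch, reusing `rankBySimilarity` from the earlier example; the helpers `embedText` and `generateText` are illustrative, not the actual functions in `embedding-index.js`:

```js
// Illustrative RAG flow: retrieve by vector similarity, then generate.
async function ragQuery(question, index, topK = 3) {
  const queryEmbedding = await embedText(question); // e.g. ibm/slate-125m-english-rtrvr-v2
  const topDocs = rankBySimilarity(queryEmbedding, index).slice(0, topK);

  const context = topDocs.map((d) => `[${d.file}]\n${d.text}`).join("\n\n");
  const prompt =
    `Answer the question using only the context below.\n\n` +
    `Context:\n${context}\n\nQuestion: ${question}\nAnswer:`;

  return generateText(prompt); // e.g. ibm/granite-3-3-8b-instruct
}
```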
## Batch Processor

The `batch-processor.js` tool processes multiple documents at once:
```bash
# Classify documents into categories
node batch-processor.js classify 20

# Extract topics from documents
node batch-processor.js topics 15

# Generate one-line summaries
node batch-processor.js summarize 10

# Full analysis (classify + topics + summary)
node batch-processor.js full 10
```
Categories: technical, business, creative, personal, code, legal, marketing, educational, other
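Classification here is presumably prompt-based: the document text and the fixed category list go to a Granite model, which returns a single label. A rough sketch (the prompt wording and the `generateText` helper are illustrative, not taken from `batch-processor.js`):

```js
// Illustrative prompt-based classification into the fixed category list.
const CATEGORIES = [
  "technical", "business", "creative", "personal", "code",
  "legal", "marketing", "educational", "other",
];

async function classifyDocument(text) {
  const prompt =
    `Classify the following document into exactly one of these categories: ` +
    `${CATEGORIES.join(", ")}.\n\nDocument:\n${text.slice(0, 2000)}\n\nCategory:`;
  const label = (await generateText(prompt)).trim().toLowerCase();
  return CATEGORIES.includes(label) ? label : "other";
}
```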
## Files

- `index.js` - MCP server implementation
- `document-analyzer.js` - Document analysis CLI tool
- `embedding-index.js` - Embedding index and RAG tool
- `batch-processor.js` - Batch document processor
- `demo-external-drive.sh` - Demo script
- `package.json` - Dependencies
- `README.md` - This file
## Author
Matthew Karsten
## License
MIT