mcp-news-analysis-agent

A comprehensive Model Context Protocol (MCP) implementation for intelligent news analysis, featuring advanced LLM-powered sentiment analysis, AI summarization, and natural language query processing using Mistral AI.

🧠 LLM-Powered Architecture

This project leverages Mistral AI for advanced natural language processing capabilities:

  • Sentiment Analysis: Uses Mistral's LLM for nuanced sentiment understanding with confidence scoring
  • Text Summarization: AI-powered content summarization with customizable length and style
  • Intent Detection: Smart query interpretation for natural language interaction
  • Structured Outputs: JSON-formatted responses with detailed reasoning and metadata

🚀 Features

  • News Fetching: Retrieve real-time news articles from RapidAPI
  • Sentiment Analysis: Advanced sentiment analysis using Mistral AI with confidence scoring and detailed reasoning
  • Text Summarization: AI-powered summarization using Mistral AI
  • Intelligent Agent: Natural language query processing with enhanced intent detection
  • MCP Architecture: Fully compliant with Model Context Protocol standards

📋 Project Structure

MCPDemo/
├── server/
│   └── mcp_server.py          # MCP server implementation
├── client/
│   └── mcp_client.py          # Enhanced MCP client with intent detection and quote parsing
├── tools/
│   ├── news_tool.py           # News fetching from RapidAPI
│   ├── sentiment_tool.py      # Sentiment analysis tools
│   ├── summary_tool.py        # Text summarization tools
│   └── __init__.py
├── config/
│   ├── settings.py            # Configuration management
│   ├── .env                   # Environment variables
│   └── __init__.py
├── requirements.txt           # Python dependencies
└── README.md                  # This file

āš™ļø Setup Instructions

1. Environment Setup

First, create a Python virtual environment:

# Navigate to project directory
cd "C:\Users\mayssen\Desktop\mcp project\MCPDemo"

# Create virtual environment
python -m venv .venv

# Activate virtual environment
.\.venv\Scripts\Activate  # Windows
# source .venv/bin/activate  # macOS/Linux

# Install dependencies
pip install -r requirements.txt

2. API Keys Configuration

Edit the config/.env file and add your API keys:

# News API key from RapidAPI (already provided)
RAPIDAPI_KEY=6d35e9aa82msh4c8550ffb3e08b4p15bf78jsna3f5a47eeb4d
RAPIDAPI_HOST=real-time-news-data.p.rapidapi.com

# Get your Mistral AI API key from https://console.mistral.ai/
MISTRAL_API_KEY=your_mistral_api_key_here

# Optional: Adjust other settings as needed
MAX_NEWS_ARTICLES=10
MAX_SUMMARY_LENGTH=500
LOG_LEVEL=INFO
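
These variables are read at startup. Below is a minimal, stdlib-only sketch of how `config/settings.py` might expose them; the actual module may differ, and in the real project `python-dotenv`'s `load_dotenv()` would populate the environment from `config/.env` first:

```python
import os

# Hypothetical settings loader; defaults mirror the values shown above,
# and any environment variable overrides its default.
def load_settings() -> dict:
    return {
        "rapidapi_key": os.getenv("RAPIDAPI_KEY", ""),
        "mistral_api_key": os.getenv("MISTRAL_API_KEY", ""),
        "max_news_articles": int(os.getenv("MAX_NEWS_ARTICLES", "10")),
        "max_summary_length": int(os.getenv("MAX_SUMMARY_LENGTH", "500")),
        "log_level": os.getenv("LOG_LEVEL", "INFO"),
    }
```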

3. Install Additional Dependencies

Some packages might need special installation:

# Ensure Mistral AI client is properly installed
pip install mistralai

# If you encounter any import errors, install packages individually:
pip install httpx langchain-mistralai fastmcp

🔧 Usage

Running the MCP Server

# Make sure you're in the project directory with activated virtual environment
python server/mcp_server.py

Running the Interactive Agent

In a separate terminal:

# Activate the same virtual environment
.\.venv\Scripts\Activate

# Run the client
python client/mcp_client.py

Example Queries

Once the agent is running, try these natural language queries:

- "Get latest news about technology"
- "Analyze sentiment of recent news about climate change"
- "Summarize news about the economy"
- "Show me top 5 news from UK"
- "How do people feel about the latest political news?"
- "Get French news about sports and analyze sentiment"
- "This new AI technology is amazing but also quite expensive" (direct text analysis)
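
The last query works because the agent treats quoted text as direct input rather than a news search. A minimal sketch of one way such quote detection could be implemented (the actual client logic may differ):

```python
import re
from typing import Optional

def extract_quoted_text(query: str) -> Optional[str]:
    """Return the first double-quoted span in a query, or None if there
    is no quoted text and the query should be routed to news search."""
    match = re.search(r'"([^"]+)"', query)
    return match.group(1) if match else None
```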

🛠 Available Tools

1. News Tool

  • Function: fetch_news
  • Purpose: Fetches news articles from RapidAPI
  • Parameters: topic, country, language, limit
  • Example: Retrieve tech news from US in English
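
A sketch of how a request to the RapidAPI host might be assembled from these parameters; the `/search` endpoint path and parameter names are illustrative assumptions, not the exact API schema:

```python
RAPIDAPI_HOST = "real-time-news-data.p.rapidapi.com"

def build_news_request(topic: str, country: str = "US",
                       language: str = "en", limit: int = 10) -> dict:
    """Assemble the URL, query parameters, and auth headers for a news
    fetch. Endpoint and parameter names here are hypothetical."""
    return {
        "url": f"https://{RAPIDAPI_HOST}/search",
        "params": {"query": topic, "country": country,
                   "lang": language, "limit": limit},
        "headers": {"x-rapidapi-key": "YOUR_RAPIDAPI_KEY",
                    "x-rapidapi-host": RAPIDAPI_HOST},
    }

# The pieces could then be passed to an HTTP client, e.g.:
# httpx.get(req["url"], params=req["params"], headers=req["headers"])
```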

2. Sentiment Analysis Tool

  • Function: analyze_sentiment
  • Purpose: Analyzes sentiment using advanced Mistral AI LLM with confidence scoring and detailed reasoning
  • Parameters: text, analysis_type (simple/detailed)
  • Features:
    • Structured JSON responses with confidence scores
    • Detailed reasoning and emotion detection
    • Support for complex, nuanced sentiment analysis
    • Direct text analysis through quotes
  • Example: Determine sentiment with confidence: "Mixed sentiment (0.80 confidence) - expresses both excitement and concern"
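
Since the LLM returns structured JSON, the tool can validate the payload before using it. A minimal sketch assuming field names `sentiment` and `confidence` (an assumption based on the features listed above, not the tool's exact schema):

```python
import json

EXPECTED_LABELS = {"positive", "negative", "neutral", "mixed"}

def parse_sentiment_response(raw: str) -> dict:
    """Parse a structured sentiment payload and sanity-check its fields."""
    data = json.loads(raw)
    if data.get("sentiment") not in EXPECTED_LABELS:
        raise ValueError(f"unexpected sentiment label: {data.get('sentiment')}")
    confidence = float(data.get("confidence", 0.0))
    if not 0.0 <= confidence <= 1.0:
        raise ValueError(f"confidence out of range: {confidence}")
    data["confidence"] = confidence
    return data
```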

3. Summary Tool (Requires Mistral AI)

  • Function: summarize_text
  • Purpose: Summarizes text using Mistral AI
  • Parameters: text, max_length, summary_type
  • Example: Create brief summaries of long articles

4. Combined Workflows

  • Function: analyze_news_sentiment
  • Purpose: Fetches news and analyzes sentiment in one step
  • Parameters: topic, country, language, limit
  • Example: Get tech news and determine public sentiment
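
The combined flow can be sketched with stub stand-ins for the real tools; the actual `analyze_news_sentiment` calls RapidAPI and Mistral AI instead of these placeholders:

```python
import asyncio

# Stubs that mimic the shape of the real tools' outputs.
async def fetch_news(topic: str, limit: int = 3) -> list:
    return [{"title": f"{topic} headline {i}"} for i in range(limit)]

async def analyze_sentiment(text: str) -> dict:
    return {"text": text, "sentiment": "neutral", "confidence": 0.5}

async def analyze_news_sentiment(topic: str, limit: int = 3) -> list:
    """Fetch articles, then analyze each headline concurrently."""
    articles = await fetch_news(topic, limit)
    return await asyncio.gather(
        *(analyze_sentiment(a["title"]) for a in articles))
```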

🔌 Integration with Claude Desktop

To use this server with Claude Desktop, add this to your claude_desktop_config.json:

{
  "mcpServers": {
    "news-analysis": {
      "command": "python",
      "args": ["C:/Users/mayssen/Desktop/mcp project/MCPDemo/server/mcp_server.py"],
      "env": {
        "PYTHONPATH": "C:/Users/mayssen/Desktop/mcp project/MCPDemo"
      }
    }
  }
}

🎯 Advanced Features

Intelligent Text Detection

The client automatically detects quoted text in user queries and analyzes it directly:

  • Input: "This product is amazing but expensive"
  • Result: Direct sentiment analysis of the quoted text

Structured LLM Responses

All LLM operations return structured JSON with:

  • Classification: Primary sentiment/summary category
  • Confidence: Numerical confidence score (0.0-1.0)
  • Reasoning: Detailed explanation of the analysis
  • Emotions: Additional emotional context (for detailed analysis)

šŸ› Troubleshooting

Common Issues

  1. Import Errors: Make sure all dependencies are installed and the virtual environment is activated
  2. Mistral API Key Errors: Verify your Mistral AI API key is correctly set in the .env file
  3. RapidAPI Errors: Check that the provided RapidAPI key is still valid
  4. MCP Connection Issues: Ensure both server and client are using the same transport method
  5. LLM Response Issues: Verify Mistral AI API connectivity and sufficient API credits

Checking Logs

The system uses Python logging. Check console output for detailed error messages. You can adjust log level in .env:

LOG_LEVEL=DEBUG  # For more detailed logs
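
The setting maps directly onto Python's `logging` levels. A minimal sketch of that mapping (a hypothetical helper, not code from the project):

```python
import logging
import os

def configure_logging() -> int:
    """Translate the LOG_LEVEL env setting into a logging level and
    apply it; unknown names fall back to INFO."""
    level = getattr(logging, os.getenv("LOG_LEVEL", "INFO").upper(),
                    logging.INFO)
    logging.basicConfig(level=level)
    return level
```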

Testing Individual Components

Test each tool separately:

# Test LLM-powered sentiment analysis
# (run from the project root with the virtual environment active and
#  MISTRAL_API_KEY set, so the tools package and the API are reachable)
from tools.sentiment_tool import SentimentTool
import asyncio

async def test_sentiment():
    tool = SentimentTool()
    result = await tool.analyze_sentiment(
        "This new AI technology is amazing but also quite expensive",
        "detailed"
    )
    print(result)

asyncio.run(test_sentiment())

📚 Dependencies

Core MCP Dependencies

  • mcp>=1.2.0 - Model Context Protocol implementation
  • httpx>=0.25.0 - HTTP client for API requests
  • python-dotenv>=1.0.0 - Environment variable management

AI/ML Dependencies

  • mistralai>=1.0.0 - Mistral AI client for both summarization and sentiment analysis
  • langchain-mistralai>=0.1.0 - LangChain integration for enhanced LLM capabilities
  • fastmcp>=2.11.0 - FastMCP framework for efficient MCP implementation

Utility Dependencies

  • requests>=2.31.0 - HTTP requests
  • pydantic>=2.0.0 - Data validation
  • rich>=13.0.0 - Pretty terminal output

🤝 Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Test thoroughly
  5. Submit a pull request

📄 License

This project is open source. Feel free to modify and distribute according to your needs.

🆘 Support

If you encounter issues:

  1. Check the troubleshooting section
  2. Review logs for error messages
  3. Verify API keys and configuration
  4. Test individual components

For additional help, review the Model Context Protocol documentation.

