# GEO Analyzer
Content analysis for AI search visibility. Measures what actually matters for getting cited by ChatGPT, Claude, Perplexity, and Google AI Overviews.
<p align="center"> <a href="https://glama.ai/mcp/servers/@houtini-ai/geo-analyzer"> <img width="380" height="200" src="https://glama.ai/mcp/servers/@houtini-ai/geo-analyzer/badge" alt="GEO Analyzer MCP server" /> </a> </p>
## What It Does
GEO Analyzer examines content for the signals AI systems use when selecting sources to cite:
- Claim Density - Extractable facts per 100 words
- Information Density - Word count vs predicted AI coverage
- Answer Frontloading - How quickly key information appears
- Semantic Triples - Structured (subject, predicate, object) relationships
- Entity Recognition - Named entities AI can reference
- Sentence Structure - Optimal length for AI parsing
The analysis calls Claude Sonnet 4.5 through your own Anthropic API key for semantic extraction. No third-party scraping or proxy services are involved; your content goes only to the Anthropic API.
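To make the "semantic triples" signal concrete: a triple is a (subject, predicate, object) fact an AI system can lift out of a sentence and cite. A minimal sketch of the shape, with invented example values (this is illustrative, not the analyzer's internal representation):

```typescript
// A semantic triple: a structured fact extractable from prose.
type Triple = { subject: string; predicate: string; object: string };

// From a hypothetical sentence "The CSL DD wheelbase delivers 8 Nm of torque":
const triple: Triple = {
  subject: "CSL DD wheelbase",
  predicate: "delivers",
  object: "8 Nm of torque",
};

// Reassemble the triple into a citable statement.
const statement = `${triple.subject} ${triple.predicate} ${triple.object}`;
console.log(statement);
```

Content dense with triples like this gives a generative engine more discrete, attributable facts to quote.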
## Installation

### Claude Desktop

Add to your `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "geo-analyzer": {
      "command": "npx",
      "args": ["-y", "@houtini/geo-analyzer@latest"],
      "env": {
        "ANTHROPIC_API_KEY": "sk-ant-..."
      }
    }
  }
}
```
Config locations:
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Linux: `~/.config/Claude/claude_desktop_config.json`
Restart Claude Desktop after saving.
## Requirements

- Node.js 20+
- Anthropic API key (console.anthropic.com)
## Usage Examples

### Analyse a Published URL

```
Analyse https://example.com/article for "topic keywords"
```

The topic context helps score relevance but isn't required:

```
Analyse https://example.com/article
```

### Analyse Text Directly

Paste content for analysis (minimum 500 characters):

```
Analyse this content for "sim racing wheels":

[Your content here]
```

### Summary Mode

Get condensed output without detailed recommendations:

```
Analyse https://example.com/article with output_format=summary
```
## Output

### Scores (0-10)
| Score | Measures |
|---|---|
| Overall | Weighted average of all factors |
| Extractability | How easily AI can extract facts |
| Readability | Structure quality for AI parsing |
| Citability | How quotable and attributable |
### Key Metrics
**Information Density**
- Word count with coverage prediction
- Optimal range: 800-1,500 words
- Pages under 1K words: ~61% AI coverage
- Pages over 3K words: ~13% AI coverage

**Answer Frontloading**
- Claims and entities in the first 100/300 words
- Position of the first claim
- Score indicating answer immediacy

**Claim Density**
- Target: 4+ claims per 100 words
- Counts extractable facts, statistics, and measurements

**Sentence Length**
- Target: 15-20 words average
- Matches Google's ~15.5-word extraction chunks
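The analyzer itself uses Claude for semantic claim extraction, so the real counting is model-driven. Purely as a sketch of how the surface targets above combine, here is a naive heuristic version (the digit-based "claim" test and function name are invented for illustration, not the tool's method):

```typescript
// Naive surface-metric sketch. The real analyzer extracts claims
// semantically via Claude; this only illustrates the targets.
function surfaceMetrics(text: string) {
  const words = text.trim().split(/\s+/).filter(Boolean);
  const sentences = text
    .split(/[.!?]+/)
    .map((s) => s.trim())
    .filter(Boolean);
  // Hypothetical heuristic: treat any sentence containing a digit as a claim.
  const claimCount = sentences.filter((s) => /\d/.test(s)).length;
  return {
    wordCount: words.length,
    avgSentenceLength: words.length / sentences.length, // target: 15-20
    claimsPer100Words: (claimCount / words.length) * 100, // target: 4+
  };
}

const m = surfaceMetrics("The wheel costs 300 dollars. It weighs 2 kg.");
console.log(m);
```

A page scoring well here is easy for an AI system to chunk and quote; a page of long, number-free sentences scores poorly on both targets.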
### Recommendations
Prioritised suggestions with:
- Specific locations in content
- Before/after examples
- Rationale based on research
## Tools

### analyze_url

Fetches and analyses published web pages.

| Parameter | Required | Description |
|---|---|---|
| `url` | Yes | URL to analyse |
| `query` | No | Topic context for relevance scoring |
| `output_format` | No | `detailed` (default) or `summary` |
### analyze_text

Analyses pasted content directly.

| Parameter | Required | Description |
|---|---|---|
| `content` | Yes | Text to analyse (minimum 500 characters) |
| `query` | No | Topic context for relevance scoring |
| `output_format` | No | `detailed` (default) or `summary` |
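For clients that speak MCP directly, a tool invocation uses the standard `tools/call` request framing; an `analyze_url` call would look roughly like this (argument values are illustrative):

```json
{
  "method": "tools/call",
  "params": {
    "name": "analyze_url",
    "arguments": {
      "url": "https://example.com/article",
      "query": "sim racing wheels",
      "output_format": "summary"
    }
  }
}
```

In Claude Desktop you never write this by hand; the natural-language prompts in Usage Examples produce the same call.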
## Troubleshooting

**"ANTHROPIC_API_KEY is required"**
Add your API key to the `env` section of your config.

**"Cannot find module" after a config change**
Restart Claude Desktop completely.

**"Content too short"**
A minimum of 500 characters is required for meaningful analysis.

**Paywalled content returns errors**
The analyser can only access publicly available pages.
## Performance
- URL analysis: ~8-10 seconds
- Text analysis: ~5-7 seconds
- Cost: ~$0.14 per analysis (Sonnet 4.5)
## Migration from v1.x

v2.0 removed the external service dependencies. Update your config:

Old (v1.x):

```json
{
  "env": {
    "GEO_WORKER_URL": "https://...",
    "JINA_API_KEY": "jina_..."
  }
}
```

New (v2.x):

```json
{
  "env": {
    "ANTHROPIC_API_KEY": "sk-ant-..."
  }
}
```
## Development

```bash
git clone https://github.com/houtini-ai/geo-analyzer.git
cd geo-analyzer
npm install
npm run build
```
## Research Foundation

The analysis methodology draws on peer-reviewed research and empirical studies.

### GEO Paper (2024)

Aggarwal et al., "GEO: Generative Engine Optimization", ACM SIGKDD 2024.

Key findings applied:
- Claim density target of 4+ per 100 words
- Optimal sentence length of 15-20 words
- 40% improvement in AI citation rates with an extractability focus

### Dejan AI Grounding Research (2025)

Empirical analysis of 7,060 queries and 2,275 pages.

Key findings applied:
- ~2,000-word total grounding budget per query
- The rank #1 source gets 531 words (28% of the budget)
- The rank #5 source gets 266 words (13% of the budget)
- Average extraction chunk: 15.5 words
- Pages under 1K words: 61% coverage
- Pages of 3K+ words: 13% coverage

Sources:
- dejan.ai/blog/how-big-are-googles-grounding-chunks
- dejan.ai/blog/googles-ranking-signals
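The coverage figures from the grounding study fold into a simple lookup. The sketch below only encodes the two reported buckets; the function name and the `null` for intermediate lengths are this README's illustration, not part of the study or the analyzer:

```typescript
// Approximate AI-coverage buckets reported in the Dejan AI study.
// Returns the fraction of a page's content used for grounding,
// or null where the study only reports a decline between the buckets.
function predictedCoverage(wordCount: number): number | null {
  if (wordCount < 1000) return 0.61; // pages under 1K words: ~61% coverage
  if (wordCount >= 3000) return 0.13; // pages of 3K+ words: ~13% coverage
  return null; // 1K-3K words: intermediate, declining coverage
}
```

This is why the tool's optimal-range target is 800-1,500 words: a longer page competes with itself for a roughly fixed ~2,000-word grounding budget.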
MIT License - Houtini.ai