# mcp-astgl-knowledge

MCP server for searching and citing ASTGL (As The Geek Learns) articles about MCP servers, local AI, and AI automation. Provides semantic search, direct Q&A, and topic browsing across 49 indexed entries with pre-computed embeddings.
An MCP server that lets AI assistants search and cite content from As The Geek Learns — covering MCP servers, local AI, AI automation, and ASTGL project documentation.
When an AI assistant connects to this server, it gains access to 49 indexed entries (articles, tutorials, comparisons, guides, and project docs). Every response includes source URLs back to astgl.ai.
## Quick Start

### Claude Desktop

Add to your `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "astgl-knowledge": {
      "command": "npx",
      "args": ["-y", "mcp-astgl-knowledge"]
    }
  }
}
```
### Claude Code

Add to your project's `.mcp.json`:

```json
{
  "mcpServers": {
    "astgl-knowledge": {
      "command": "npx",
      "args": ["-y", "mcp-astgl-knowledge"]
    }
  }
}
```
### Cursor / Generic MCP Client

```json
{
  "mcpServers": {
    "astgl-knowledge": {
      "command": "npx",
      "args": ["-y", "mcp-astgl-knowledge"]
    }
  }
}
```
### With Registration (500 queries/day)

Register via the `register` tool to get an API key, then add it to your config:

```json
{
  "mcpServers": {
    "astgl-knowledge": {
      "command": "npx",
      "args": ["-y", "mcp-astgl-knowledge"],
      "env": {
        "ASTGL_API_KEY": "astgl_your_api_key_here"
      }
    }
  }
}
```
## Tools

### search_articles

Search the knowledge base by query. Returns ranked results with relevance scores and source URLs.

| Parameter | Type | Required | Description |
|---|---|---|---|
| `query` | string | Yes | Search query (e.g., "how to build an MCP server") |
| `limit` | number | No | Max results, 1-20 (default: 5) |
| `content_type` | string | No | Filter by type: `article`, `tutorial`, `faq`, `comparison`, `guide`, `newsletter`, `project` |
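Over a stdio transport, a `search_articles` invocation arrives as a standard MCP `tools/call` JSON-RPC request. The shape below follows the MCP specification; the specific query and limit values are illustrative:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_articles",
    "arguments": {
      "query": "how to build an MCP server",
      "limit": 3
    }
  }
}
```

MCP clients such as Claude Desktop construct this request for you; it is only relevant if you are wiring up your own client.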
### get_answer

Get a direct answer to a specific question. Prefers FAQ entries for concise responses.

| Parameter | Type | Required | Description |
|---|---|---|---|
| `question` | string | Yes | A specific question (e.g., "What is an MCP server?") |
| `content_type` | string | No | Filter by content type |
### get_tutorial

Get step-by-step instructions from tutorial and guide content.

| Parameter | Type | Required | Description |
|---|---|---|---|
| `query` | string | Yes | What you want to learn (e.g., "setup Ollama on Mac") |
### compare_topics

Side-by-side comparison of two topics.

| Parameter | Type | Required | Description |
|---|---|---|---|
| `topic_a` | string | Yes | First topic |
| `topic_b` | string | Yes | Second topic |
### get_latest

Get the most recently added content.

| Parameter | Type | Required | Description |
|---|---|---|---|
| `limit` | number | No | Max results, 1-20 (default: 5) |
### list_topics
Browse all topics in the knowledge base with content types and section headings.
### register

Register your email to unlock 500 queries/day (up from 50).

| Parameter | Type | Required | Description |
|---|---|---|---|
| `email` | string | Yes | Your email address |
## Content Types
| Type | Count | Description |
|---|---|---|
| article | 29 | Informational content about MCP, local AI, automation |
| project | 9 | ASTGL project documentation (KlockThingy, Revri, Cortex, etc.) |
| tutorial | 8 | Step-by-step how-to guides |
| comparison | 2 | Side-by-side topic analysis |
| guide | 1 | Comprehensive reference material |
| newsletter | — | Personal updates and announcements |
| faq | — | Primarily Q&A content |
## Rate Limits
| Tier | Limit | How to Get |
|---|---|---|
| Public | 50 queries/day | Default (anonymous) |
| Registered | 500 queries/day | Use the register tool with your email |
Limits reset at midnight UTC. Rate limit info is included in every response.
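If a client wants to know how long until its quota refreshes, the reset time is simply the next midnight UTC. A minimal sketch (`msUntilUtcReset` is a hypothetical helper, not part of the server's API):

```typescript
// Milliseconds until the next midnight UTC, when the daily query quota resets.
function msUntilUtcReset(now: Date = new Date()): number {
  // Date.UTC normalizes day overflow, so "day + 1" rolls into the next
  // month or year automatically.
  const nextMidnightUtc = Date.UTC(
    now.getUTCFullYear(),
    now.getUTCMonth(),
    now.getUTCDate() + 1,
  );
  return nextMidnightUtc - now.getTime();
}
```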
## How It Works
The knowledge base is pre-built from ASTGL articles using semantic embeddings (nomic-embed-text, 768 dimensions). Content is chunked by section and FAQ entry, embedded, and stored in a SQLite database with sqlite-vec for vector similarity search.
End users don't need Ollama — all embeddings are pre-computed and shipped in the npm package. The only runtime requirement is Node.js.
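In the real server, sqlite-vec performs the similarity search inside SQLite; the ranking math it implements is just cosine similarity over the stored vectors. A self-contained sketch of that lookup step, assuming chunks are held in memory rather than SQLite:

```typescript
// Cosine similarity between two equal-length vectors (768 dims in practice).
function cosine(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank pre-embedded chunks against a query embedding and keep the top k.
function topK(
  query: number[],
  chunks: { id: string; vec: number[] }[],
  k: number,
): { id: string; score: number }[] {
  return chunks
    .map((c) => ({ id: c.id, score: cosine(query, c.vec) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k);
}
```

sqlite-vec does the same comparison with an indexed scan, so the server avoids loading every vector into JavaScript on each query.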
## Performance
- Typical response time: 100-500ms (embedding lookup + vector search)
- Embedding results are cached in memory (LRU, 200 entries) — repeated queries are near-instant
- Ollama calls include 10s timeout + automatic retry
- Query logging is async/batched to avoid blocking responses
- Rate limit checks are cached for 5 seconds
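The embedding cache mentioned above is a small LRU. As an illustrative sketch (the server's actual implementation may differ), a JavaScript `Map` is enough, since it preserves insertion order and the first key is always the least recently used:

```typescript
// Minimal LRU cache: re-insertion on access keeps hot keys at the back
// of the Map's iteration order; eviction removes the front (oldest) key.
class LruCache<K, V> {
  constructor(
    private max: number,
    private map: Map<K, V> = new Map(),
  ) {}

  get(key: K): V | undefined {
    if (!this.map.has(key)) return undefined;
    const val = this.map.get(key)!;
    this.map.delete(key);      // move to most-recently-used position
    this.map.set(key, val);
    return val;
  }

  set(key: K, val: V): void {
    if (this.map.has(key)) {
      this.map.delete(key);
    } else if (this.map.size >= this.max) {
      // Evict the least recently used entry (first key in iteration order).
      this.map.delete(this.map.keys().next().value!);
    }
    this.map.set(key, val);
  }
}
```

With 200 entries of 768 floats each, the whole cache stays under a couple of megabytes, which is why repeat queries can skip the embedding lookup entirely.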
## For Maintainers

### Setup

```bash
git clone https://github.com/Jmeg8r/mcp-astgl-knowledge.git
cd mcp-astgl-knowledge
npm install
```
### Scripts

| Script | Description |
|---|---|
| `npm run build` | Compile TypeScript |
| `npm run dev` | Run MCP server in dev mode (tsx) |
| `npm start` | Run compiled MCP server |
| `npm run ingest` | Rebuild knowledge.db from local markdown (requires Ollama) |
| `npm run ingest-projects` | Index project docs from astgl-site projects.json |
| `npm run discover` | Poll RSS/sitemap for new content |
| `npm run structure` | Process discovered content (classify, embed, index) |
| `npm run pipeline` | Discover + structure in one step |
| `npm run daily-report` | Generate AEO analytics report |
| `npm run alerts` | Run content gap alert checks |
| `npm run freshness` | Check for stale content and ecosystem version changes |
| `npm run citation-test` | Manual AI citation testing |
| `npm run related` | Generate internal article links via vector similarity |
### Environment Variables

| Variable | Default | Description |
|---|---|---|
| `OLLAMA_URL` | `http://localhost:11434` | Ollama endpoint (dev/rebuild only) |
| `EMBED_MODEL` | `nomic-embed-text` | Embedding model |
| `DISCORD_WEBHOOK_URL` | — | Discord webhook for reports/alerts |
| `ASTGL_API_KEY` | — | Registered tier API key |
| `ASTGL_ARTICLES_DIR` | `~/Projects/astgl-site/src/content/answers` | Local markdown source |
| `ASTGL_PROJECTS_JSON` | `~/Projects/astgl-site/src/data/projects.json` | Projects data source |
### Automated Jobs
| Job | Schedule | Purpose |
|---|---|---|
| Content pipeline | Every 6h | Discover + structure new content |
| Daily report | 8 AM | Query analytics + health metrics → Discord |
| Content alerts | 9 AM | Gap detection, zero-citation, competitor scan → Discord |
| Freshness check | 10 AM | Stale content + ecosystem version tracking → Discord |
## License
MIT