NodusAI MCP Server

AI-Powered Signals for Prediction Markets — accessible to any AI agent via MCP.

AI agents connect to this server to get Oracle signals for Polymarket and Kalshi prediction markets. Signals are generated by Gemini 2.5 Flash with real-time web grounding.

<a href="https://glama.ai/mcp/servers/NodusAI-Your-Prediction-Broker/nodusai-mcp-server"> <img width="380" height="200" src="https://glama.ai/mcp/servers/NodusAI-Your-Prediction-Broker/nodusai-mcp-server/badge" alt="nodusai-mcp-server MCP server" /> </a>


How it works

Agent → nodusai.app → connect wallet → pay $1 USDC → get session token
                                                              ↓
Agent → MCP Server (nodus_get_signal) → nodusai.app/api/prediction → signal
  1. Visit nodusai.app and connect your wallet
  2. Paste a Polymarket or Kalshi market URL
  3. (Optional) Add your desired outcome (YES / NO)
  4. Pay $1 USDC — confirmed on-chain
  5. Get a session token good for 3 queries
  6. Use the session token with nodus_get_signal in any MCP client
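The steps above reduce to: obtain a session token on nodusai.app, then pass it together with a market URL to `nodus_get_signal`. A minimal sketch of assembling those tool arguments client-side (`buildSignalArgs` is a hypothetical helper, not part of the server; the URL and outcome checks are our assumptions):

```javascript
// Hypothetical helper: validate a market URL and assemble the
// arguments object passed to the nodus_get_signal tool.
function buildSignalArgs(marketUrl, sessionToken, desiredOutcome) {
  const host = new URL(marketUrl).hostname;
  if (!/(^|\.)polymarket\.com$/.test(host) && !/(^|\.)kalshi\.com$/.test(host)) {
    throw new Error("Expected a Polymarket or Kalshi market URL");
  }
  const args = { marketUrl, sessionToken };
  if (desiredOutcome !== undefined) {
    if (!["YES", "NO"].includes(desiredOutcome)) {
      throw new Error("desiredOutcome must be YES or NO");
    }
    args.desiredOutcome = desiredOutcome;
  }
  return args;
}
```

The resulting object is exactly what the Custom JS agent example below passes as `arguments` to `callTool`.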

Payment model

  • Cost: $1 USDC = 3 Oracle signal queries
  • Networks: Base, Ethereum, Avalanche (any EVM chain)
  • Token: USDC
  • Non-custodial: payments go directly on-chain via nodusai.app
  • Session: one payment = one session token = 3 queries (24h validity)
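The "$1 = 3 queries, 24h validity" model can be mirrored client-side so an agent knows when to prompt for a new payment. A small bookkeeping sketch (illustration only; the server enforces the real limits, and the constants below simply restate the bullets above):

```javascript
// Client-side view of one session: 3 queries, valid for 24 hours.
const SESSION_QUERIES = 3;
const SESSION_TTL_MS = 24 * 60 * 60 * 1000;

function sessionStatus(issuedAtMs, queriesUsed, nowMs = Date.now()) {
  const expired = nowMs - issuedAtMs >= SESSION_TTL_MS;
  const remaining = Math.max(0, SESSION_QUERIES - queriesUsed);
  return { expired, remaining, usable: !expired && remaining > 0 };
}
```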

Available tools

Tool                  Description
nodus_pricing         View pricing and how to get a session token
nodus_get_signal      Get an Oracle signal using your session token
nodus_verify_signal   Audit grounding sources of a past signal
nodus_query_history   Your recent query history
nodus_admin_stats     Platform-wide stats (admin)
nodus_admin_queries   Full query registry dump (admin)
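When wiring these tools into an agent, it can help to keep the admin tools out of the model's reach. A convenience sketch (tool names are from the table above; the `nodus_admin_` prefix rule is our assumption, not a server guarantee):

```javascript
// All tool names exposed by the server, per the table above.
const NODUS_TOOLS = [
  "nodus_pricing",
  "nodus_get_signal",
  "nodus_verify_signal",
  "nodus_query_history",
  "nodus_admin_stats",
  "nodus_admin_queries",
];

// Assumption: admin tools are exactly those prefixed nodus_admin_.
function isAdminTool(name) {
  return name.startsWith("nodus_admin_");
}

const publicTools = NODUS_TOOLS.filter((t) => !isAdminTool(t));
```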

Signal format

Every Oracle response follows NodusAI's structured schema:

{
  "market_name": "Will the Fed cut rates in June 2026?",
  "predicted_outcome": "YES",
  "probability": 0.73,
  "confidence_score": "HIGH",
  "key_reasoning": "Recent FOMC minutes and inflation data suggest...",
  "grounding_sources": [
    { "title": "Reuters: Fed signals rate path", "url": "https://..." },
    { "title": "AP: CPI data June 2026", "url": "https://..." }
  ]
}
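An agent can fail fast on malformed responses by checking the schema structurally. A sketch of such a check (field names are taken from the example above; the exact allowed values of `confidence_score` are not documented here, so it is only checked as a string):

```javascript
// Structural check for the Oracle signal schema shown above.
function isValidSignal(s) {
  return (
    typeof s === "object" && s !== null &&
    typeof s.market_name === "string" &&
    ["YES", "NO"].includes(s.predicted_outcome) &&
    typeof s.probability === "number" &&
    s.probability >= 0 && s.probability <= 1 &&
    typeof s.confidence_score === "string" &&
    typeof s.key_reasoning === "string" &&
    Array.isArray(s.grounding_sources) &&
    s.grounding_sources.every(
      (g) => typeof g.title === "string" && typeof g.url === "string"
    )
  );
}
```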

Deploy in 5 minutes

Option 1 — Railway (recommended)

  1. Fork this repo on GitHub
  2. Go to railway.app → New Project → Deploy from GitHub repo
  3. Select your fork
  4. Add environment variable: NODUSAI_API_BASE = https://nodusai.app
  5. Railway auto-detects railway.json and deploys
  6. Copy your Railway URL

Option 2 — Render (free tier)

  1. Fork this repo
  2. Go to render.com → New Web Service → connect your fork
  3. Set Build command: npm install and Start command: node src/server-http.js
  4. Add env var: NODUSAI_API_BASE=https://nodusai.app

Option 3 — Fly.io

fly launch --name nodusai-mcp
fly secrets set NODUSAI_API_BASE=https://nodusai.app
fly deploy

Connect AI agents

Claude Desktop

File: ~/Library/Application Support/Claude/claude_desktop_config.json

{
  "mcpServers": {
    "nodusai": {
      "url": "https://nodusai-mcp-production.up.railway.app/sse"
    }
  }
}

Cursor

File: ~/.cursor/mcp.json

{
  "mcpServers": {
    "nodusai": {
      "url": "https://nodusai-mcp-production.up.railway.app/sse",
      "transport": "sse"
    }
  }
}

Windsurf

File: ~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "nodusai": {
      "serverUrl": "https://nodusai-mcp-production.up.railway.app/sse"
    }
  }
}

Claude Code (CLI)

claude mcp add --transport sse nodusai https://nodusai-mcp-production.up.railway.app/sse

Custom JS agent

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

const client = new Client({ name: "my-agent", version: "1.0.0" }, { capabilities: {} });
await client.connect(new SSEClientTransport(new URL("https://nodusai-mcp-production.up.railway.app/sse")));

// Step 1 — get a session token at https://nodusai.app ($1 USDC)

// Step 2 — query the Oracle
const result = await client.callTool({
  name: "nodus_get_signal",
  arguments: {
    marketUrl:      "https://polymarket.com/event/...",
    sessionToken:   "your-session-token-from-nodusai.app",
    desiredOutcome: "YES", // optional
  }
});
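`callTool` returns MCP content blocks rather than a raw signal object. Assuming the server returns the signal as a single JSON-encoded text block (the standard MCP shape, `{ content: [{ type: "text", text: ... }] }`), a small unwrapping helper:

```javascript
// Extract a JSON payload from a standard MCP tool result.
// Assumes the signal arrives as one text content block.
function extractSignal(result) {
  const block = (result.content ?? []).find((c) => c.type === "text");
  if (!block) throw new Error("No text content in tool result");
  return JSON.parse(block.text);
}

// Usage (after the callTool above):
// const signal = extractSignal(result);
// console.log(signal.predicted_outcome, signal.probability);
```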

Custom Python agent

import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def main():
    async with sse_client("https://nodusai-mcp-production.up.railway.app/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Get a session token at https://nodusai.app first ($1 USDC)
            result = await session.call_tool("nodus_get_signal", {
                "marketUrl":      "https://kalshi.com/markets/...",
                "sessionToken":   "your-session-token-from-nodusai.app",
                "desiredOutcome": "YES",  # optional
            })

asyncio.run(main())

Local development

git clone https://github.com/NodusAI-Your-Prediction-Broker/nodusai-mcp
cd nodusai-mcp
npm install

# Dev mode (mock oracle — no real API calls needed)
npm run dev:http

Test with:

curl http://localhost:3000/health
curl http://localhost:3000/info
