# lightningprox-mcp

<a href="https://glama.ai/mcp/servers/unixlamadev-spec/lightningprox-mcp"> <img width="380" height="200" src="https://glama.ai/mcp/servers/unixlamadev-spec/lightningprox-mcp/badge" /> </a>

MCP server for LightningProx — pay-per-request access to Claude and GPT models via Bitcoin Lightning using prepaid spend tokens. No accounts, no API keys — just sats. Load a spend token, start querying.
## Install

```sh
npx lightningprox-mcp
```
## What LightningProx Is

LightningProx is an AI gateway that accepts Bitcoin Lightning payments instead of API keys. You load a prepaid spend token, pass it in the `X-Spend-Token` header, and each request is deducted from your balance in sats. No signup, no monthly plan, no credentials to manage.

Models available: Claude (Sonnet, Haiku, Opus) and GPT-4 — accessed through a single endpoint with a single spend token.

Vision / multimodal: pass `image_url` directly in your request. URL mode only — no base64 encoding required.
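The request shape can be sketched as a small builder function. This is an illustrative sketch, not client code from the project: the endpoint and header are taken from the curl examples in this README, while the helper name and the example token are made up for demonstration.

```python
# Sketch: assemble a LightningProx chat request. The endpoint and the
# X-Spend-Token header come from this README; the helper name and the
# example token value are illustrative, not part of the project.

def build_chat_request(spend_token: str, model: str, prompt: str):
    """Return (url, headers, payload) for a pay-per-request chat call."""
    url = "https://lightningprox.com/v1/chat"
    headers = {
        "Content-Type": "application/json",
        "X-Spend-Token": spend_token,  # prepaid balance, denominated in sats
    }
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, headers, payload

url, headers, payload = build_chat_request(
    "lnpx_example", "claude-sonnet-4-5", "What is the Lightning Network?"
)
```

Any HTTP client can then POST `payload` as JSON to `url` with those headers.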
## Setup

### Claude Desktop

```json
{
  "mcpServers": {
    "lightningprox": {
      "command": "npx",
      "args": ["lightningprox-mcp"]
    }
  }
}
```
Config location:

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
- Linux: `~/.config/claude/claude_desktop_config.json`
### Claude Code

```sh
claude mcp add lightningprox -- npx lightningprox-mcp
```
## Tools

| Tool | Description |
|---|---|
| `ask_ai` | Send a prompt to Claude or GPT, authenticated via spend token |
| `ask_ai_vision` | Send a prompt with an image URL for multimodal analysis |
| `check_balance` | Check remaining sats on a spend token |
| `list_models` | List available models with per-call pricing |
| `get_pricing` | Estimate cost in sats for a given model and token count |
| `get_invoice` | Generate a Lightning invoice to top up a spend token |
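To illustrate the kind of arithmetic `get_pricing` performs, here is a hedged sketch. The per-token rates below are placeholders invented for the example — real pricing comes from the `list_models` and `get_pricing` tools, not from this table.

```python
# Illustrative cost estimate for a prepaid sats balance. The rates in this
# dict are PLACEHOLDERS -- actual per-call pricing must be fetched via the
# list_models / get_pricing tools.

PLACEHOLDER_RATES_SATS_PER_1K_TOKENS = {
    "claude-haiku": 5,
    "claude-sonnet-4-5": 30,
}

def estimate_cost_sats(model: str, tokens: int) -> int:
    """Estimate cost in sats, rounding up to the next whole sat."""
    rate = PLACEHOLDER_RATES_SATS_PER_1K_TOKENS[model]
    return -(-tokens * rate // 1000)  # ceiling division

# 2,500 tokens at the placeholder sonnet rate of 30 sats / 1k tokens:
# estimate_cost_sats("claude-sonnet-4-5", 2500) -> 75
```

Rounding up matters because balances are tracked in whole sats, so a fractional cost still deducts at least one sat.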
## Spend Token Auth

Every request authenticates via the `X-Spend-Token` header:

```sh
curl -X POST https://lightningprox.com/v1/chat \
  -H "Content-Type: application/json" \
  -H "X-Spend-Token: lnpx_your_token_here" \
  -d '{
    "model": "claude-sonnet-4-5",
    "messages": [{"role": "user", "content": "What is the Lightning Network?"}]
  }'
```
For vision requests, include `image_url` in the message content — no base64 needed:

```sh
curl -X POST https://lightningprox.com/v1/chat \
  -H "Content-Type: application/json" \
  -H "X-Spend-Token: lnpx_your_token_here" \
  -d '{
    "model": "claude-sonnet-4-5",
    "messages": [{
      "role": "user",
      "content": [
        {"type": "image_url", "image_url": {"url": "https://example.com/chart.png"}},
        {"type": "text", "text": "Describe this chart"}
      ]
    }]
  }'
```
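Building that multimodal message programmatically is mostly list assembly. A minimal sketch, assuming the content structure shown in the curl example above (the helper name is illustrative):

```python
# Sketch: assemble the multimodal message body from the example above.
# URL mode only -- the image is referenced by URL, never base64-encoded.

def vision_message(image_url: str, question: str) -> dict:
    """One user message mixing an image URL with a text prompt."""
    return {
        "role": "user",
        "content": [
            {"type": "image_url", "image_url": {"url": image_url}},
            {"type": "text", "text": question},
        ],
    }

msg = vision_message("https://example.com/chart.png", "Describe this chart")
```

The resulting `msg` slots into the `messages` array of the same `/v1/chat` request shown above.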
## Getting a Spend Token

1. Call `get_invoice` (or `ask_ai` without a token) to receive a Lightning invoice
2. Pay the invoice from any Lightning wallet
3. Your spend token is returned — use it for all subsequent requests until the balance runs out
## Links

- Gateway: [lightningprox.com](https://lightningprox.com)
- Docs: [lightningprox.com/docs](https://lightningprox.com/docs)

Built by LPX Digital Group LLC