# iztro OpenAI MCP for Railway

Remote MCP server for Zi Wei Dou Shu / Tu Vi Dau So astrology tools, exposing read-only tools such as `get_astrolabe` and `get_horoscope_decades`, designed for Railway deployment and OpenAI ChatGPT developer mode.
## What I found

- As of 2026-04-24, `@feida/iztro-mcp-server` is not returned by the npm registry.
- The active public package is `@xzkcz/iztro-mcp-server@2.2.1`.
- npm publish timeline for `@xzkcz/iztro-mcp-server`:
  - `1.0.3` on 2025-07-26
  - `1.0.4` on 2025-10-03
  - `2.2.1` on 2025-12-03
- The package metadata still points to `https://github.com/xzkcz/iztro-mcp-server.git`, but that GitHub repository returned `404` during verification on 2026-04-24.
- The published package itself is small and lightweight, but its entrypoint starts with `transportType: "stdio"`, so it is not directly usable as a public remote MCP endpoint for OpenAI without a wrapper.
## Why this wrapper exists

- OpenAI remote MCP works over Streamable HTTP or HTTP/SSE.
- The upstream npm package is `stdio`-only.
- This project recreates the useful read-only astrology tools and exposes them at:
  - `GET /`
  - `GET /health`
  - `POST /mcp` (and related Streamable HTTP traffic)
  - `GET /sse`
## Included tools

- `get_astrolabe`
- `get_horoscope_decades`
- `get_horoscope_ages`
- `get_horoscope_years`
- `get_horoscope_months`
- `get_mutaged_places`

`gen_astrolabe` was intentionally left out because it writes files to disk and is not useful for a public read-only MCP deployment.
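Over MCP, each of the tools above is invoked with a JSON-RPC `tools/call` request POSTed to the server. A sketch of building that envelope follows; note the argument names (`solarDate`, `timeIndex`, `gender`, `language`) are illustrative assumptions, not this server's documented schema — discover the real schema with a `tools/list` call.

```javascript
// Builds the JSON-RPC 2.0 envelope an MCP client sends to call a tool.
function buildToolCall(id, name, args) {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

// Hypothetical argument names -- check tools/list for the real ones.
const req = buildToolCall(1, "get_astrolabe", {
  solarDate: "2000-08-16",
  timeIndex: 1,
  gender: "male",
  language: "zh-CN",
});
```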
## Surface Pro 4 fit check

- Resource-wise, this workload is light enough for a Surface Pro 4 with a Core m3, 4 GB RAM, and 128 GB of storage.
- Published package sizes checked from npm:
  - `@xzkcz/iztro-mcp-server`: 67,010 bytes unpacked
  - `iztro`: 2,138,358 bytes unpacked
  - `fastmcp`: 1,332,900 bytes unpacked
  - `lunar-typescript`: 1,360,545 bytes unpacked
- Practical issue: for ChatGPT to call the server, it needs a public HTTPS endpoint. Because this machine did not already have Node installed, and local hosting would still need a tunnel or other public ingress, Railway is the cleaner deployment target.
## Deploy to Railway

- Put this folder in a Git repository.
- Push it to GitHub.
- In Railway, create a new project from that repo.
- Railway should auto-detect Node and run `npm start`.

After the deploy:

- MCP endpoint: `https://YOUR-APP.up.railway.app/mcp`
- Health check: `https://YOUR-APP.up.railway.app/health`
- Landing page: `https://YOUR-APP.up.railway.app/`
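One way to smoke-test a fresh deploy is to hit `/health` and check the response. A hypothetical check using Node's global `fetch` (Node 18+); the `{ "status": "ok" }` response shape is an assumption, so adjust it to whatever the server actually returns:

```javascript
// Hypothetical post-deploy smoke test. parseHealth is split out so the
// response handling can be verified without a live deployment.
function parseHealth(status, bodyText) {
  if (status !== 200) return false;
  try {
    // Assumed response shape: { "status": "ok" }
    return JSON.parse(bodyText).status === "ok";
  } catch {
    return false;
  }
}

async function checkDeploy(baseUrl) {
  const res = await fetch(`${baseUrl}/health`);
  return parseHealth(res.status, await res.text());
}

// Usage: checkDeploy("https://YOUR-APP.up.railway.app").then(console.log);
```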
## Connect from ChatGPT

As verified from the OpenAI docs on 2026-04-24:

- ChatGPT Developer mode supports remote MCP over `SSE` and `Streamable HTTP`.
- Supported auth modes there are `OAuth`, `No Authentication`, and `Mixed Authentication`.

Recommended setup:

- Open ChatGPT on the web.
- Go to `Settings -> Apps -> Advanced settings -> Developer mode`.
- Turn Developer mode on.
- Click `Create app`.
- Use:
  - Server URL: `https://YOUR-APP.up.railway.app/mcp`
  - Authentication: `No Authentication`
## OpenAI Responses API example

```json
{
  "model": "gpt-5.4",
  "input": "Lap la so cho ngay 2000-08-16 luc 2 gio sang, nam, locale zh-CN",
  "tools": [
    {
      "type": "mcp",
      "server_label": "iztro",
      "server_url": "https://YOUR-APP.up.railway.app/mcp",
      "require_approval": "never"
    }
  ]
}
```

The example `input` is Vietnamese for "Cast a chart for 2000-08-16 at 2 a.m., male, locale zh-CN". Because every exposed tool here is read-only, `"require_approval": "never"` is a reasonable default.
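The request body above can be sent to the Responses API with a short Node script. A sketch, assuming an `OPENAI_API_KEY` environment variable; the model name is copied from the example above, and `buildBody` is separated out so the payload shape can be checked without touching the network:

```javascript
// Builds the Responses API request body from the example above.
function buildBody(serverUrl, input) {
  return {
    model: "gpt-5.4", // model name taken from the example above
    input,
    tools: [
      {
        type: "mcp",
        server_label: "iztro",
        server_url: serverUrl,
        require_approval: "never", // safe here: all exposed tools are read-only
      },
    ],
  };
}

async function run() {
  const res = await fetch("https://api.openai.com/v1/responses", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(
      buildBody(
        "https://YOUR-APP.up.railway.app/mcp",
        "Lap la so cho ngay 2000-08-16 luc 2 gio sang, nam, locale zh-CN"
      )
    ),
  });
  console.log(await res.json());
}
// Usage: run();
```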
## Run locally later if you want

```
npm install
npm start
```

Local endpoints:

- `http://localhost:3000/mcp`
- `http://localhost:3000/sse`
- `http://localhost:3000/health`

If you want ChatGPT on the public Internet to call a local machine, you still need a public HTTPS tunnel such as Cloudflare Tunnel or another ingress layer.
## Windows one-click local hosting

This repo now includes helper scripts for Windows:

```powershell
powershell -ExecutionPolicy Bypass -File .\start-local-public.ps1
```

That command:

- starts the MCP server on local port `3000`
- tries a public HTTPS tunnel with Cloudflare Quick Tunnel first
- falls back to `localhost.run` if Cloudflare is unavailable
- prints the public MCP URL you can paste into ChatGPT
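The URL-detection step in a script like this usually comes down to scanning the tunnel client's log output for the first public HTTPS URL it prints. A hypothetical sketch of that extraction (the real `start-local-public.ps1` may do it differently, and the `lhr.life` domain for `localhost.run` is an assumption):

```javascript
// Hypothetical helper: pull the public tunnel URL out of tunnel-client
// log output. Cloudflare Quick Tunnel prints a *.trycloudflare.com URL;
// localhost.run is assumed here to print a *.lhr.life URL.
function extractPublicUrl(logText) {
  const match = logText.match(
    /https:\/\/[a-z0-9-]+\.(?:trycloudflare\.com|lhr\.life)/i
  );
  // Append /mcp because ChatGPT should be pointed at the /mcp path.
  return match ? `${match[0]}/mcp` : null;
}
```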
When using the local public tunnel, connect ChatGPT to the `/mcp` URL, not `/sse`. This project is intended to be used over Streamable HTTP by OpenAI MCP clients.
To stop everything:

```powershell
powershell -ExecutionPolicy Bypass -File .\stop-local-public.ps1
```

The currently active public MCP URL is also written to:

```
.\.runtime\public-url.txt
```

If the startup window does not show the URL clearly, you can recover it with:

```powershell
powershell -ExecutionPolicy Bypass -File .\get-public-url.ps1
```
## Create a clean Railway zip

To generate a clean deployment bundle without `node_modules` or runtime logs:

```powershell
powershell -ExecutionPolicy Bypass -File .\pack-railway.ps1
```

That creates:

```
..\iztro-openai-mcp-railway-ready.zip
```