iztro OpenAI MCP for Railway

Remote MCP server for Zi Wei Dou Shu / Tu Vi Dau So astrology tools, exposing read-only tools like get_astrolabe, get_horoscope_decades, etc., designed for Railway deployment and OpenAI ChatGPT developer mode.

What I found

  • As of 2026-04-24, @feida/iztro-mcp-server is not listed in the npm registry.
  • The active public package is @xzkcz/iztro-mcp-server@2.2.1.
  • npm publish timeline for @xzkcz/iztro-mcp-server:
    • 1.0.3 on 2025-07-26
    • 1.0.4 on 2025-10-03
    • 2.2.1 on 2025-12-03
  • The package metadata still points to https://github.com/xzkcz/iztro-mcp-server.git, but that GitHub repository returned 404 during verification on 2026-04-24.
  • The published package itself is small and lightweight, but its entrypoint starts the server with transportType: "stdio", so it is not directly usable as a public remote MCP endpoint for OpenAI without a wrapper.

Why this wrapper exists

  • OpenAI remote MCP works over Streamable HTTP or HTTP/SSE.
  • The upstream npm package is stdio only.
  • This project recreates the useful read-only astrology tools and exposes them at:
    • GET /
    • GET /health
    • POST /mcp and related Streamable HTTP traffic
    • GET /sse
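For reference, the first message an MCP client sends over Streamable HTTP is a JSON-RPC initialize request POSTed to /mcp. A minimal sketch (the protocolVersion shown is one of the published MCP spec revisions; your client may send a different one):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2025-03-26",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "0.1.0" }
  }
}
```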

Included tools

  • get_astrolabe
  • get_horoscope_decades
  • get_horoscope_ages
  • get_horoscope_years
  • get_horoscope_months
  • get_mutaged_places

gen_astrolabe was intentionally left out because it writes files to disk and is not useful for a public read-only MCP deployment.
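Once connected, a client invokes any of these tools with a JSON-RPC tools/call request. A sketch for get_astrolabe follows; the argument names (solarDate, timeIndex, gender, language) echo iztro's astrolabe API but are assumptions here — the authoritative schema is whatever this server returns from tools/list:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "get_astrolabe",
    "arguments": {
      "solarDate": "2000-08-16",
      "timeIndex": 1,
      "gender": "male",
      "language": "zh-CN"
    }
  }
}
```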

Surface Pro 4 fit check

  • Resource-wise, this workload is light enough for a Surface Pro 4 with Core m3, 4 GB RAM, and 128 GB storage.
  • Published package sizes checked from npm:
    • @xzkcz/iztro-mcp-server: 67,010 bytes unpacked
    • iztro: 2,138,358 bytes unpacked
    • fastmcp: 1,332,900 bytes unpacked
    • lunar-typescript: 1,360,545 bytes unpacked
  • Practical issue: to let ChatGPT call it, you need a public HTTPS endpoint. Because this machine did not already have Node installed, and local hosting would still need a tunnel or other public ingress, Railway is the cleaner deployment target.

Deploy to Railway

  1. Put this folder in a Git repository.
  2. Push it to GitHub.
  3. In Railway, create a new project from that repo.
  4. Railway should auto-detect Node and run npm start.
  5. After deploy, your MCP endpoint will be:
https://YOUR-APP.up.railway.app/mcp

Health check:

https://YOUR-APP.up.railway.app/health

Landing page:

https://YOUR-APP.up.railway.app/

Connect from ChatGPT

As verified from OpenAI docs on 2026-04-24:

  • ChatGPT Developer mode supports remote MCP over SSE and Streamable HTTP.
  • Supported auth modes there are OAuth, No Authentication, and Mixed Authentication.

Recommended setup:

  1. Open ChatGPT on web.
  2. Go to Settings -> Apps -> Advanced settings -> Developer mode.
  3. Turn Developer mode on.
  4. Click Create app.
  5. Use:
    • Server URL: https://YOUR-APP.up.railway.app/mcp
    • Authentication: No Authentication

OpenAI Responses API example

{
  "model": "gpt-5.4",
  "input": "Lap la so cho ngay 2000-08-16 luc 2 gio sang, nam, locale zh-CN",
  "tools": [
    {
      "type": "mcp",
      "server_label": "iztro",
      "server_url": "https://YOUR-APP.up.railway.app/mcp",
      "require_approval": "never"
    }
  ]
}

Because every exposed tool here is read-only, require_approval: "never" is a reasonable default.
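A request body like the one above can be sent to the Responses API with curl; OPENAI_API_KEY must be set in your environment, and YOUR-APP is the same placeholder used throughout this README:

```shell
# Call the OpenAI Responses API with the remote MCP server attached as a tool.
curl https://api.openai.com/v1/responses \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-5.4",
    "input": "Cast the natal chart for 2000-08-16 at 2 a.m., male, locale zh-CN",
    "tools": [{
      "type": "mcp",
      "server_label": "iztro",
      "server_url": "https://YOUR-APP.up.railway.app/mcp",
      "require_approval": "never"
    }]
  }'
```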

Run locally later if you want

npm install
npm start

Local endpoints:

  • http://localhost:3000/mcp
  • http://localhost:3000/sse
  • http://localhost:3000/health

If you want ChatGPT on the public Internet to call a local machine, you still need a public HTTPS tunnel such as Cloudflare Tunnel or another ingress layer.
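Concretely, a Cloudflare Quick Tunnel for the local server is a single command (cloudflared must be installed; the trycloudflare.com URL it prints, with /mcp appended, is what you would paste into ChatGPT):

```shell
# Expose local port 3000 through an ephemeral Cloudflare Quick Tunnel.
cloudflared tunnel --url http://localhost:3000
```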

Windows one-click local hosting

This repo now includes helper scripts for Windows:

powershell -ExecutionPolicy Bypass -File .\start-local-public.ps1

That command:

  • starts the MCP server on local port 3000
  • tries a public HTTPS tunnel with Cloudflare Quick Tunnel first
  • falls back to localhost.run if Cloudflare is unavailable
  • prints the public MCP URL you can paste into ChatGPT

When using the local public tunnel, connect ChatGPT to the /mcp URL, not /sse. This project is intended to be used over Streamable HTTP for OpenAI MCP clients.

To stop everything:

powershell -ExecutionPolicy Bypass -File .\stop-local-public.ps1

The currently active public MCP URL is also written to:

.\.runtime\public-url.txt

If the startup window does not show the URL clearly, you can recover it with:

powershell -ExecutionPolicy Bypass -File .\get-public-url.ps1

Create a clean Railway zip

To generate a clean deployment bundle without node_modules or runtime logs:

powershell -ExecutionPolicy Bypass -File .\pack-railway.ps1

That creates:

..\iztro-openai-mcp-railway-ready.zip
