MyCareersFuture MCP Server
Enables users to search for job opportunities in Singapore through the MyCareersFuture public API. Provides live job search dashboards with rich UI components for exploring software engineering and other job openings.
MyCareersFuture MCP Server Demo
This repository hosts a proof-of-concept Model Context Protocol (MCP) server that wraps the public MyCareersFuture (MCF) job search API. It illustrates how to render job search results in ChatGPT with the Apps SDK.

MCP + MyCareersFuture overview
The MCP server accepts structured tool requests, forwards them to the MCF API, and returns:
- Structured JSON describing the queried job listings (titles, companies, salary hints, metadata).
- An _meta.openai/outputTemplate pointer to a static widget bundle so compatible clients (such as ChatGPT with the Apps SDK) can render an interactive carousel.
Each call is validated with Pydantic models, logged for observability, and kept intentionally simple to serve as an approachable starting point for your own integrations. For a deeper breakdown of the server implementation, see MCP_SERVER.md.
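To make the payload concrete, a mycf-job-list result might look roughly like the sketch below. The job fields and the widget URI are illustrative assumptions rather than the server's confirmed schema; MCP_SERVER.md documents the real shape.

# Hedged sketch of the payload a mycf-job-list call might return.
# The job fields and the ui:// widget URI are assumptions, not the confirmed schema.
example_result = {
    "structuredContent": {
        "jobs": [
            {
                "title": "Software Engineer",
                "company": "Example Pte Ltd",
                "salary": "SGD 5,000 - 8,000 per month",
                "url": "https://www.mycareersfuture.gov.sg/job/example-id",
            }
        ]
    },
    "_meta": {
        # Pointer that lets Apps SDK clients load the prebuilt carousel bundle.
        "openai/outputTemplate": "ui://widget/mycareersfuture-carousel.html"
    },
}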
Repository structure
- mycareersfuture_server_python/ – FastAPI/uvicorn MCP server sourcing jobs from the MCF API.
- src/ – Widget source code (MyCareersFuture carousel and Todo example) used when building UI assets.
- assets/ – Generated HTML/JS/CSS bundles created during the build step.
- build-all.mts – Vite build orchestration script that packages per-widget assets.
Prerequisites
- Node.js 18+
- pnpm (recommended) or npm/yarn
- Python 3.10+
Install dependencies
Clone the repository and install JavaScript dependencies for the widget build:
pnpm install
If you prefer npm or yarn, install the root dependencies with your client of choice and adjust the commands below accordingly.
Build widget assets
The MCP server serves static bundles that power the MyCareersFuture widget. Build them before running the server:
pnpm run build
This executes build-all.mts, producing versioned .html, .js, and .css files in assets/. Each widget includes its required CSS so you can host or distribute the bundles independently.
To iterate locally, use the Vite dev server:
pnpm run dev
If you want to preview the generated bundles without the MCP server, run the static file server after building:
pnpm run serve
This exposes the compiled assets at http://localhost:4444 with CORS enabled for local tooling.
Run the MCP server
Create a virtual environment, install the Python dependencies, and start the FastAPI server:
python -m venv .venv
source .venv/bin/activate
pip install -r mycareersfuture_server_python/requirements.txt
uvicorn mycareersfuture_server_python.main:app --port 8000
The server listens for standard MCP requests over HTTP/SSE and exposes a single tool, mycf-job-list, which queries live jobs and returns structured results with widget metadata.
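Conceptually, the tool handler is a thin wrapper around the MCF search endpoint: it forwards the query, trims the response down to the fields the widget needs, and attaches the widget metadata. The sketch below captures that idea only; the endpoint path, request shape, and response fields are assumptions, so treat mycareersfuture_server_python/main.py as the source of truth.

# Hedged sketch of a job-search helper. The MCF endpoint, request shape,
# and response fields are assumptions, not the server's actual code.
import httpx

MCF_SEARCH_URL = "https://api.mycareersfuture.gov.sg/v2/search"  # assumed endpoint

def search_jobs(query: str, limit: int = 10, page: int = 0) -> list[dict]:
    """Query the public MCF API and return a simplified list of job dicts."""
    response = httpx.post(
        MCF_SEARCH_URL,
        params={"limit": limit, "page": page},
        json={"search": query},
        timeout=10.0,
    )
    response.raise_for_status()
    results = response.json().get("results", [])
    return [
        {
            "title": job.get("title"),
            "company": (job.get("postedCompany") or {}).get("name"),
            "url": job.get("metadata", {}).get("jobDetailsUrl"),
        }
        for job in results
    ]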
Apps SDK integration (optional)
This project also demonstrates how the MyCareersFuture MCP responses can light up a front-end experience in ChatGPT. When the _meta.openai/outputTemplate field references the bundled widget, the Apps SDK renders:
- A horizontal carousel of job cards with titles, employers, salary hints, and metadata.
- Inline navigation controls for exploring multiple roles.
Using the Apps SDK is optional; MCP-compatible clients can consume the structured JSON without rendering the widget.
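For instance, a generic client built on the official MCP Python SDK (pip install mcp) could call the tool and read only the structured payload. The /mcp path, the streamable HTTP transport, and the "query" argument name are assumptions here; check the server's tool schema for the real parameters.

# Hedged sketch of a plain MCP client call; the endpoint path, transport,
# and the "query" argument name are assumptions.
import asyncio
from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def main() -> None:
    async with streamablehttp_client("http://localhost:8000/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool("mycf-job-list", {"query": "software engineer"})
            print(result.structuredContent)  # widget metadata can simply be ignored

asyncio.run(main())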
Test in ChatGPT
Enable developer mode in ChatGPT, add the MCP server as a connector, and (if necessary) expose the local instance using a tunneling tool such as ngrok:
ngrok http 8000
Add the connector URL (for example, https://<custom_endpoint>.ngrok-free.app/mcp), enable the connector in a conversation, and ask questions like “Find software engineering openings in Singapore.” ChatGPT will call mycf-job-list, receive structured job data, and—when using the Apps SDK—render the bundled widget.
Next steps
- Customize the MCP handler in mycareersfuture_server_python/main.py to call additional APIs or enforce business rules.
- Add new widgets under src/ and extend the build script to package them.
- Harden the server for production (authentication, caching, rate limiting) before deploying to user-facing environments (see the caching sketch below).
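As a small illustration of the caching point above, MCF responses could be memoised for a short window so repeated searches avoid re-hitting the upstream API. This is only a sketch; the fetch callable stands in for whatever query function the server actually uses.

# Hedged sketch of short-lived, in-memory caching for MCF queries.
# `fetch` is a stand-in for the server's real lookup function.
import time
from typing import Callable

_CACHE: dict[str, tuple[float, list[dict]]] = {}
_TTL_SECONDS = 60.0

def cached_search(query: str, fetch: Callable[[str], list[dict]]) -> list[dict]:
    """Return cached results for a query while they are fresher than the TTL."""
    now = time.monotonic()
    hit = _CACHE.get(query)
    if hit and now - hit[0] < _TTL_SECONDS:
        return hit[1]
    jobs = fetch(query)
    _CACHE[query] = (now, jobs)
    return jobs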
Contributing
Contributions are welcome, but please note that we may not be able to review every suggestion.
License
This project is licensed under the MIT License. See LICENSE for details.