Codex Gemini MCP

A TypeScript-based MCP server that provides tools to interact with local Codex and Gemini CLIs via stdio transport. It enables users to execute prompts through the ask_codex and ask_gemini tools, supporting custom models and timeout configurations.

codex-gemini-mcp-sample

A concise TypeScript MCP server example.

  • MCP tool ask_codex: forwards a prompt to the local codex CLI
  • MCP tool ask_gemini: forwards a prompt to the local gemini CLI
  • MCP background tools: wait_for_job, check_job_status, kill_job, list_jobs
  • Runs over stdio transport
  • Currently reflects the Phase E core stabilization work (including the output cap and model validation)

MCP_REVERSE_ENGINEERING.md is for reference only; this sample deliberately keeps its feature set minimal.

Requirements

  • Node.js 20+
  • codex CLI installed (npm i -g @openai/codex)
  • gemini CLI installed (npm i -g @google/gemini-cli)

Quick Start

npm install
npm run build
npm run start:codex
npm run start:gemini

Development mode:

npm run dev:codex
npm run dev:gemini

Example .mcp.json

{
  "mcpServers": {
    "codex-mcp": {
      "command": "node",
      "args": ["/absolute/path/to/codex-gemini-mcp/dist/mcp/codex-stdio-entry.js"]
    },
    "gemini-mcp": {
      "command": "node",
      "args": ["/absolute/path/to/codex-gemini-mcp/dist/mcp/gemini-stdio-entry.js"]
    }
  }
}

Tool Schemas

ask_codex

  • prompt (string, required)
  • model (string, optional)
    • only values matching the pattern [A-Za-z0-9][A-Za-z0-9._:-]* (max 128 characters) are accepted
  • timeout_ms (number, optional, default 600000)
    • values below 300000 return an error with retry guidance
    • 1800000 (30 minutes) is recommended for long-running tasks
  • working_directory (string, optional)
  • background (boolean, optional, default true)
  • reasoning_effort (string, optional: minimal | low | medium | high | xhigh)

ask_gemini

  • prompt (string, required)
  • model (string, optional)
    • only values matching the pattern [A-Za-z0-9][A-Za-z0-9._:-]* (max 128 characters) are accepted
  • timeout_ms (number, optional, default 600000)
    • values below 300000 return an error with retry guidance
    • 1800000 (30 minutes) is recommended for long-running tasks
  • working_directory (string, optional)
  • background (boolean, optional, default true)
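
The model pattern and timeout floor above can be sketched as follows. This is an illustrative sketch, not the sample's actual code; the helper names and error text are assumptions.

```typescript
// Input checks implied by the tool schemas above (illustrative names).
const MODEL_PATTERN = /^[A-Za-z0-9][A-Za-z0-9._:-]*$/;
const MAX_MODEL_LENGTH = 128;
const MIN_TIMEOUT_MS = 300_000; // requests below this are rejected

function validateModel(model: string): boolean {
  return model.length <= MAX_MODEL_LENGTH && MODEL_PATTERN.test(model);
}

// Returns an error message with retry guidance, or null if the timeout is acceptable.
function validateTimeout(timeoutMs: number): string | null {
  if (timeoutMs < MIN_TIMEOUT_MS) {
    return `timeout_ms must be >= ${MIN_TIMEOUT_MS}; retry with a larger value (1800000 recommended for long-running tasks)`;
  }
  return null;
}
```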

wait_for_job

  • job_id (string, required, 8-character hex)
  • timeout_ms (number, optional, default 3600000, max 3600000)

check_job_status

  • job_id (string, required, 8-character hex)

kill_job

  • job_id (string, required, 8-character hex)
  • signal (string, optional: SIGTERM | SIGINT, default SIGTERM)

list_jobs

  • status_filter (string, optional: active | completed | failed | all, default active)
  • limit (number, optional, default 50)
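
The job_id format shared by the tools above (8-character hex) might be checked like this. Case handling is an assumption; the sample may emit lowercase only.

```typescript
// 8-character hexadecimal job id, as used by wait_for_job / check_job_status /
// kill_job. Accepting uppercase is an assumption.
const JOB_ID_PATTERN = /^[0-9a-fA-F]{8}$/;

function isValidJobId(jobId: string): boolean {
  return JOB_ID_PATTERN.test(jobId);
}
```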

Runtime Notes

  • ask_codex: invokes codex exec --ephemeral (when reasoning_effort is set, -c model_reasoning_effort=... is appended)
  • ask_gemini: invokes gemini --prompt <text>
  • ask_* calls run with background defaulting to true when unspecified
  • with background: true, status and input/output (content) files are written under .codex-gemini-mcp/jobs and .codex-gemini-mcp/prompts
  • structured logging (JSONL): .codex-gemini-mcp/logs/mcp-YYYY-MM-DD.jsonl
    • default: metadata only (message bodies are not stored)
    • MCP_LOG_PREVIEW=1: stores a preview
    • MCP_LOG_FULL_TEXT=1: stores the full text
    • log events are mirrored to stderr in addition to the JSONL file
  • model selection priority: request.model > env default > hardcoded default
    • codex env: MCP_CODEX_DEFAULT_MODEL (default: gpt-5.3-codex)
    • gemini env: MCP_GEMINI_DEFAULT_MODEL (default: gemini-3-pro-preview)
  • when timeout_ms is unspecified, the default is MCP_CLI_TIMEOUT_MS or 600000 ms (10 minutes)
  • requests with timeout_ms < 300000 are rejected, and retry guidance is returned
  • if the combined stdout + stderr output exceeds MCP_MAX_OUTPUT_BYTES, the run terminates with CLI_OUTPUT_LIMIT_EXCEEDED
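
The combined stdout + stderr cap in the last bullet can be sketched as follows. The class and method names are illustrative, not the sample's actual code; in the real server the limit comes from MCP_MAX_OUTPUT_BYTES.

```typescript
// Sketch of the combined stdout+stderr output cap (illustrative shape).
const DEFAULT_MAX_OUTPUT_BYTES = 1_048_576; // 1 MiB, the documented default

class OutputCap {
  private total = 0;
  constructor(private readonly limit: number = DEFAULT_MAX_OUTPUT_BYTES) {}

  // Feed every chunk from both streams through the same counter.
  // Returns false once the limit is exceeded, at which point the caller
  // would kill the child process and report CLI_OUTPUT_LIMIT_EXCEEDED.
  accept(chunk: Uint8Array): boolean {
    this.total += chunk.byteLength;
    return this.total <= this.limit;
  }
}
```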

Logging by background

  • common (both background true and false): a request event and a terminal event (response or error) are written to the JSONL log, so calls can be traced primarily by request_id
  • background: false (foreground): log events carry no job_id, and no jobs/ or prompts/ files are created
  • background: true (background):
    • the MCP response returns jobId, contentFile, and statusFile
    • JSONL response/error events record job_id
    • the status files under jobs/ and the content files under prompts/ store the requestId
    • request_id <-> job_id can therefore be mapped in both directions via the logs and the status files
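
The request_id/job_id mapping above suggests JSONL events shaped roughly like this. Field and helper names are assumptions, not the sample's actual schema.

```typescript
// Illustrative shape of one JSONL log event (field names are assumptions).
interface LogEvent {
  ts: string;                               // ISO timestamp
  event: "request" | "response" | "error";  // terminal events are response/error
  request_id: string;                       // present on every event
  job_id?: string;                          // only for background: true calls
}

// One JSON object per line (JSONL).
function formatLogLine(event: LogEvent): string {
  return JSON.stringify(event);
}

// Daily log file name: mcp-YYYY-MM-DD.jsonl
function logFileName(date: Date): string {
  return `mcp-${date.toISOString().slice(0, 10)}.jsonl`;
}
```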

Environment Variables

  • MCP_CODEX_DEFAULT_MODEL: default codex model
  • MCP_GEMINI_DEFAULT_MODEL: default gemini model
  • MCP_CLI_TIMEOUT_MS: default CLI timeout (ms)
  • MCP_MAX_OUTPUT_BYTES: maximum output bytes (cap, default 1048576 = 1MiB)
  • MCP_RUNTIME_DIR: base root for runtime files (.codex-gemini-mcp)
  • MCP_LOG_DIR: log path override
  • MCP_LOG_PREVIEW: whether to store log previews (1 to enable)
  • MCP_LOG_FULL_TEXT: whether to store full-text logs (1 to enable)
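
Depending on the MCP client, these variables can usually be set per server in the client configuration. For example, extending the .mcp.json entry above with an env map (support for the env field varies by client; the values shown are the documented defaults):

```json
{
  "mcpServers": {
    "codex-mcp": {
      "command": "node",
      "args": ["/absolute/path/to/codex-gemini-mcp/dist/mcp/codex-stdio-entry.js"],
      "env": {
        "MCP_CODEX_DEFAULT_MODEL": "gpt-5.3-codex",
        "MCP_CLI_TIMEOUT_MS": "600000",
        "MCP_LOG_PREVIEW": "1"
      }
    }
  }
}
```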

Current Status

  • MCP entry points: dist/mcp/codex-stdio-entry.js, dist/mcp/gemini-stdio-entry.js
  • Verified: ask_codex and ask_gemini succeed in real foreground/background calls
  • Verified: wait_for_job, check_job_status, kill_job, and list_jobs succeed in real calls
  • Implemented: structured logging (Phase D)
  • Implemented: output cap enforcement + model regex validation

Scope (deliberately minimal)

This sample deliberately omits the following features:

  • model fallback chain
  • standalone bridge bundling
