SJTU MCP

Turn your SJTU Zhiyuan No.1 API key into something you can actually use in Claude Code and Codex.

SJTU MCP wraps the SJTU-hosted model API as a local MCP server, so you can call these models directly from your normal agent workflow instead of hand-writing integration scripts over and over again.

Why This Exists

Have you applied for an SJTU Zhiyuan No.1 API key, only to find it hard to actually use in practice?

This project exists to solve exactly that problem:

  • you already have API access
  • you want to use it from Claude Code or Codex
  • but the SJTU endpoint itself does not plug directly into these agent tools out of the box
  • you do not want to rewrite the integration layer every time

Highlights

  • Supports Claude Code
  • Supports Codex
  • Supports both text and vision tasks
  • Uses the SJTU OpenAI-compatible endpoint
  • Fits naturally into existing MCP workflows

Quick Start

For most users, the simplest path is:

  1. git clone this repo
  2. cd into the project directory
  3. install it once
  4. add it as a global MCP server in Claude Code or Codex

git clone https://github.com/EternalWavee/sjtu-mcp.git
cd sjtu-mcp
pip install -e .

After installation, your MCP client can start the server automatically when needed. In normal use, you do not need to manually run the server command every time.

Environment Variables

Required:

  • SJTU_API_KEY

Optional:

  • SJTU_API_BASE_URL
  • SJTU_DEFAULT_TEXT_MODEL
  • SJTU_DEFAULT_REASONING_MODEL
  • SJTU_DEFAULT_VISION_MODEL
  • SJTU_REQUEST_TIMEOUT

How to use them:

  • .env.example is only a template showing which variables you need
  • in actual use, put these values into the env block of your MCP configuration
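
To make the variables above concrete, here is a minimal sketch of how a server like this could resolve them. The variable names, fallback values, and model names come from this README's config examples; the function `load_settings` itself is illustrative and not part of this project's API.

```python
import os

def load_settings(env=os.environ):
    # SJTU_API_KEY is the only required variable; everything else
    # falls back to the defaults shown in the config examples below.
    api_key = env.get("SJTU_API_KEY")
    if not api_key:
        raise RuntimeError("SJTU_API_KEY is required")
    return {
        "api_key": api_key,
        "base_url": env.get("SJTU_API_BASE_URL", "https://models.sjtu.edu.cn/api/v1"),
        "text_model": env.get("SJTU_DEFAULT_TEXT_MODEL", "deepseek-chat"),
        "reasoning_model": env.get("SJTU_DEFAULT_REASONING_MODEL", "deepseek-reasoner"),
        "vision_model": env.get("SJTU_DEFAULT_VISION_MODEL", "qwen3vl"),
        "timeout": float(env.get("SJTU_REQUEST_TIMEOUT", "180")),
    }
```

Since the MCP client passes its env block to the spawned server process, this is also why the values belong in the MCP configuration rather than in a committed .env file.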

Claude Code

Recommended: User Scope

Use this if you want sjtu available in all your Claude Code projects on this machine.

claude mcp add sjtu --scope user -- python -m sjtu_mcp.server

Then:

  1. open ~/.claude.json
  2. find the sjtu entry
  3. copy the env section from examples/claude-project.mcp.json
  4. replace your-api-key with your real key

Verify:

claude mcp list

Project Scope

Use this if you want to commit a shared config into the repo for teammates.

How to use it:

  1. copy examples/claude-project.mcp.json into your project root as .mcp.json
  2. replace your-api-key with your real key
  3. adjust default models and timeout if needed

Windows / macOS example:

{
  "mcpServers": {
    "sjtu": {
      "command": "python",
      "args": ["-m", "sjtu_mcp.server"],
      "env": {
        "SJTU_API_BASE_URL": "https://models.sjtu.edu.cn/api/v1",
        "SJTU_API_KEY": "your-api-key",
        "SJTU_DEFAULT_TEXT_MODEL": "deepseek-chat",
        "SJTU_DEFAULT_REASONING_MODEL": "deepseek-reasoner",
        "SJTU_DEFAULT_VISION_MODEL": "qwen3vl",
        "SJTU_REQUEST_TIMEOUT": "180"
      }
    }
  }
}

Local Scope

Use this if you only want the server for the current project and do not want to commit the config.

claude mcp add sjtu --scope local -- python -m sjtu_mcp.server

Then add the same env values to the corresponding MCP config entry.

Codex

Recommended: Global Setup

Use this if you want sjtu available in all your Codex projects on this machine.

codex mcp add sjtu -- python -m sjtu_mcp.server

Then:

  1. open ~/.codex/config.toml
  2. copy the content from examples/codex-config.toml
  3. replace your-api-key with your real key
  4. save, then restart Codex or reload its MCP servers

Verify:

codex mcp list

Config File Setup

If you already manage ~/.codex/config.toml directly, you can use this template:

[mcp_servers.sjtu]
command = "python"
args = ["-m", "sjtu_mcp.server"]

[mcp_servers.sjtu.env]
SJTU_API_BASE_URL = "https://models.sjtu.edu.cn/api/v1"
SJTU_API_KEY = "your-api-key"
SJTU_DEFAULT_TEXT_MODEL = "deepseek-chat"
SJTU_DEFAULT_REASONING_MODEL = "deepseek-reasoner"
SJTU_DEFAULT_VISION_MODEL = "qwen3vl"
SJTU_REQUEST_TIMEOUT = "180"

Tools

  • sjtu_models
  • sjtu_text
  • sjtu_vision
  • sjtu_cheap_task
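
Under the hood, these tools call the OpenAI-compatible /chat/completions route that the Notes section says the endpoint supports. As a hedged sketch of what a text request looks like on the wire (the helper `build_chat_request` is illustrative, not part of this project):

```python
import json
import urllib.request

BASE_URL = "https://models.sjtu.edu.cn/api/v1"

def build_chat_request(api_key, model, prompt):
    # Standard OpenAI chat-completions payload: a model name plus a
    # list of role/content messages.
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# To actually send it (requires a valid key and network access):
# with urllib.request.urlopen(build_chat_request(key, "deepseek-chat", "hi")) as r:
#     print(json.load(r)["choices"][0]["message"]["content"])
```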

Example

Input

Call sjtu_vision to analyze the contents of the image .assets/test.png

[input image: .assets/test.png]

Output

[screenshot: the model's answer]

Suggested Model Usage

  • deepseek-chat
    • default for summaries, rewrites, cleanup, and low-risk text tasks
  • minimax or glm-5
    • useful for lightweight rewriting, classification, or extraction
  • deepseek-reasoner
    • better for tasks that truly need multi-step reasoning
  • qwen3vl
    • a strong starting point for screenshots, OCR-style extraction, and image understanding
  • qwen3coder
    • useful for code-adjacent utility tasks
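
The suggestions above amount to a small task-to-model routing table. A minimal sketch, where only the model names come from this README and the task labels are made up for illustration:

```python
# Map task categories to the models suggested above.
MODEL_BY_TASK = {
    "summarize": "deepseek-chat",
    "rewrite": "deepseek-chat",
    "classify": "glm-5",
    "extract": "minimax",
    "reason": "deepseek-reasoner",
    "vision": "qwen3vl",
    "code": "qwen3coder",
}

def pick_model(task: str) -> str:
    # Unknown tasks fall back to the cheap general-purpose model.
    return MODEL_BY_TASK.get(task, "deepseek-chat")
```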

Notes

  • This server currently assumes the SJTU endpoint supports OpenAI-compatible /models and /chat/completions.
  • Local images are encoded as data URLs before sending.
  • If your campus endpoint has model-specific quirks, extend the routing in src/sjtu_mcp/server.py.
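
The data-URL encoding mentioned above follows the common data:&lt;mime&gt;;base64,&lt;payload&gt; form. A self-contained sketch (the server's actual implementation in src/sjtu_mcp/server.py may differ in details):

```python
import base64
import mimetypes

def to_data_url(path: str) -> str:
    # Guess the MIME type from the file extension, falling back to a
    # generic binary type, then embed the base64-encoded bytes inline.
    mime, _ = mimetypes.guess_type(path)
    mime = mime or "application/octet-stream"
    with open(path, "rb") as f:
        payload = base64.b64encode(f.read()).decode("ascii")
    return f"data:{mime};base64,{payload}"
```

This keeps vision requests self-contained: the remote endpoint never needs filesystem access to your local images.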
