PsyFlow‑MCP · Usage Guide
A lightweight FastMCP server that lets a language model discover, clone, transform, download and localise PsyFlow task templates using a single entry‑point tool.
1 · Install & Run
```bash
# 1. Clone the psyflow-mcp repository
git clone https://github.com/TaskBeacon/psyflow-mcp.git
cd psyflow-mcp

# 2. Install runtime deps
pip install "mcp-sdk[fastmcp]" gitpython httpx ruamel.yaml

# 3. Launch the stdio server
python main.py
```
The process stays in the foreground and communicates with the LLM over STDIN/STDOUT via the Model‑Context‑Protocol (MCP).
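For reference, a stdio MCP client such as Claude Desktop can register the server with an entry along these lines; the server name and path below are placeholders, and the exact config format depends on your client:

```json
{
  "mcpServers": {
    "psyflow-mcp": {
      "command": "python",
      "args": ["/abs/path/to/psyflow-mcp/main.py"]
    }
  }
}
```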
2 · Conceptual Workflow
- The user describes the task they want (e.g. “Make a Stroop out of Flanker”).
- The LLM calls the `build_task` tool:
  - If the model already knows the best starting template, it passes `source_task`.
  - Otherwise it omits `source_task`, receives a menu created by `choose_template_prompt`, picks a repo, then calls `build_task` again with that repo.
- The server clones the chosen template and returns a Stage 0→5 instruction prompt (`transform_prompt`) plus the local template path.
- The LLM edits the files locally, optionally invokes `translate_config` to localise `config.yaml`, then zips / commits the new task.
3 · Exposed Tools
| Tool | Arguments | Purpose / Return |
|---|---|---|
| `build_task` | `target_task: str`, `source_task?: str` | Main entry‑point. With `source_task` → clones the repo and returns `prompt` (Stage 0→5) plus `template_path` (local clone). Without `source_task` → returns `prompt_messages` from `choose_template_prompt` so the LLM can pick the best starting template, then call `build_task` again. |
| `list_tasks` | none | Returns an array of objects: `{ repo, readme_snippet, branches }`, where `branches` lists up to 20 branch names for that repo. |
| `download_task` | `repo: str` | Clones any template repo from the registry and returns its local path. |
| `translate_config` | `task_path: str`, `target_language: str` | Reads `config.yaml`, wraps it in `translate_config_prompt`, and returns `prompt_messages` so the LLM can translate YAML fields in‑place. |
Why a single entry‑point?
`build_task` already covers both “discover a template” and “explicitly transform template X into Y”. A separate `transform_task` became redundant, so it has been removed.
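As an illustration of the shapes above, a `list_tasks` call might return something like the following; the repo names, snippets and branches are made up, and only the `{ repo, readme_snippet, branches }` structure comes from the table:

```json
[
  {
    "repo": "TaskBeacon/flanker",
    "readme_snippet": "Eriksen flanker task template built on PsyFlow...",
    "branches": ["main", "dev"]
  },
  {
    "repo": "TaskBeacon/go-nogo",
    "readme_snippet": "Go/No-Go template with configurable stimulus timing...",
    "branches": ["main"]
  }
]
```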
4 · Exposed Prompts
| Prompt | Parameters | Description |
|---|---|---|
| `transform_prompt` | `source_task`, `target_task` | Single User message containing the full Stage 0→5 instructions to convert `source_task` into `target_task`. |
| `choose_template_prompt` | `desc`, `candidates: list[{repo, readme_snippet}]` | Three User messages: task description, template list, and selection criteria. The LLM must reply with one repo name or the literal word NONE. |
| `translate_config_prompt` | `yaml_text`, `target_language` | Two‑message sequence: strict translation instruction + raw YAML. The LLM must return the fully‑translated YAML body with formatting preserved and no commentary. |
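For orientation, the three User messages produced by `choose_template_prompt` might look roughly like this; the exact wording lives in the server code, and the flat `{role, content}` shape is a simplification:

```json
[
  { "role": "user", "content": "Task description: a color-word Stroop with keyboard responses." },
  { "role": "user", "content": "Candidate templates:\n- flanker: Eriksen flanker task template...\n- go-nogo: Go/No-Go template..." },
  { "role": "user", "content": "Reply with exactly one repo name from the list above, or NONE if no template fits." }
]
```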
5 · Typical Call‑and‑Response
5.1 – Template Discovery
{ "tool": "build_task",
"arguments": { "target_task": "Stroop" }
}
Server → returns prompt_messages .
5.2 – LLM Chooses Template & Requests Build
{ "tool": "build_task",
"arguments": { "target_task": "Stroop",
"source_task": "Flanker" }
}
Server → returns Stage 0→5 prompt + template_path (cloned Flanker repo).
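Based on the tool table above, that response carries the transformation prompt and the clone location, roughly along these lines (field contents abbreviated and illustrative):

```json
{
  "prompt": "Stage 0: Inspect the cloned Flanker template... Stage 5: ...",
  "template_path": "/abs/path/to/clones/flanker"
}
```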
5.3 – Translating YAML (Optional)
{ "tool": "translate_config",
"arguments": { "task_path": "/abs/path/Flanker",
"target_language": "zh" }
}
Server → returns prompt_messages; LLM translates YAML and writes it back.
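To make the expected behaviour concrete, here is a hypothetical `config.yaml` fragment before and after translation to Chinese; the keys are invented for illustration, and the point is that keys and structure stay untouched while human‑readable values are translated:

```yaml
# Before (field names are illustrative, not taken from a real template)
instructions:
  welcome: "Press any key to begin."
  goodbye: "Thank you for participating!"

# After translate_config with target_language: "zh"
instructions:
  welcome: "按任意键开始。"
  goodbye: "感谢您的参与！"
```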
6 · Template Folder Layout
<repo>/
├─ config/
│ └─ config.yaml
├─ main.py
├─ src/
│ └─ run_trial.py
└─ README.md
Stage 0→5 assumes this structure.
Adjust `NON_TASK_REPOS`, network timeouts, or git clone depth to match your infrastructure.
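A sketch of what those knobs might look like near the top of `main.py`; apart from `NON_TASK_REPOS`, which is named above, the identifiers and values are assumptions to check against the actual source:

```python
# Hypothetical configuration constants (only NON_TASK_REPOS is named in this README;
# the other names and all values are assumptions).
NON_TASK_REPOS = {"psyflow", "psyflow-mcp", ".github"}  # registry repos that are not task templates
HTTP_TIMEOUT = 30.0   # seconds to wait on registry / GitHub requests (httpx)
CLONE_DEPTH = 1       # shallow git clone; raise it if you need full history
```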