sequential-thinking-mcp

Simple sequential thinking MCP in Python


Tools

think

Tool for advanced meta-cognition and dynamic, reflective problem-solving via thought logging. Supports thread following, step tracking, self-correction, and tool recommendations. For each new user message, begin a new thought thread and log each thought after each completed step.

Key functionalities:

  • Agentic Workflow Orchestration: Guides through complex tasks by breaking them into precise, manageable, traceable steps.
  • Automatic Smart Thinking: Avoids over-questioning users about their intentions and figures out how to proceed on its own.
  • Iterative Refinement: Assesses the success of each step and self-corrects if necessary, adapting to new information or errors (failures, empty results, etc.).
  • Tool Recommendation: Suggests specific, relevant available tools (`tool_recommendation`) to execute planned actions or gather necessary information.
  • Proactive Planning: Uses `left_to_be_done` for explicit future-state management and task estimation.

Args:

  • `thread_purpose` (str): A concise, high-level objective or thematic identifier for the current thought thread. Essential for organizing complex problem-solving trajectories.
  • `thought` (str): The detailed, atomic unit of reasoning or action taken by the AI agent at the current step. This forms the core of the agent's internal monologue.
  • `thought_index` (int): A monotonically increasing integer representing the sequence of thoughts within a specific `thread_purpose`. Crucial for chronological tracking and revision targeting.
  • `tool_recommendation` (str, optional): A precise, actionable suggestion for the next tool to invoke, directly following the current thought. Omitted if no tool is needed.
  • `left_to_be_done` (str, optional): A flexible, forward-looking statement outlining the next steps or sub-goals to complete within the current `thread_purpose`. Supports multi-step planning and progress tracking. Omitted if no further action is needed.

Example thought process:

  1. user: "I keep hearing about central banks, but I don't understand what they are and how they work."
  2. think(thread_purpose="Central banks explained", thought="Requires information about central banks and how they work. Consider using <named_tool> tool.", thought_index=1, tool_recommendation="<named_tool>", left_to_be_done="Summarize the findings and create an exhaustive graph representation")
  3. call <named_tool>
  4. think(thread_purpose="Central banks explained", thought="Summary of the findings is clear and exhaustive; I have enough information. Must create the graph with <named_tool>.", thought_index=2, tool_recommendation="<named_tool>", left_to_be_done="Send summary and graph to the user")
  5. call <named_tool>
  6. final: respond with the summary and graph (no need to call think, since left_to_be_done is a simple final step)
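For illustration, here is a minimal sketch of the argument payloads for two consecutive think calls within one thread, matching the parameters documented above; web_search and graph_builder are hypothetical placeholder tool names, not part of this package.

# Hypothetical payloads for two consecutive `think` calls in one thread.
# "web_search" and "graph_builder" are placeholder tool names.
think_step_1 = {
    "thread_purpose": "Central banks explained",
    "thought": "Requires information about central banks. Consider using web_search.",
    "thought_index": 1,
    "tool_recommendation": "web_search",
    "left_to_be_done": "Summarize the findings and create a graph representation",
}
think_step_2 = {
    "thread_purpose": "Central banks explained",
    "thought": "Summary is clear and exhaustive. Create the graph with graph_builder.",
    "thought_index": 2,  # monotonically increasing within the same thread_purpose
    "tool_recommendation": "graph_builder",
    "left_to_be_done": "Send summary and graph to the user",
}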

README

Sequential Thinking MCP


This repository provides an MCP (Model Context Protocol) server that enables an AI agent to perform advanced meta-cognition and dynamic, reflective problem-solving.

<a href="https://glama.ai/mcp/servers/@philogicae/sequential-thinking-mcp"> <img width="380" height="200" src="https://glama.ai/mcp/servers/@philogicae/sequential-thinking-mcp/badge?cache-control=no-cache" alt="Sequential Thinking MCP" /> </a>

Table of Contents

  • Features
  • Setup
  • Usage
  • Changelog
  • Contributing
  • License

Features

  • Advanced Meta-Cognition: Provides a think tool for dynamic and reflective problem-solving through thought logging.
  • Agentic Workflow Orchestration: Guides AI agents through complex tasks by breaking them into precise, manageable, and traceable steps.
  • Iterative Refinement: Assesses the success of each step and self-corrects if necessary, adapting to new information or errors.
  • Proactive Planning: Utilizes left_to_be_done for explicit future state management and task estimation.
  • Tool Recommendation: Suggests specific tools to execute planned actions or gather necessary information.

Setup

Prerequisites

  • Python 3.10+
  • uv (for local development)

Installation

Choose one of the following installation methods.

Install from PyPI (Recommended)

This method is best for using the package as a library or running the server without modifying the code.

  1. Install the package from PyPI:
     pip install sequential-thinking-mcp
  2. Run the MCP server:
     python -m sequential_thinking

For Local Development

This method is for contributors who want to modify the source code. Using uv:

  1. Clone the repository:
     git clone https://github.com/philogicae/sequential-thinking-mcp.git
     cd sequential-thinking-mcp
  2. Install dependencies using uv:
     uv sync
  3. Run the MCP server:
     uv run -m sequential_thinking

Usage

As MCP Server

from sequential_thinking import mcp

mcp.run(transport="sse")
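
The snippet above uses the SSE transport. As a minimal sketch, and assuming the exported server object accepts the standard FastMCP transport names, the other transports referenced in the client configuration below can be selected the same way:

from sequential_thinking import mcp

# Assumption: the exported server accepts the standard FastMCP transport names.
# "streamable-http" serves the /mcp endpoint referenced in the client configuration below;
# "stdio" is typically used when a client spawns the server as a subprocess.
mcp.run(transport="streamable-http")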

Via MCP Clients

Usable with any MCP-compatible client. Available tools:

  • think: Log a thought, plan next steps, and recommend tools.
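
For illustration, a minimal client-side sketch using the official mcp Python SDK over SSE; it assumes the server is already running with the SSE transport at the default URL shown in the configuration below, and the argument values are placeholders.

import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client


async def main() -> None:
    # Connect to the locally running server (SSE transport assumed).
    async with sse_client("http://127.0.0.1:8000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Log the first thought of a new thread.
            result = await session.call_tool(
                "think",
                arguments={
                    "thread_purpose": "Central banks explained",
                    "thought": "Gather background information on central banks.",
                    "thought_index": 1,
                    "left_to_be_done": "Summarize the findings for the user",
                },
            )
            print(result)


asyncio.run(main())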

Example with Windsurf

Configuration:

{
  "mcpServers": {
    ...
    # with stdio (only requires uv)
    "sequential-thinking-mcp": {
      "command": "uvx",
      "args": [ "sequential-thinking-mcp" ]
    },
    # with sse transport (requires installation)
    "sequential-thinking-mcp": {
      "serverUrl": "http://127.0.0.1:8000/sse"
    },
    # with streamable-http transport (requires installation)
    "sequential-thinking-mcp": {
      "serverUrl": "http://127.0.0.1:8000/mcp" # not yet supported by every client
    },
    ...
  }
}

Changelog

See CHANGELOG.md for a history of changes to this project.

Contributing

Contributions are welcome! Please open an issue or submit a pull request.

License

This project is licensed under the MIT License - see the LICENSE file for details.
