OpenAPITools SDK
Your APIs, Now AI Tools. Build MCP servers in a minute.
kvssankar
Introduction
The OpenAPITools Python package enables developers to manage and execute tools across multiple AI API providers. It provides a unified interface for working with tools in Anthropic's Claude, OpenAI's GPT models, and LangChain frameworks.
With OpenAPITools, you can:
- Create tools as Python or Bash scripts with standardized input/output
- Access these tools through a single, consistent SDK
- Integrate tools with Claude, GPT, and LangChain models
- Build interactive chatbots that can use tools to solve complex tasks
Installation
Prerequisites
- Python 3.8 or later
- Access keys to at least one of the supported AI providers (Anthropic, OpenAI, or LangChain)
- Get an API key for OpenAPITools from the Settings page
Install from PyPI
```shell
pip install reacter-openapitools requests
```
If you're using the LangChain adapter, you'll also need to install `langchain` and `langchain-core`:
```shell
pip install langchain langchain-core
```
Tool Execution Details
Python Tools
- Python tools are executed using Python's `exec()` function directly in the current process
- Benefits:
  - No interpreter startup overhead
  - Full privacy (code runs locally)
  - Faster execution compared to subprocess methods
- Python tools receive arguments via an `input_json` dictionary and can access environment variables through `input_json["openv"]`
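As a rough sketch of this mechanism (the runner below is illustrative, not the SDK's actual implementation), a Python tool can be executed in-process by injecting `input_json` into the script's namespace before calling `exec()`:

```python
import json

# A hypothetical tool script: it reads its arguments from input_json,
# pulls environment variables from input_json["openv"], and stores its
# output in a well-known variable.
TOOL_SOURCE = """
city = input_json["city"]
api_base = input_json["openv"].get("API_BASE", "https://example.invalid")
result = {"summary": f"weather for {city} via {api_base}"}
"""

def run_python_tool(source: str, args: dict, env: dict) -> dict:
    # Build the namespace the tool sees, then exec() in this process:
    # no interpreter startup cost, and the code never leaves the machine.
    namespace = {"input_json": {**args, "openv": env}}
    exec(source, namespace)
    return namespace["result"]

output = run_python_tool(
    TOOL_SOURCE,
    {"city": "Berlin"},
    {"API_BASE": "https://api.example.com"},
)
print(output["summary"])  # → weather for Berlin via https://api.example.com
```

The variable names (`result`, the `openv` key) follow the conventions described above; the wrapper function itself is an assumption for illustration.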
Bash Tools
- Bash tools are executed as subprocesses
- Arguments are passed as JSON to the script's standard input
- Recommended for non-Python environments for better performance
- Note: Bash tools should be tested in Linux environments or WSL, as they may not function correctly in Windows
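The JSON-over-stdin contract can be sketched in plain Python (this illustrates the mechanism, not the SDK's actual code; the toy Bash tool simply wraps its input in a result object):

```python
import json
import subprocess

# A toy Bash tool: reads the JSON arguments from standard input and
# echoes a JSON result back on standard output.
BASH_TOOL = 'read -r payload; echo "{\\"received\\": $payload}"'

def run_bash_tool(script: str, args: dict) -> dict:
    # The arguments are passed as JSON on the subprocess's stdin,
    # and the tool's stdout is parsed as JSON.
    proc = subprocess.run(
        ["bash", "-c", script],
        input=json.dumps(args) + "\n",
        capture_output=True,
        text=True,
        check=True,
    )
    return json.loads(proc.stdout)

print(run_bash_tool(BASH_TOOL, {"city": "Berlin"}))
# → {'received': {'city': 'Berlin'}}
```

As the note above says, this requires a Linux environment or WSL with `bash` available.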
Usage Modes
Local Mode (preferred)
```python
adapter = ToolsAdapter(folder_path="/path/to/tools")
```
API Mode (rate limits apply)
```python
adapter = ToolsAdapter(api_key="your_api_key")
```
Performance Considerations
- Python Tools: Best for Python environments, executed in-process with minimal overhead
- Bash Tools: Better for non-Python servers or when isolation is needed
- For maximum performance in non-Python environments, prefer Bash tools
Security and Privacy
- All tool execution happens locally within your environment
- No code is sent to external servers for execution
- Environment variables can be securely passed to tools
Integration with AI Models
OpenAPITools provides native integration with:
- Anthropic's Claude
- OpenAI's GPT models
- LangChain frameworks
This allows you to build AI assistants that can leverage tools to perform complex tasks.
Visit docs.openapitools.com for more information on how to use the OpenAPITools SDK, including detailed examples and API references.