
Maximo MCP Server
This project implements an MCP Server for the IBM Maximo API. It provides a set of tools to interact with Maximo resources like Assets, Work Orders, etc.
High-Level Flow
- The MCP client sends a request to the MCP Server.
- The MCP Server receives the request and calls the appropriate tool function.
- The tool function makes a request to the Maximo API.
- The Maximo API returns a response to the tool function.
- The tool function returns the response to the MCP Server.
- The MCP Server returns the response to the MCP client.
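For example, an end-to-end call through this flow might look like the sketch below. The route /tools/get_asset and the asset_id parameter are illustrative assumptions; check mcp_server.py and manifest.json for the actual route and parameter names.

```python
import requests

# Call the get_asset tool on the MCP server (assumed route and parameter name).
response = requests.post(
    "http://localhost:5001/tools/get_asset",   # assumed route exposed by mcp_server.py
    json={"asset_id": "11430"},                # assumed parameter name; see manifest.json
    timeout=30,
)
response.raise_for_status()
print(response.json())  # asset details fetched from the Maximo API and relayed by the server
```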
Files
- mcp_server.py: The main application file. It contains the Flask server and the tool implementations.
- requirements.txt: The project dependencies.
- .env: The environment variables for the project.
- manifest.json: The tool manifest file.
- README.md: This file.
Tools
- get_asset: Retrieves the details of a specific asset by its ID.
- list_assets: Lists all assets, with optional filtering and pagination.
Note on HTTP Methods
The tool endpoints use the POST method to receive parameters in a JSON payload, which is a standard practice for MCP servers, even for operations that fetch data.
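As a rough illustration, such an endpoint could be implemented in Flask along these lines. This is a sketch, not the actual code in mcp_server.py: the route name, the parameter name, and the Maximo object-structure path (/api/os/mxapiasset) are assumptions to be checked against your instance and the manifest.

```python
import os

import requests
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/tools/get_asset", methods=["POST"])         # assumed route name
def get_asset():
    params = request.get_json(force=True) or {}
    asset_id = params.get("asset_id")                     # assumed parameter name
    # Forward the request to the Maximo REST API (typical object-structure path and API-key header).
    resp = requests.get(
        f"{os.environ['MAXIMO_API_URL']}/api/os/mxapiasset",
        headers={"apikey": os.environ["MAXIMO_API_KEY"]},
        params={"oslc.where": f'assetnum="{asset_id}"', "lean": 1},
        timeout=30,
    )
    return jsonify(resp.json()), resp.status_code

if __name__ == "__main__":
    app.run(port=5001)
```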
list_assets
Parameters
- page_size (optional, default: 10): The number of assets to return per page.
- page_num (optional, default: 1): The page number to return.
- where (optional): A filter to apply to the query. The value should be a valid Maximo oslc.where clause. For example, to filter for assets with a status of "OPERATING", you would use "status=\"OPERATING\"".
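Putting that together, a request to list_assets with a status filter might look like the sketch below. The /tools/list_assets route and the shape of the response are assumptions; Maximo typically returns matching records under a "member" key when lean mode is used.

```python
import requests

# Ask the list_assets tool for the first 5 operating assets (assumed route name).
payload = {
    "page_size": 5,
    "page_num": 1,
    "where": 'status="OPERATING"',   # passed through to Maximo as the oslc.where clause
}
response = requests.post("http://localhost:5001/tools/list_assets", json=payload, timeout=30)
response.raise_for_status()
for asset in response.json().get("member", []):   # "member" is typical for lean Maximo responses; adjust if needed
    print(asset.get("assetnum"), asset.get("description"))
```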
Running the Maximo AI Assistant
This project includes an interactive web application built with Streamlit that allows you to chat with an AI assistant powered by Gemini and your Maximo MCP server.
1. Set Up Environment
First, install the required Python packages:
pip install -r requirements.txt
You will also need to create a .env file in the root of the project with your Maximo and Google API keys:
MAXIMO_API_URL=https://your-maximo-instance.com
MAXIMO_API_KEY=your-maximo-api-key
GOOGLE_API_KEY=your-google-api-key
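For reference, the server and the Streamlit app can pick these values up from .env at startup; a minimal sketch, assuming python-dotenv is included in requirements.txt:

```python
import os

from dotenv import load_dotenv  # assumes python-dotenv is listed in requirements.txt

load_dotenv()  # loads the .env file from the project root into the environment

MAXIMO_API_URL = os.environ["MAXIMO_API_URL"]
MAXIMO_API_KEY = os.environ["MAXIMO_API_KEY"]
GOOGLE_API_KEY = os.environ["GOOGLE_API_KEY"]
```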
2. Run the MCP Server
In your first terminal, start the MCP server:
python mcp_server.py
The server will start on http://localhost:5001. Keep this terminal running.
3. Run the Streamlit App
In a new terminal window, run the Streamlit application:
streamlit run streamlit_app.py
The application will open in your web browser. You can now chat with the Maximo AI Assistant and ask it questions about your assets.