Infor Birst MCP Server
An MCP (Model Context Protocol) server that exposes Infor Birst's analytics platform capabilities as AI-consumable tools.
Features
20 MCP Tools enabled by default, organized in 5 tiers (up to 25 with optional features):
Tier 1: Core Query Tools (0-5 tools, optional)
These tools require additional Birst entitlements and are disabled by default:
GenAI Tools (require Birst AI entitlement - set BIRST_ENABLE_GENAI=true):
- `birst_generate_bql` - Convert natural language to BQL queries
- `birst_search_data` - Search data warehouse with natural language
- `birst_generate_chart` - Generate chart specifications from text
ICW Tools (require application provisioning - set BIRST_ENABLE_ICW=true):
- `birst_execute_query` - Execute BQL queries and return results
- `birst_validate_query` - Validate BQL syntax before execution
Tier 2: Discovery Tools (6 tools)
- `birst_list_spaces` - List all accessible analytical spaces
- `birst_get_space` - Get detailed space information
- `birst_list_reports` - List reports in a space (with pagination)
- `birst_list_dashboards` - List dashboards in collections
- `birst_list_collections` - List collections in a space
- `birst_search_catalog` - Search catalog entities
Tier 3: Infrastructure Tools (5 tools)
- `birst_list_connections` - List data connections
- `birst_list_sources` - List data sources
- `birst_list_variables` - List space variables
- `birst_list_hierarchies` - List dimensional hierarchies
- `birst_get_dataflow` - Get data flow visualization
Tier 4: Workflow Tools (4 tools)
- `birst_list_workflows` - List available workflows
- `birst_run_workflow` - Trigger workflow execution
- `birst_get_workflow_status` - Monitor workflow status
- `birst_list_workflow_runs` - Get workflow execution history
Tier 5: Administration Tools (4 tools)
- `birst_list_users` - List users in account
- `birst_list_space_users` - List users with space access
- `birst_list_account_groups` - List account groups
- `birst_list_space_groups` - List space groups
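Any MCP client can discover and invoke these tools programmatically. The sketch below uses the MCP TypeScript SDK and assumes the server has already been built (see Installation and Usage below); the server path and BIRST_ENV value are placeholders for your own setup.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the built server over stdio; path and BIRST_ENV are placeholders.
const transport = new StdioClientTransport({
  command: "node",
  args: ["/path/to/infor-birst-mcp/dist/index.js"],
  env: { BIRST_ENV: "TST" },
});

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Discover the registered tools, then call one of the discovery-tier tools.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

const spaces = await client.callTool({ name: "birst_list_spaces", arguments: {} });
console.log(spaces.content);

await client.close();
```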
Installation
npm install
Configuration
Environment Variables
| Variable | Description | Default |
|---|---|---|
| `BIRST_ENV` | Environment to use (TST, PRD, TRN) | TST |
| `BIRST_IONAPI_PATH` | Custom path to .ionapi file | Auto-detected |
| `BIRST_LOG_LEVEL` | Log level (debug, info, warn, error) | info |
| `BIRST_ENABLE_GENAI` | Enable GenAI tools (requires Birst AI entitlement) | false |
| `BIRST_ENABLE_ICW` | Enable ICW query tools (requires app provisioning) | false |
ION API Credentials
Place your .ionapi files in the project root or credentials/ directory:
infor-birst-mcp/
├── TST.ionapi # Test environment
├── PRD.ionapi # Production environment
└── TRN.ionapi # Training environment
The .ionapi file is obtained from Infor Cloud Suite and contains OAuth2 credentials.
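The auto-detection described above could look roughly like the sketch below. The lookup order shown (explicit BIRST_IONAPI_PATH, then the project root, then credentials/) is an assumption based on this section, not the server's confirmed behavior.

```typescript
import { existsSync } from "node:fs";
import { resolve } from "node:path";

// Hypothetical sketch of .ionapi auto-detection: prefer an explicit
// BIRST_IONAPI_PATH, then <ENV>.ionapi in the project root, then credentials/.
function resolveIonApiPath(env: string = process.env.BIRST_ENV ?? "TST"): string {
  const explicit = process.env.BIRST_IONAPI_PATH;
  if (explicit && existsSync(explicit)) return explicit;

  const candidates = [
    resolve(process.cwd(), `${env}.ionapi`),
    resolve(process.cwd(), "credentials", `${env}.ionapi`),
  ];
  const found = candidates.find(existsSync);
  if (!found) {
    throw new Error(`No .ionapi file found for environment ${env}`);
  }
  return found;
}
```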
Usage
Development
npm run dev
Production
npm run build
npm start
Claude Code Configuration
Add to your Claude Code MCP settings:
{
"mcpServers": {
"infor-birst": {
"command": "node",
"args": ["/path/to/infor-birst-mcp/dist/index.js"],
"env": {
"BIRST_ENV": "TST"
}
}
}
}
Or for development:
{
"mcpServers": {
"infor-birst": {
"command": "npx",
"args": ["tsx", "/path/to/infor-birst-mcp/src/index.ts"],
"env": {
"BIRST_ENV": "TST"
}
}
}
}
Example Usage
Once configured, you can use the tools in Claude:
"List all my Birst spaces"
→ Uses birst_list_spaces
"Show me reports in this space"
→ Uses birst_list_reports
"What workflows are scheduled to run?"
→ Uses birst_list_workflows with filter=isScheduled
"Who has access to this space?"
→ Uses birst_list_space_users
With GenAI enabled (BIRST_ENABLE_GENAI=true):
"Show me sales by region for 2024"
→ Uses birst_generate_bql + birst_execute_query
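Reusing the client from the sketch in the Features section, the two-step GenAI flow might look like the following. The argument names (question, spaceId, query) are assumptions; inspect the schemas returned by listTools() for the real ones, and note that birst_execute_query also requires BIRST_ENABLE_ICW=true.

```typescript
// Hypothetical two-step flow behind "Show me sales by region for 2024".
// Argument names are assumptions; check the actual tool schemas first.
const generated = (await client.callTool({
  name: "birst_generate_bql",
  arguments: { question: "Show me sales by region for 2024", spaceId: "<space-id>" },
})) as { content: Array<{ type: string; text?: string }> };

// Assume the generated BQL comes back as a text content item.
const bql = generated.content.find((c) => c.type === "text")?.text ?? "";

const results = await client.callTool({
  name: "birst_execute_query",
  arguments: { query: bql, spaceId: "<space-id>" },
});
console.log(results.content);
```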
Supported Environments
- TST - Test environment
- PRD - Production environment
- TRN - Training environment
Switch environments by setting the BIRST_ENV variable; if it is not set, the default (TST) is used.
API Coverage
This MCP server interfaces with 3 Birst APIs:
| API | Endpoints | Description |
|---|---|---|
| REST API | 156 | Space management, users, workflows |
| GenAI API | 3 | Natural language to BQL, chart generation |
| ICW API | 5 | Query execution and validation |
Security
- Credentials are stored in .ionapi files (gitignored)
- OAuth2 tokens are automatically refreshed
- All API calls are authenticated via Bearer token
- Sensitive data is never logged
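The bullets above describe a standard OAuth2 Bearer-token pattern; a generic sketch of it is shown below. The token endpoint, grant type, and credential fields actually come from the .ionapi file, so the names used here are placeholders rather than the server's real implementation.

```typescript
// Illustration of the general pattern: cache a token, refresh it before expiry,
// and attach it as a Bearer header on every API call. Not the server's actual code.
let cached: { token: string; expiresAt: number } | null = null;

async function getAccessToken(tokenUrl: string, clientId: string, clientSecret: string): Promise<string> {
  if (cached && Date.now() < cached.expiresAt - 60_000) return cached.token; // refresh 1 min early
  const res = await fetch(tokenUrl, {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: new URLSearchParams({
      grant_type: "client_credentials", // placeholder; the real grant comes from the .ionapi setup
      client_id: clientId,
      client_secret: clientSecret,
    }),
  });
  const data = (await res.json()) as { access_token: string; expires_in: number };
  cached = { token: data.access_token, expiresAt: Date.now() + data.expires_in * 1000 };
  return cached.token;
}

// Each Birst API call then authenticates with:
// fetch(url, { headers: { Authorization: `Bearer ${token}` } })
```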
License
MIT