MCP Connector: Integrating AI agent with Data Warehouse in Microsoft Fabric

MCP Client and Server apps to demo the integration of an Azure OpenAI-based AI agent with a Data Warehouse exposed through GraphQL in Microsoft Fabric.

This repo demonstrates the integration of an Azure OpenAI-powered AI agent with a Microsoft Fabric data warehouse using the Model Context Protocol (MCP), an open integration standard for AI agents developed by Anthropic.

MCP enables dynamic discovery of tools, data resources and prompt templates (with more capabilities planned), unifying the way they are integrated with AI agents. GraphQL, in turn, provides an abstraction layer over the underlying data source. Below you will find detailed steps on how to combine MCP and GraphQL to enable bidirectional access to enterprise data for your AI agent.

> [!NOTE]
> In the MCP server's script, some query parameter values are hard-coded for the sake of this example. In a real-world scenario, these values would be dynamically generated or retrieved.
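
For illustration, below is a minimal sketch of what such a GraphQL-backed MCP tool could look like, built with the FastMCP helper from the official MCP Python SDK. The query shape, the field names (trips, PassengerCount, etc.), the hard-coded filter value and the FABRIC_ACCESS_TOKEN variable are assumptions for this example; the actual server script in this repo may be structured differently.

```python
# Hypothetical sketch of an MCP tool wrapping the Fabric GraphQL endpoint.
# Query shape, field names and the hard-coded filter value are assumptions.
import os

import requests
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("fabric-graphql-connector")

GRAPHQL_ENDPOINT = os.environ["AZURE_FABRIC_GRAPHQL_ENDPOINT"]
ACCESS_TOKEN = os.environ["FABRIC_ACCESS_TOKEN"]  # assumed variable holding an Entra ID token


@mcp.tool()
def query_trips() -> dict:
    """Retrieve a small sample of trips from the Fabric data warehouse."""
    # Hard-coded query parameter, as mentioned in the note above.
    query = """
    query {
      trips(first: 5, filter: { PassengerCount: { eq: 2 } }) {
        items { PassengerCount TripDistanceMiles TotalAmount }
      }
    }
    """
    response = requests.post(
        GRAPHQL_ENDPOINT,
        json={"query": query},
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio by default
```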

Table of contents:

- Part 1: Configuring Microsoft Fabric Backend
- Part 2: Configuring Local Client Environment
- Part 3: User Experience - Gradio UI
- Part 4: Demo video on YouTube

Part 1: Configuring Microsoft Fabric Backend

  1. In Microsoft Fabric, create a new data warehouse pre-populated with sample data by clicking New item -> Sample warehouse (screenshot: Step1_SampleWarehouse).
  2. Next, create a GraphQL API endpoint by clicking New item -> API for GraphQL (screenshot: Step2_GraphQlCreate).
  3. In the GraphQL API's data configuration, select the Trip (dbo.Trip) table (screenshot: Step3_GraphQLData.png).
  4. Copy the endpoint URL of your GraphQL API (screenshot: Step4_GraphQLDataURL.png). A sketch of a direct test query against this endpoint follows this list.
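
Before wiring the endpoint into the MCP server, it can be useful to verify it with a direct call, as referenced in step 4 above. This is a minimal sketch, assuming the azure-identity package and an Entra ID identity with access to the workspace; the token scope and the column names are assumptions, so adjust them to your setup.

```python
# Hypothetical sketch: a direct test call against the Fabric GraphQL endpoint.
import requests
from azure.identity import DefaultAzureCredential

# Paste the endpoint URL copied in step 4 above.
GRAPHQL_ENDPOINT = "https://<your-fabric-graphql-endpoint>"

# Assumed token scope for Fabric APIs; verify against the Fabric documentation.
token = DefaultAzureCredential().get_token("https://api.fabric.microsoft.com/.default")

response = requests.post(
    GRAPHQL_ENDPOINT,
    # Placeholder query and column names; adjust to the dbo.Trip schema.
    json={"query": "query { trips(first: 3) { items { TripDistanceMiles TotalAmount } } }"},
    headers={"Authorization": f"Bearer {token.token}"},
    timeout=30,
)
response.raise_for_status()
print(response.json())
```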

Part 2: Configuring Local Client Environment

  1. Install the required Python packages, listed in the provided requirements.txt:
pip install -r requirements.txt
  2. Configure environment variables for the MCP client:

| Variable | Description |
| --- | --- |
| AOAI_API_BASE | Base URL of the Azure OpenAI endpoint |
| AOAI_API_VERSION | API version of the Azure OpenAI endpoint |
| AOAI_DEPLOYMENT | Deployment name of the Azure OpenAI model |

  3. Set the AZURE_FABRIC_GRAPHQL_ENDPOINT variable to the GraphQL endpoint URL copied in Step 4 of Part 1. It will be utilised by the MCP server script to establish connectivity with Microsoft Fabric (a sketch of how these variables might be consumed follows the table below):

| Variable | Description |
| --- | --- |
| AZURE_FABRIC_GRAPHQL_ENDPOINT | Microsoft Fabric's GraphQL API endpoint |
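
For reference, here is a minimal sketch of how the MCP client might consume these variables, assuming Microsoft Entra ID authentication via the azure-identity package; the repo's client may instead use an API key or a different pattern.

```python
# Hypothetical sketch: reading the environment variables and creating
# an Azure OpenAI client with Entra ID authentication.
import os

from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from openai import AzureOpenAI

token_provider = get_bearer_token_provider(
    DefaultAzureCredential(),
    "https://cognitiveservices.azure.com/.default",
)

client = AzureOpenAI(
    azure_endpoint=os.environ["AOAI_API_BASE"],
    api_version=os.environ["AOAI_API_VERSION"],
    azure_ad_token_provider=token_provider,
)

# Quick round-trip to confirm the deployment is reachable.
response = client.chat.completions.create(
    model=os.environ["AOAI_DEPLOYMENT"],  # deployment name, not model family
    messages=[{"role": "user", "content": "Hello from the MCP client!"}],
)
print(response.choices[0].message.content)
```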

Part 3: User Experience - Gradio UI

  1. Launch the MCP client in your command prompt:
python MCP_Client_Gradio.py
  2. Click the Initialise System button to start the MCP server and connect your AI agent to Microsoft Fabric's GraphQL API endpoint (screenshot: Step5_GradioLaunch.png). A sketch of what this initialisation might involve follows this list.
  3. You can now pull data from and push data to your data warehouse, using the GraphQL queries and mutations enabled by this MCP connector (screenshot: Step5_GradioUse.png).
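
Under the hood, initialising the system typically means spawning the MCP server as a subprocess and performing the MCP handshake. Below is a minimal sketch using the official MCP Python SDK over stdio; the MCP_Server.py file name is an assumption, so substitute the actual server script from this repo.

```python
# Hypothetical sketch: connecting to an MCP server over stdio and
# discovering the tools it exposes (roughly what "Initialise System" does).
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Assumed server script name; replace with the repo's actual file.
    server_params = StdioServerParameters(command="python", args=["MCP_Server.py"])

    async with stdio_client(server_params) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()          # MCP handshake
            tools = await session.list_tools()  # dynamic tool discovery
            print([tool.name for tool in tools.tools])


if __name__ == "__main__":
    asyncio.run(main())
```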

Part 4: Demo video on YouTube

A practical demo of the provided MCP connector can be found in this YouTube video.
