
Microsoft Fabric Analytics
MCP server providing Microsoft Fabric analytics, query, and monitoring capabilities for AI assistants.
A comprehensive Model Context Protocol (MCP) server that provides analytics capabilities and tools for interacting with Microsoft Fabric data platform. This server enables AI assistants like Claude to seamlessly access, analyze, and monitor Microsoft Fabric resources through standardized MCP protocols, bringing the power of Microsoft Fabric directly to your AI conversations.
Recommended for AI Assistant Usage:
{
  "mcpServers": {
    "fabric-analytics": {
      "command": "node",
      "args": ["C:\\path\\to\\your\\build\\index.js"],
      "cwd": "C:\\path\\to\\your\\project",
      "env": {
        "FABRIC_AUTH_METHOD": "bearer_token",
        "FABRIC_TOKEN": "your_bearer_token_here",
        "FABRIC_WORKSPACE_ID": "your_workspace_id",
        "ENABLE_HEALTH_SERVER": "false"
      }
    }
  }
}
💡 Get Bearer Token: Visit Power BI Embed Setup to generate tokens
⚠️ Important: Tokens expire after ~1 hour and need to be refreshed
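Bearer tokens issued for Fabric are standard JWTs carrying an `exp` claim, so a client can check locally whether a token is about to expire before handing it to the server. A minimal sketch, assuming a standard three-segment JWT; the `token_is_fresh` helper and the 5-minute safety skew are illustrative, not part of this server:

```python
import base64
import json
import time

def token_expires_at(bearer_token: str) -> int:
    """Return the `exp` claim (Unix seconds) from a JWT bearer token."""
    payload_b64 = bearer_token.split(".")[1]
    # JWT segments are base64url without padding; restore it before decoding.
    payload_b64 += "=" * (-len(payload_b64) % 4)
    payload = json.loads(base64.urlsafe_b64decode(payload_b64))
    return payload["exp"]

def token_is_fresh(bearer_token: str, skew_seconds: int = 300) -> bool:
    """True if the token is valid for at least `skew_seconds` more."""
    return token_expires_at(bearer_token) - time.time() > skew_seconds

# Demo with a fabricated (unsigned) token that expires in 30 minutes:
header = base64.urlsafe_b64encode(b'{"alg":"none"}').rstrip(b"=").decode()
claims = base64.urlsafe_b64encode(
    json.dumps({"exp": int(time.time()) + 1800}).encode()
).rstrip(b"=").decode()
fake = f"{header}.{claims}.sig"
print(token_is_fresh(fake))  # True
```

Refreshing a few minutes before expiry avoids mid-conversation 401s.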
If you experience 60-second timeouts during startup, this is due to interactive authentication flows blocking Claude Desktop's sandboxed environment. Solution:
Use the Bearer Token Method (Recommended):
- Set FABRIC_AUTH_METHOD: "bearer_token" in your config
- Set FABRIC_TOKEN to a valid bearer token

Alternative - Per-Tool Authentication:
- Pass bearerToken: "your_token_here" with each tool call
- Use bearerToken: "simulation" to run without real credentials
Troubleshooting:
🎯 Quick Fix: The server automatically prioritizes the FABRIC_TOKEN environment variable over interactive authentication flows, preventing Claude Desktop timeouts.
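The priority order described above can be sketched as follows. This is an illustrative model of the resolution logic, not the server's actual implementation (which lives in src/index.ts); the helper name is hypothetical:

```python
def resolve_auth_method(env: dict) -> str:
    """Pick an auth method, preferring a pre-issued bearer token so no
    interactive flow can block a sandboxed host like Claude Desktop."""
    if env.get("FABRIC_TOKEN"):            # highest priority: static token
        return "bearer_token"
    method = env.get("FABRIC_AUTH_METHOD", "").lower()
    if method in {"service_principal", "device_code", "interactive"}:
        return method
    return "simulation"                    # safe fallback, no credentials needed

print(resolve_auth_method({"FABRIC_TOKEN": "eyJ..."}))             # bearer_token
print(resolve_auth_method({"FABRIC_AUTH_METHOD": "interactive"}))  # interactive
```

Because FABRIC_TOKEN wins even when FABRIC_AUTH_METHOD names an interactive flow, a populated token variable is enough to prevent the 60-second startup hang.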
# Clone and run locally
git clone https://github.com/santhoshravindran7/Fabric-Analytics-MCP.git
cd Fabric-Analytics-MCP
npm install && npm run build && npm start
# Using Docker Compose
docker-compose up -d

# Or standalone Docker
docker build -t fabric-analytics-mcp .
docker run -p 3000:3000 -e FABRIC_CLIENT_ID=xxx fabric-analytics-mcp
# One-command enterprise deployment
export ACR_NAME="your-registry" FABRIC_CLIENT_ID="xxx" FABRIC_CLIENT_SECRET="yyy" FABRIC_TENANT_ID="zzz"
./scripts/setup-azure-resources.sh && ./scripts/build-and-push.sh && ./scripts/deploy-to-aks.sh
# Serverless deployment on Azure
az mcp server create --name "fabric-analytics-mcp" --repository "santhoshravindran7/Fabric-Analytics-MCP"
📚 Detailed Guides:
Tool: list-fabric-items
Description: List items in a Microsoft Fabric workspace (Lakehouses, Notebooks, etc.)
Parameters:
- bearerToken: Microsoft Fabric bearer token
- workspaceId: Microsoft Fabric workspace ID
- itemType: Filter by item type (optional)

Tool: create-fabric-item
Description: Create new items in Microsoft Fabric workspace
Parameters:
- bearerToken: Microsoft Fabric bearer token
- workspaceId: Microsoft Fabric workspace ID
- itemType: Type of item (Lakehouse, Notebook, Dataset, Report, Dashboard)
- displayName: Display name for the new item
- description: Optional description

Tool: get-fabric-item
Description: Get detailed information about a specific Microsoft Fabric item
Parameters:
- bearerToken: Microsoft Fabric bearer token
- workspaceId: Microsoft Fabric workspace ID
- itemId: ID of the item to retrieve

Tool: update-fabric-item
Description: Update existing items in Microsoft Fabric workspace
Parameters:
- bearerToken: Microsoft Fabric bearer token
- workspaceId: Microsoft Fabric workspace ID
- itemId: ID of the item to update
- displayName: New display name (optional)
- description: New description (optional)

Tool: delete-fabric-item
Description: Delete items from Microsoft Fabric workspace
Parameters:
- bearerToken: Microsoft Fabric bearer token
- workspaceId: Microsoft Fabric workspace ID
- itemId: ID of the item to delete

Tool: query-fabric-dataset
Parameters:
- bearerToken: Microsoft Fabric bearer token (optional - uses simulation if not provided)
- workspaceId: Microsoft Fabric workspace ID
- datasetName: Name of the dataset to query
- query: SQL or KQL query to execute

Tool: execute-fabric-notebook
Parameters:
- bearerToken: Microsoft Fabric bearer token
- workspaceId: Microsoft Fabric workspace ID
- notebookId: ID of the notebook to execute
- parameters: Optional parameters to pass to the notebook

Tool: get-fabric-metrics
Parameters:
- workspaceId: Microsoft Fabric workspace ID
- itemId: Item ID (dataset, report, etc.)
- timeRange: Time range for metrics (1h, 24h, 7d, 30d)
- metrics: List of metrics to analyze

Tool: analyze-fabric-model
Parameters:
- workspaceId: Microsoft Fabric workspace ID
- itemId: Item ID to analyze

Tool: generate-fabric-report
Parameters:
- workspaceId: Microsoft Fabric workspace ID
- reportType: Type of report (performance, usage, health, summary)

Tool: create-livy-session
Description: Create a new Livy session for interactive Spark/SQL execution
Parameters:
- bearerToken: Microsoft Fabric bearer token
- workspaceId: Microsoft Fabric workspace ID
- lakehouseId: Microsoft Fabric lakehouse ID
- sessionConfig: Optional session configuration

Tool: get-livy-session
Description: Get details of a Livy session
Parameters:
- bearerToken: Microsoft Fabric bearer token
- workspaceId: Microsoft Fabric workspace ID
- lakehouseId: Microsoft Fabric lakehouse ID
- sessionId: Livy session ID

Tool: list-livy-sessions
Description: List all Livy sessions in a lakehouse
Parameters:
- bearerToken: Microsoft Fabric bearer token
- workspaceId: Microsoft Fabric workspace ID
- lakehouseId: Microsoft Fabric lakehouse ID

Tool: delete-livy-session
Description: Delete a Livy session
Parameters:
- bearerToken: Microsoft Fabric bearer token
- workspaceId: Microsoft Fabric workspace ID
- lakehouseId: Microsoft Fabric lakehouse ID
- sessionId: Livy session ID

Tool: execute-livy-statement
Description: Execute SQL or Spark statements in a Livy session
Parameters:
- bearerToken: Microsoft Fabric bearer token
- workspaceId: Microsoft Fabric workspace ID
- lakehouseId: Microsoft Fabric lakehouse ID
- sessionId: Livy session ID
- code: SQL or Spark code to execute
- kind: Statement type (sql, spark, etc.)

Tool: get-livy-statement
Description: Get status and results of a Livy statement
Parameters:
- bearerToken: Microsoft Fabric bearer token
- workspaceId: Microsoft Fabric workspace ID
- lakehouseId: Microsoft Fabric lakehouse ID
- sessionId: Livy session ID
- statementId: Statement ID

Tool: create-livy-batch
Description: Create a new Livy batch job for long-running operations
Parameters:
- bearerToken: Microsoft Fabric bearer token
- workspaceId: Microsoft Fabric workspace ID
- lakehouseId: Microsoft Fabric lakehouse ID
- batchConfig: Batch job configuration

Tool: get-livy-batch
Description: Get details of a Livy batch job
Parameters:
- bearerToken: Microsoft Fabric bearer token
- workspaceId: Microsoft Fabric workspace ID
- lakehouseId: Microsoft Fabric lakehouse ID
- batchId: Batch job ID

Tool: list-livy-batches
Description: List all Livy batch jobs in a lakehouse
Parameters:
- bearerToken: Microsoft Fabric bearer token
- workspaceId: Microsoft Fabric workspace ID
- lakehouseId: Microsoft Fabric lakehouse ID

Tool: delete-livy-batch
Description: Delete a Livy batch job
Parameters:
- bearerToken: Microsoft Fabric bearer token
- workspaceId: Microsoft Fabric workspace ID
- lakehouseId: Microsoft Fabric lakehouse ID
- batchId: Batch job ID

Tool: get-workspace-spark-applications
Parameters:
- bearerToken: Microsoft Fabric bearer token
- workspaceId: Microsoft Fabric workspace ID
- continuationToken: Optional token for pagination

Tool: get-notebook-spark-applications
Description: Get all Spark applications for a specific notebook
Parameters:
- bearerToken: Microsoft Fabric bearer token
- workspaceId: Microsoft Fabric workspace ID
- notebookId: Notebook ID
- continuationToken: Optional token for pagination

Tool: get-lakehouse-spark-applications
Description: Get all Spark applications for a specific lakehouse
Parameters:
- bearerToken: Microsoft Fabric bearer token
- workspaceId: Microsoft Fabric workspace ID
- lakehouseId: Lakehouse ID
- continuationToken: Optional token for pagination

Tool: get-spark-job-definition-applications
Description: Get all Spark applications for a specific Spark Job Definition
Parameters:
- bearerToken: Microsoft Fabric bearer token
- workspaceId: Microsoft Fabric workspace ID
- sparkJobDefinitionId: Spark Job Definition ID
- continuationToken: Optional token for pagination

Tool: get-spark-application-details
Description: Get detailed information about a specific Spark application
Parameters:
- bearerToken: Microsoft Fabric bearer token
- workspaceId: Microsoft Fabric workspace ID
- livyId: Livy session ID

Tool: cancel-spark-application
Description: Cancel a running Spark application
Parameters:
- bearerToken: Microsoft Fabric bearer token
- workspaceId: Microsoft Fabric workspace ID
- livyId: Livy session ID

Tool: get-spark-monitoring-dashboard
Parameters:
- bearerToken: Microsoft Fabric bearer token
- workspaceId: Microsoft Fabric workspace ID

The MCP server provides comprehensive notebook management capabilities with predefined templates and custom notebook support.
Tool: create-fabric-notebook
Parameters:
- bearerToken: Microsoft Fabric bearer token
- workspaceId: Microsoft Fabric workspace ID
- displayName: Display name for the new notebook
- template: Template type (blank, sales_analysis, nyc_taxi_analysis, data_exploration, machine_learning, custom)
- customNotebook: Custom notebook definition (required if template is 'custom')
- environmentId: Optional environment ID to attach
- lakehouseId: Optional default lakehouse ID
- lakehouseName: Optional default lakehouse name

Available Templates:

Tool: get-fabric-notebook-definition
Parameters:
- bearerToken: Microsoft Fabric bearer token
- workspaceId: Microsoft Fabric workspace ID
- notebookId: ID of the notebook to retrieve
- format: Format to return (ipynb or fabricGitSource)

Tool: update-fabric-notebook-definition
Parameters:
- bearerToken: Microsoft Fabric bearer token
- workspaceId: Microsoft Fabric workspace ID
- notebookId: ID of the notebook to update
- notebookDefinition: Updated notebook definition object

Tool: run-fabric-notebook
Parameters:
- bearerToken: Microsoft Fabric bearer token
- workspaceId: Microsoft Fabric workspace ID
- notebookId: ID of the notebook to run
- parameters: Optional notebook parameters (key-value pairs with types)
- configuration: Optional execution configuration (environment, lakehouse, pools, etc.)

Features:
Clone and Install
git clone https://github.com/santhoshravindran7/Fabric-Analytics-MCP.git
cd Fabric-Analytics-MCP
npm install
npm run build  # ✅ All configuration files included!
📝 Note: All essential configuration files (tsconfig.json, jest.config.json, etc.) are now properly included in the repository. Previous build issues have been resolved.
Configure Claude Desktop
Add to your Claude Desktop config:
Windows: %APPDATA%\Claude\claude_desktop_config.json
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
{
  "mcpServers": {
    "fabric-analytics": {
      "command": "node",
      "args": ["/ABSOLUTE/PATH/TO/PROJECT/build/index.js"]
    }
  }
}
Start Using
Restart Claude Desktop and try these queries:
- "List all items in my Fabric workspace [your-workspace-id]"
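Under the hood, the assistant turns such a query into an MCP `tools/call` request against this server. A sketch of the JSON-RPC 2.0 payload for the `list-fabric-items` tool described earlier; the `make_tool_call` helper and the placeholder workspace ID are illustrative, and "simulation" runs without a real token:

```python
import json

def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build an MCP `tools/call` JSON-RPC 2.0 request body."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

body = make_tool_call(1, "list-fabric-items", {
    "bearerToken": "simulation",  # simulation mode, no real credentials
    "workspaceId": "00000000-0000-0000-0000-000000000000",  # placeholder
})
print(body)
```

An MCP client sends this over stdio to the server process; the response carries the tool's result in the JSON-RPC `result` field.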
npm start    # Production mode
npm run dev  # Development mode with auto-reload
For comprehensive testing of Spark functionality, install Python dependencies:
pip install -r livy_requirements.txt
Available Test Scripts:
- livy_api_test.ipynb - Interactive notebook for step-by-step testing
- comprehensive_livy_test.py - Full-featured test with error handling
- spark_monitoring_test.py - Spark application monitoring tests
- mcp_spark_monitoring_demo.py - MCP server integration demo

Add this configuration to your Claude Desktop config file:
Windows: %APPDATA%\Claude\claude_desktop_config.json
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
{
  "mcpServers": {
    "fabric-analytics": {
      "command": "node",
      "args": ["/ABSOLUTE/PATH/TO/PROJECT/build/index.js"]
    }
  }
}
🎉 You're ready! Restart Claude Desktop and start asking questions about your Microsoft Fabric data!
For testing the Livy API functionality, additional Python dependencies are required:
# Install Python dependencies for Livy API testing
pip install -r livy_requirements.txt
- livy_api_test.ipynb - Interactive Jupyter notebook for step-by-step testing
- comprehensive_livy_test.py - Full-featured test with error handling
- simple_livy_test.py - Simple test following example patterns
- livy_batch_test.py - Batch job testing capabilities
- livy_setup.py - Quick setup and configuration helper

npm start
npm run dev
Add the following configuration to your Claude Desktop config file:
Windows: %APPDATA%\Claude\claude_desktop_config.json
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
{
  "mcpServers": {
    "fabric-analytics": {
      "command": "node",
      "args": ["/ABSOLUTE/PATH/TO/PROJECT/build/index.js"]
    }
  }
}
Once connected to Claude Desktop, you can ask natural language questions like:
This MCP server supports multiple authentication methods powered by Microsoft Authentication Library (MSAL):
🤖 For Claude Desktop: Use Bearer Token Authentication (Method #1) for the best experience and compatibility.
🔧 Claude Desktop Fix: Recent updates prevent authentication timeouts by prioritizing bearer tokens and adding timeout protection for interactive authentication flows.
Perfect for AI assistants and interactive usage:
For Claude Desktop:
claude_desktop_config.json
For Testing:
# All test scripts will prompt for authentication method python enhanced_auth_test.py
Use Azure AD application credentials:
Environment Variables Setup:
export FABRIC_AUTH_METHOD="service_principal"
export FABRIC_CLIENT_ID="your-app-client-id"
export FABRIC_CLIENT_SECRET="your-app-client-secret"
export FABRIC_TENANT_ID="your-tenant-id"
export FABRIC_DEFAULT_WORKSPACE_ID="your-workspace-id"
Claude Desktop Configuration:
{
  "mcpServers": {
    "fabric-analytics": {
      "command": "node",
      "args": ["/path/to/build/index.js"],
      "env": {
        "FABRIC_AUTH_METHOD": "service_principal",
        "FABRIC_CLIENT_ID": "your-client-id",
        "FABRIC_CLIENT_SECRET": "your-client-secret",
        "FABRIC_TENANT_ID": "your-tenant-id"
      }
    }
  }
}
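For reference, these same service-principal credentials map onto the standard Microsoft Entra ID client-credentials grant. A hedged sketch using only the Python standard library (the Fabric scope URL is an assumption about the API's resource identifier, and production code would more typically use MSAL; no network call is made until `acquire_fabric_token` is invoked):

```python
import json
import os
import urllib.parse
import urllib.request

TOKEN_URL = "https://login.microsoftonline.com/{tenant}/oauth2/v2.0/token"
SCOPE = "https://api.fabric.microsoft.com/.default"  # assumed Fabric resource scope

def build_token_request(tenant: str, client_id: str, client_secret: str):
    """Return (url, form-encoded body) for the client-credentials grant."""
    url = TOKEN_URL.format(tenant=tenant)
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": SCOPE,
    }).encode()
    return url, body

def acquire_fabric_token() -> str:
    """POST the grant using the FABRIC_* variables from above."""
    url, body = build_token_request(
        os.environ["FABRIC_TENANT_ID"],
        os.environ["FABRIC_CLIENT_ID"],
        os.environ["FABRIC_CLIENT_SECRET"],
    )
    with urllib.request.urlopen(urllib.request.Request(url, data=body)) as resp:
        return json.loads(resp.read())["access_token"]

# Inspect the request that would be sent (placeholder values, no network):
url, body = build_token_request("contoso-tenant-id", "app-id", "app-secret")
print(url)
```

The returned access token can then be supplied as the bearerToken parameter on any tool call.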
Sign in with browser on another device (great for headless environments):
export FABRIC_AUTH_METHOD="device_code"
export FABRIC_CLIENT_ID="your-client-id"
export FABRIC_TENANT_ID="your-tenant-id"
Automatic browser-based authentication:
export FABRIC_AUTH_METHOD="interactive"
export FABRIC_CLIENT_ID="your-client-id"
export FABRIC_TENANT_ID="your-tenant-id"
📚 Detailed Guides:
Check your authentication status:
"Check my Fabric authentication status"
"What authentication method am I using?"
"Test my Microsoft Fabric authentication setup"
Note: The MCP server seamlessly handles token validation and provides clear error messages for authentication issues.
Deploy the MCP server as a scalable service on Azure Kubernetes Service for enterprise production use.
# Build the Docker image
npm run docker:build

# Tag and push to Azure Container Registry
npm run docker:push
# Create Azure resources and deploy
./scripts/deploy-to-aks.sh
Once deployed, your MCP server will be available at:
https://your-aks-cluster.region.cloudapp.azure.com/mcp
The AKS deployment includes:
All Kubernetes manifests are located in the /k8s directory:
- namespace.yaml - Dedicated namespace
- deployment.yaml - Application deployment with scaling
- service.yaml - Load balancer service
- ingress.yaml - External access and SSL
- configmap.yaml - Configuration management
- secret.yaml - Secure credential storage
- hpa.yaml - Horizontal Pod Autoscaler

Configure the deployment by setting these environment variables:
export AZURE_SUBSCRIPTION_ID="your-subscription-id"
export AZURE_RESOURCE_GROUP="fabric-mcp-rg"
export AKS_CLUSTER_NAME="fabric-mcp-cluster"
export ACR_NAME="fabricmcpregistry"
export DOMAIN_NAME="your-domain.com"
The AKS deployment includes enterprise-grade security:
The deployment scripts support:
📚 Detailed Guide: See AKS_DEPLOYMENT.md for complete setup instructions.
Microsoft Azure now offers a preview service for hosting MCP servers natively. This eliminates the need for custom infrastructure management.
# Login to Azure
az login

# Enable MCP preview features
az extension add --name mcp-preview

# Deploy the MCP server
az mcp server create \
  --name "fabric-analytics-mcp" \
  --resource-group "your-rg" \
  --source-type "github" \
  --repository "santhoshravindran7/Fabric-Analytics-MCP" \
  --branch "main" \
  --auth-method "service-principal"
# Set up service principal authentication
az mcp server config set \
  --name "fabric-analytics-mcp" \
  --setting "FABRIC_CLIENT_ID=your-client-id" \
  --setting "FABRIC_CLIENT_SECRET=your-secret" \
  --setting "FABRIC_TENANT_ID=your-tenant-id"
# Get the server endpoint
az mcp server show --name "fabric-analytics-mcp" --query "endpoint"
Azure MCP Server offers:
📚 Learn More: Azure MCP Server Documentation
Note: Azure MCP Server is currently in preview. Check Azure Preview Terms for service availability and limitations.
This MCP server is built with:
The server uses the following configuration files:
- tsconfig.json - TypeScript compiler configuration
- package.json - Node.js package configuration
- .vscode/mcp.json - MCP server configuration for VS Code

├── src/
│ ├── index.ts # Main MCP server implementation
│ └── fabric-client.ts # Microsoft Fabric API client
├── build/ # Compiled JavaScript output
├── tests/ # Test scripts and notebooks
├── .vscode/ # VS Code configuration
├── package.json
├── tsconfig.json
└── README.md
To add new tools to the server, register them with server.tool() in src/index.ts, then rebuild with npm run build.
This server includes:
✅ Production Ready:
🧪 Demonstration Features:
# Install Python dependencies for API testing
pip install -r livy_requirements.txt
- livy_api_test.ipynb - Interactive Jupyter notebook for step-by-step testing
- comprehensive_livy_test.py - Full-featured test with error handling
- simple_livy_test.py - Simple test following example patterns
- livy_batch_test.py - Batch job testing capabilities
- spark_monitoring_test.py - Spark application monitoring tests

Interactive Testing:
jupyter notebook livy_api_test.ipynb

Command Line Testing:
python simple_livy_test.py
python spark_monitoring_test.py
Comprehensive Testing:
python comprehensive_livy_test.py
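The Livy test scripts above all exercise the same submit-then-poll lifecycle: execute-livy-statement returns a statement ID, and get-livy-statement is polled until the statement leaves the waiting/running states. A sketch of that polling loop with the fetch call injected so it works against any client; the state names follow the Apache Livy API, and `wait_for_statement` is illustrative rather than part of this server:

```python
import time
from typing import Callable

# Terminal states in the Apache Livy statement lifecycle.
TERMINAL_STATES = {"available", "error", "cancelled"}

def wait_for_statement(fetch: Callable[[], dict],
                       poll_seconds: float = 1.0,
                       timeout_seconds: float = 300.0) -> dict:
    """Poll `fetch` (e.g. a get-livy-statement call) until the statement
    reaches a terminal Livy state, or raise TimeoutError."""
    deadline = time.monotonic() + timeout_seconds
    while True:
        statement = fetch()
        if statement["state"] in TERMINAL_STATES:
            return statement
        if time.monotonic() > deadline:
            raise TimeoutError(f"statement stuck in {statement['state']!r}")
        time.sleep(poll_seconds)

# Demo with a canned response sequence standing in for real API calls:
responses = iter([
    {"state": "waiting"},
    {"state": "running"},
    {"state": "available", "output": {"data": {"text/plain": "42"}}},
])
result = wait_for_statement(lambda: next(responses), poll_seconds=0)
print(result["state"])  # available
```

The same loop applies to batch jobs via get-livy-batch, whose states are reported the same way.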
We welcome contributions! Here's how to get started:
1. Fork the repository
2. Create a feature branch (git checkout -b feature/amazing-feature)
3. Commit your changes (git commit -m 'Add amazing feature')
4. Push to the branch (git push origin feature/amazing-feature)
5. Open a Pull Request

This project is licensed under the MIT License - see the LICENSE file for details.
For issues and questions:
This project began as a weekend hack exploring AI integration with Microsoft Fabric, sparked by a casual conversation with Chris and Bogdan about making AI tooling more accessible. What started as a personal experiment over a weekend is now available for everyone to build upon.