
Power BI MCP Server

Power BI data querying through natural language using the MCP protocol.
Transform your Power BI experience - ask questions in natural language and get instant insights from your data.
A Model Context Protocol (MCP) server that enables AI assistants to interact with Power BI datasets through natural language. Query your data, generate DAX, and get insights without leaving your AI assistant.
Ask questions like "What are total sales by region?" and get instant insights from your Power BI data.
| Platform | Python | .NET Runtime | ADOMD.NET | Status |
|---|---|---|---|---|
| Windows | 3.10+ | ✅ Built-in | ✅ Available | ✅ Full Support |
| Linux | 3.10+ | ✅ Available | ⚠️ Docker only | ✅ Docker Support |
| macOS | 3.10+ | ✅ Available | ❌ Not available | ❌ Not supported |
Note: For Linux systems, use Docker to run the server with all dependencies included.
Clone the repository:

```bash
git clone https://github.com/yourusername/powerbi-mcp-server.git
cd powerbi-mcp-server
```

Install dependencies:

```bash
pip install -r requirements.txt
```

Configure environment variables:

```bash
cp .env.example .env
# Edit .env with your credentials
```

Test the connection:

```bash
python quickstart.py
```
Add to your Claude Desktop configuration file:
Windows: `%APPDATA%\Claude\claude_desktop_config.json`
macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`

```json
{
  "mcpServers": {
    "powerbi": {
      "command": "python",
      "args": ["C:/path/to/powerbi-mcp-server/src/server.py"],
      "env": {
        "PYTHONPATH": "C:/path/to/powerbi-mcp-server",
        "OPENAI_API_KEY": "your-openai-api-key"
      }
    }
  }
}
```
⚠️ Important: Docker containers do NOT use `.env` files. The `.env` file is excluded from the Docker build context for security. You must provide environment variables via `docker run -e`, Docker Compose, or your cloud platform.
Build the container image:

```bash
docker build -t powerbi-mcp .
```

Run the server:

```bash
docker run -it --rm -e OPENAI_API_KEY=<key> powerbi-mcp
```
The container listens on port 8000 by default. Override the host or port using environment variables or command-line arguments:

```bash
docker run -it --rm -e OPENAI_API_KEY=<key> -p 7000:7000 powerbi-mcp \
  python src/server.py --host 0.0.0.0 --port 7000
```
The server exposes a Server-Sent Events endpoint at `/sse`. Clients should connect to this endpoint and then POST JSON-RPC messages to the path provided in the initial `endpoint` event (typically `/messages/`).
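For illustration, here is a minimal sketch of that handshake in Python. It assumes the server is reachable at `http://localhost:8000` and uses the `requests` library; the exact message path comes from the `endpoint` event at runtime, so treat this as a starting point rather than a reference client.

```python
# Minimal sketch of the SSE handshake described above (not part of this repo).
import requests

BASE_URL = "http://localhost:8000"

# 1. Open the SSE stream and wait for the initial "endpoint" event.
with requests.get(f"{BASE_URL}/sse", stream=True) as sse:
    message_path = None
    for raw_line in sse.iter_lines(decode_unicode=True):
        if raw_line and raw_line.startswith("data:"):
            # The first data payload carries the POST path, e.g. /messages/?session_id=...
            message_path = raw_line.split("data:", 1)[1].strip()
            break
    assert message_path, "no endpoint event received"

    # 2. POST JSON-RPC messages to the advertised path; responses arrive on the SSE stream.
    initialize = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.1"},
        },
    }
    requests.post(f"{BASE_URL}{message_path}", json=initialize)
```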
The container includes the .NET runtime required by `pythonnet` and `pyadomd`. It sets `PYTHONNET_RUNTIME=coreclr` and `DOTNET_ROOT=/usr/share/dotnet` so the .NET runtime is detected automatically.
Important: The Docker container does NOT use `.env` files. Any `.env` file in your local directory will be excluded from the Docker image via `.dockerignore` for security reasons. Instead, provide environment variables using `docker run -e VARIABLE=value`. The available environment variables mirror those in `.env.example`.
Once configured, you can interact with your Power BI data through Claude:
- Connect to Power BI dataset at `powerbi://api.powerbi.com/v1.0/myorg/YourWorkspace`
- What tables are available?
- Show me the structure of the Sales table
- What are the total sales by product category?
- Show me the trend of revenue over the last 12 months
- Which store has the highest gross margin?
- Execute DAX: `EVALUATE SUMMARIZE(Sales, Product[Category], "Total", SUM(Sales[Amount]))`
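Under the hood, requests like the DAX example above run against the XMLA endpoint via `pyadomd`. The following sketch shows roughly what such a query looks like in standalone Python; the connection-string format and the `YourWorkspace`/`YourDataset` names are placeholders, and the actual implementation in `src/server.py` may differ.

```python
# Rough sketch of executing a DAX query over the XMLA endpoint with pyadomd.
# Replace the placeholder workspace, dataset, and service principal values.
from pyadomd import Pyadomd

conn_str = (
    "Provider=MSOLAP;"
    "Data Source=powerbi://api.powerbi.com/v1.0/myorg/YourWorkspace;"
    "Initial Catalog=YourDataset;"
    "User ID=app:<client_id>@<tenant_id>;"
    "Password=<client_secret>;"
)

dax = 'EVALUATE SUMMARIZE(Sales, Product[Category], "Total", SUM(Sales[Amount]))'

with Pyadomd(conn_str) as conn:
    # execute() returns the cursor; iterate the result rows and print them.
    with conn.cursor().execute(dax) as cursor:
        for row in cursor.fetchall():
            print(row)
```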
You will need:

- A Power BI XMLA endpoint, e.g. `powerbi://api.powerbi.com/v1.0/myorg/WorkspaceName`
- An Azure AD service principal
- An OpenAI API key (optional); the default model is `gpt-4o-mini` (200x cheaper than GPT-4)

Create a `.env` file (OpenAI settings are optional):
```bash
# OpenAI Configuration (optional)
OPENAI_API_KEY=your_openai_api_key_here
OPENAI_MODEL=gpt-4o-mini  # Defaults to gpt-4o-mini

# Optional: Default Power BI Credentials
# These values are used when the `connect_powerbi` action does not supply
# tenant_id, client_id or client_secret.
DEFAULT_TENANT_ID=your_tenant_id
DEFAULT_CLIENT_ID=your_client_id
DEFAULT_CLIENT_SECRET=your_client_secret

# Logging
LOG_LEVEL=INFO
```
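To illustrate the fallback behaviour described in the comments above, a `connect_powerbi` call that omits credentials can fall back to the `DEFAULT_*` variables roughly like this (the `resolve_credentials` helper is hypothetical, not part of the server's API):

```python
# Hypothetical helper illustrating the DEFAULT_* fallback; not the server's actual code.
import os

def resolve_credentials(tenant_id=None, client_id=None, client_secret=None):
    """Prefer values passed to connect_powerbi, else fall back to DEFAULT_* env vars."""
    return {
        "tenant_id": tenant_id or os.getenv("DEFAULT_TENANT_ID"),
        "client_id": client_id or os.getenv("DEFAULT_CLIENT_ID"),
        "client_secret": client_secret or os.getenv("DEFAULT_CLIENT_SECRET"),
    }
```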
```
powerbi-mcp-server/
├── src/
│   └── server.py        # Main MCP server implementation
├── docs/                # Documentation
├── examples/            # Example queries and use cases
├── tests/               # Test suite
├── .env.example         # Environment variables template
├── requirements.txt     # Python dependencies
├── quickstart.py        # Quick test script
└── README.md            # This file
```
Never commit `.env` files and keep them in `.gitignore`.
Run the standard test suite:

```bash
python -m pytest tests/
```

Test specific functionality:

```bash
python tests/test_connector.py
python tests/test_server_process.py
```
Real integration tests with Power BI datasets are available but disabled by default. These tests connect to actual Power BI services and may consume API quota.
Enable Integration Tests:
Configure the test environment:

```bash
cp .env.example .env
# Edit .env and set: ENABLE_INTEGRATION_TESTS=true
```
Set the test dataset configuration:

```bash
# Test Power BI Dataset Configuration
TEST_XMLA_ENDPOINT=powerbi://api.powerbi.com/v1.0/myorg/YourTestWorkspace
TEST_TENANT_ID=your_tenant_id
TEST_CLIENT_ID=your_client_id
TEST_CLIENT_SECRET=your_client_secret
TEST_INITIAL_CATALOG=YourTestDatasetName

# Optional: Expected test data for validation
TEST_EXPECTED_TABLE=Sales
TEST_EXPECTED_COLUMN=Amount
TEST_DAX_QUERY=EVALUATE TOPN(1, Sales)
TEST_MIN_TABLES_COUNT=1
```
Run the integration tests:

```bash
# Interactive runner with safety checks
python run_integration_tests.py

# Or directly with pytest
python -m pytest tests/test_integration.py -v

# Run with auto-confirmation (CI/CD)
python run_integration_tests.py --yes
```
Integration Test Coverage:
⚠️ Warning: Integration tests connect to real Power BI datasets and may consume API quota.
Only run integration tests in dedicated test environments.
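As an illustration of how the `ENABLE_INTEGRATION_TESTS` gate typically works, the sketch below shows a pytest skip marker driven by that variable; the actual checks in `tests/test_integration.py` may be organised differently.

```python
# Illustrative pytest gate for integration tests; the real test suite may differ.
import os
import pytest

integration = pytest.mark.skipif(
    os.getenv("ENABLE_INTEGRATION_TESTS", "").lower() != "true",
    reason="Set ENABLE_INTEGRATION_TESTS=true to run tests against a real dataset",
)

@integration
def test_xmla_endpoint_is_configured():
    # A real test would connect to TEST_XMLA_ENDPOINT and verify tables, DAX results, etc.
    endpoint = os.environ["TEST_XMLA_ENDPOINT"]
    assert endpoint.startswith("powerbi://")
```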
We welcome contributions! Please see CONTRIBUTING.md for details.
- Create a feature branch (`git checkout -b feature/AmazingFeature`)
- Commit your changes (`git commit -m 'Add some AmazingFeature'`)
- Push the branch (`git push origin feature/AmazingFeature`)

Before opening a pull request, check your environment and run the tests:

```bash
# Check environment compatibility
python scripts/check_test_environment.py

# Run unit tests
python -m pytest tests/ -k "not test_integration" -v

# Run integration tests (requires .env configuration)
python -m pytest tests/test_integration.py -v
```
Common issues:

- ADOMD.NET not found
- Connection fails
- Timeout errors
See TROUBLESHOOTING.md for detailed solutions.
This project is licensed under the MIT License - see the LICENSE file for details.