
Prometheus MCP Server
MCP server enabling natural language interactions with Prometheus monitoring infrastructure for querying metrics
A Model Context Protocol (MCP) server that provides seamless integration between AI assistants and Prometheus, enabling natural language interactions with your monitoring infrastructure. This server allows for effortless querying, discovery, and analysis of metrics through Visual Studio Code, Cursor, Windsurf, Claude Desktop, and other MCP clients.
First, install the Prometheus MCP server with your client. A typical configuration looks like this:
{ "mcpServers": { "prometheus": { "command": "npx", "args": ["prometheus-mcp@latest", "stdio"], "env": { "PROMETHEUS_URL": "http://localhost:9090" } } } }
# For VS Code
code --add-mcp '{"name":"prometheus","command":"npx","args":["prometheus-mcp@latest","stdio"],"env":{"PROMETHEUS_URL":"http://localhost:9090"}}'

# For VS Code Insiders
code-insiders --add-mcp '{"name":"prometheus","command":"npx","args":["prometheus-mcp@latest","stdio"],"env":{"PROMETHEUS_URL":"http://localhost:9090"}}'
After installation, the Prometheus MCP server will be available for use with your GitHub Copilot agent in VS Code.
Go to Cursor Settings → MCP → Add new MCP Server. Name it to your liking, select the command type, and use the command npx prometheus-mcp. You can also verify the config or add command arguments by clicking Edit.
{ "mcpServers": { "prometheus": { "command": "npx", "args": ["prometheus-mcp@latest", "stdio"], "env": { "PROMETHEUS_URL": "http://localhost:9090" } } } }
Follow the Windsurf MCP documentation and use the following configuration:
{ "mcpServers": { "prometheus": { "command": "npx", "args": ["prometheus-mcp@latest", "stdio"], "env": { "PROMETHEUS_URL": "http://localhost:9090" } } } }
Follow the MCP install guide and use the following configuration:
{ "mcpServers": { "prometheus": { "command": "npx", "args": ["prometheus-mcp@latest", "stdio"], "env": { "PROMETHEUS_URL": "http://localhost:9090" } } } }
The Prometheus MCP server supports the following arguments. They can be provided in the JSON configuration above as part of the "args" list:
> npx prometheus-mcp@latest --help

Commands:
  stdio  Start Prometheus MCP server using stdio transport
  http   Start Prometheus MCP server using HTTP transport

Options:
  --help     Show help            [boolean]
  --version  Show version number  [boolean]
You can also configure the server using environment variables:
PROMETHEUS_URL - Prometheus server URL
ENABLE_DISCOVERY_TOOLS - Set to "false" to disable discovery tools (default: true)
ENABLE_INFO_TOOLS - Set to "false" to disable info tools (default: true)
ENABLE_QUERY_TOOLS - Set to "false" to disable query tools (default: true)
When running in server environments or when you need HTTP transport, run the MCP server with the http command:
npx prometheus-mcp@latest http --port 3000
Then, in your MCP client config, point to the HTTP endpoint, for example via mcp-remote:
{ "mcpServers": { "prometheus": { "command": "npx", "args": ["mcp-remote", "http://localhost:3000/mcp"] } } }
Run the Prometheus MCP server using Docker:
{ "mcpServers": { "prometheus": { "command": "docker", "args": [ "run", "-i", "--rm", "--init", "--pull=always", "-e", "PROMETHEUS_URL=http://host.docker.internal:9090", "ghcr.io/idanfishman/prometheus-mcp", "stdio" ] } } }
The Prometheus MCP server provides 10 tools organized into three configurable categories:
Tools for exploring your Prometheus infrastructure:

prometheus_list_metrics
prometheus_metric_metadata
  - metric (string): Metric name to get metadata for
prometheus_list_labels
prometheus_label_values
  - label (string): Label name to get values for
prometheus_list_targets
prometheus_scrape_pool_targets
  - scrapePool (string): Scrape pool name
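For instance, the arguments for a prometheus_metric_metadata call might look like this (the metric name is illustrative):

{
  "metric": "http_requests_total"
}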
Tools for accessing Prometheus server information:
prometheus_runtime_info
prometheus_build_info
Tools for executing Prometheus queries:
prometheus_query
  - query (string): Prometheus query expression
  - time (string, optional): Time parameter for the query (RFC3339 format)
prometheus_query_range
  - query (string): Prometheus query expression
  - start (string): Start timestamp (RFC3339 or unix timestamp)
  - end (string): End timestamp (RFC3339 or unix timestamp)
  - step (string): Query resolution step width
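For example, the arguments for a prometheus_query_range call might look like the following sketch (the query and time range are illustrative):

{
  "query": "rate(http_requests_total[5m])",
  "start": "2024-01-01T00:00:00Z",
  "end": "2024-01-01T01:00:00Z",
  "step": "1m"
}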
Here are some example interactions you can have with your AI assistant, such as asking about the http_requests_total metric.

This project is licensed under the MIT License - see the LICENSE file for details.
Built with ❤️ for the Prometheus and MCP communities