
Kafka MCP Server

MCP server for Apache Kafka operations through a standardized interface for LLM models.
A Model Context Protocol (MCP) server for Apache Kafka implemented in Go, leveraging franz-go and mcp-go.
This server provides an implementation for interacting with Kafka via the MCP protocol, enabling LLM models to perform common Kafka operations through a standardized interface.
The Kafka MCP Server bridges the gap between LLM models and Apache Kafka, allowing them to inspect clusters, describe topics and consumer groups, and diagnose issues such as consumer lag, all through the standardized Model Context Protocol (MCP).
```mermaid
graph TB
    subgraph "MCP Client (AI Applications)"
        A[Claude Desktop]
        B[Cursor]
        C[Windsurf]
        D[ChatWise]
    end
    subgraph "Kafka MCP Server"
        E[MCP Protocol Handler]
        F[Tools Registry]
        G[Resources Registry]
        H[Prompts Registry]
        I[Kafka Client Wrapper]
    end
    subgraph "Apache Kafka Cluster"
        J[Broker 1]
        K[Broker 2]
        L[Broker 3]
        M[Topics & Partitions]
        N[Consumer Groups]
    end
    A --> E
    B --> E
    C --> E
    D --> E
    E --> F
    E --> G
    E --> H
    F --> I
    G --> I
    H --> I
    I --> J
    I --> K
    I --> L
    J --> M
    K --> M
    L --> M
    J --> N
    K --> N
    L --> N
    classDef client fill:#e1f5fe
    classDef mcp fill:#f3e5f5
    classDef kafka fill:#fff3e0
    class A,B,C,D client
    class E,F,G,H,I mcp
    class J,K,L,M,N kafka
```
How it works: MCP clients (Claude Desktop, Cursor, Windsurf, ChatWise) send requests over the MCP protocol to the server, whose protocol handler dispatches them to the tools, resources, and prompts registries; these use the Kafka client wrapper to communicate with the brokers in the cluster.
The easiest way to install kafka-mcp-server is using Homebrew:
```bash
# Add the tap repository
brew tap tuannvm/mcp

# Install kafka-mcp-server
brew install kafka-mcp-server
```
To update to the latest version:
```bash
brew update && brew upgrade kafka-mcp-server
```
```bash
# Clone the repository
git clone https://github.com/tuannvm/kafka-mcp-server.git
cd kafka-mcp-server

# Build the server
go build -o kafka-mcp-server ./cmd
```
This MCP server can be integrated with several AI applications. Below are platform-specific instructions:
Edit `~/.cursor/mcp.json` and add the kafka-mcp-server configuration:
```json
{
  "mcpServers": {
    "kafka": {
      "command": "kafka-mcp-server",
      "args": [],
      "env": {
        "KAFKA_BROKERS": "localhost:9092",
        "KAFKA_CLIENT_ID": "kafka-mcp-server",
        "MCP_TRANSPORT": "stdio"
      }
    }
  }
}
```
Edit your Claude configuration file and add the server:

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`

```json
{
  "mcpServers": {
    "kafka": {
      "command": "kafka-mcp-server",
      "args": [],
      "env": {
        "KAFKA_BROKERS": "localhost:9092",
        "KAFKA_CLIENT_ID": "kafka-mcp-server",
        "MCP_TRANSPORT": "stdio"
      }
    }
  }
}
```
Restart Claude Desktop to apply changes.
To use with Claude Code, add the server using the built-in MCP configuration command:
```bash
# Add kafka-mcp-server with environment variables
claude mcp add kafka \
  --env KAFKA_BROKERS=localhost:9092 \
  --env KAFKA_CLIENT_ID=kafka-mcp-server \
  --env MCP_TRANSPORT=stdio \
  --env KAFKA_SASL_MECHANISM= \
  --env KAFKA_SASL_USER= \
  --env KAFKA_SASL_PASSWORD= \
  --env KAFKA_TLS_ENABLE=false \
  -- kafka-mcp-server
```
Other useful commands:
```bash
# List configured MCP servers
claude mcp list

# Remove server
claude mcp remove kafka

# Test server connection
claude mcp get kafka
```
This registers the following configuration:

| Setting | Value |
|---|---|
| Server name | `kafka` |
| Command | `kafka-mcp-server` |
| Environment | `KAFKA_BROKERS=localhost:9092`, `KAFKA_CLIENT_ID=kafka-mcp-server`, `MCP_TRANSPORT=stdio` |
Managing MCP server configurations across multiple clients can become challenging. mcpenetes is a dedicated tool that makes this process significantly easier:
```bash
# Install mcpenetes
go install github.com/tuannvm/mcpenetes@latest
```
```bash
# Search for available MCP servers including kafka-mcp-server
mcpenetes search

# Apply kafka-mcp-server configuration to all your clients at once
mcpenetes apply

# Load a configuration from your clipboard
mcpenetes load
```
With mcpenetes, you can maintain multiple Kafka configurations (development, production, etc.) and switch between them instantly across all your clients (Cursor, Claude Desktop, Windsurf, ChatWise) without manually editing each client's configuration files.
The server exposes the following tools for Kafka interaction. For detailed documentation including examples and sample responses, see docs/tools.md.
The server provides the following resources that can be accessed through the MCP protocol. For detailed documentation including example responses, see docs/resources.md.
The server includes the following pre-configured prompts for Kafka operations and diagnostics. For detailed documentation including arguments and example responses, see docs/prompts.md.
Here's an interaction example showing how an AI assistant might use these tools to help a user:
User Query: "We need to investigate why our order processing pipeline is falling behind. Can you help me check the consumer lag?"
AI Assistant's workflow:
1. First, check the consumer groups
   > Using the `list_consumer_groups` tool
   > Discovers the "order-processor" group
2. Then, examine the consumer group details
   > Using the `describe_consumer_group` tool with `group_id` "order-processor" and `include_offsets=true`
   > Discovers lag on several partitions
3. Get a cluster overview to check general health
   > Using the `cluster_overview` tool
   > Confirms no under-replicated partitions or offline brokers
4. Generate a consumer lag report
   > Using the `consumer_lag_report` prompt with `threshold=1000`
   > Provides a detailed analysis of lag across consumer groups
5. Provide actionable recommendations:
Based on my analysis, your order-processor consumer group has significant lag on topic orders, partitions 2 and 3, with lag values of 15,420 and 12,305 messages respectively.
The cluster itself is healthy with no under-replicated partitions, suggesting this is an issue with the consumer application rather than Kafka infrastructure.
I recommend:
1. Check the consumer instances processing those partitions for slowdowns or errors
2. Consider scaling up your consumer group by adding more instances
3. Review the consumer configuration, particularly max.poll.records and max.poll.interval.ms
4. Examine any processing bottlenecks in the consumer logic
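The lag figures in a report like this are simply the distance between each partition's latest offset (high watermark) and the consumer group's committed offset. A minimal sketch of that computation, assuming illustrative names and clamping behavior (this is not the server's actual code):

```go
package main

import "fmt"

// PartitionLag computes consumer lag for one partition: the distance
// between the broker's latest offset (high watermark) and the group's
// committed offset. A committed offset of -1 means nothing has been
// committed yet, so the entire partition backlog counts as lag.
func PartitionLag(highWatermark, committed int64) int64 {
	if committed < 0 {
		return highWatermark
	}
	lag := highWatermark - committed
	if lag < 0 {
		return 0 // a committed offset can briefly exceed a stale watermark
	}
	return lag
}

func main() {
	// Illustrative figures: a partition whose consumer is 15,420 messages behind.
	fmt.Println(PartitionLag(2_000_000, 1_984_580)) // prints 15420
}
```

A report aggregates this per-partition figure across all topics and partitions a group subscribes to, flagging any value above the requested threshold.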
This seamless workflow demonstrates how the Kafka MCP tools enable LLM models to perform sophisticated diagnostics and provide actionable insights.
The server can be configured using the following environment variables:
| Variable | Description | Default |
|---|---|---|
| `KAFKA_BROKERS` | Comma-separated list of Kafka broker addresses | `localhost:9092` |
| `KAFKA_CLIENT_ID` | Kafka client ID used for connections | `kafka-mcp-server` |
| `MCP_TRANSPORT` | MCP transport method (`stdio`/`http`) | `stdio` |
| `KAFKA_SASL_MECHANISM` | SASL mechanism: `plain`, `scram-sha-256`, `scram-sha-512`, or `""` (disabled) | `""` |
| `KAFKA_SASL_USER` | Username for SASL authentication | `""` |
| `KAFKA_SASL_PASSWORD` | Password for SASL authentication | `""` |
| `KAFKA_TLS_ENABLE` | Enable TLS for the Kafka connection (`true` or `false`) | `false` |
| `KAFKA_TLS_INSECURE_SKIP_VERIFY` | Skip TLS certificate verification (`true` or `false`) | `false` |
Security Note: When `KAFKA_TLS_INSECURE_SKIP_VERIFY=true` is set, the server will skip TLS certificate verification. This should only be used in development or testing environments, or when using self-signed certificates.
The server is designed with enterprise-grade security in mind, supporting SASL authentication (PLAIN and SCRAM) and TLS encryption for Kafka connections.
Comprehensive test coverage ensures reliability:
```bash
# Run all tests (requires Docker for integration tests)
go test ./...

# Run tests excluding integration tests
go test -short ./...

# Run integration tests with specific Kafka brokers
export KAFKA_BROKERS="your-broker:9092"
export SKIP_KAFKA_TESTS="false"
go test ./kafka -v -run Test
```
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License - see the LICENSE file for details.