GPT-Vis MCP Server

A local, privacy-friendly wrapper around antvis/mcp-server-chart that generates charts on your own machine, without external server dependencies.


✨ Features

  • 🔒 Private & Secure: Local chart generation, no external dependencies
  • 🚀 Easy Setup: One command installation for Claude Desktop
  • 🎨 Rich Charts: 20+ chart types (pie, line, bar, radar, maps, etc.)
  • 📊 Enterprise Ready: Perfect for secure environments

🚀 Usage

Claude Desktop (Recommended)

Add to your Claude Desktop MCP settings:

Option 1: NPX (Recommended)

{ "mcpServers": { "gpt-vis-mcp": { "command": "npx", "args": ["-y", "@jsr2npm/yao__gpt-vis-mcp@latest"] } } }

Option 2: Deno Direct

{ "mcpServers": { "gpt-vis-mcp": { "command": "deno", "args": ["run", "--allow-all", "@yao/gpt-vis-mcp/bin"] } } }

Option 3: NPX + Deno

{ "mcpServers": { "gpt-vis-mcp": { "command": "npx", "args": ["-y", "deno", "run", "--allow-all", "@yao/gpt-vis-mcp/bin"] } } }


You may run into canvas dependency or font rendering issues when using npx. If so, try Option 4 (Docker).

Option 4: Docker

{ "mcpServers": { "gpt-vis-mcp": { "command": "docker", "args": [ "run", "--interactive", "--rm", "-v", "/tmp/gpt-vis-charts:/tmp/gpt-vis-charts", "ghcr.io/yaonyan/gpt-vis-mcp:latest-mcp" ] } } }

Direct Command Line Usage

You can also run the server directly with various transport modes:

```bash
# Using the JSR package (recommended)

# Default stdio mode (for MCP clients)
deno run --allow-all @yao/gpt-vis-mcp/bin

# SSE server mode
deno run --allow-all @yao/gpt-vis-mcp/bin --transport sse --port 3000 --host localhost

# Short form
deno run --allow-all @yao/gpt-vis-mcp/bin -t sse -p 3000 -h localhost

# Show help
deno run --allow-all @yao/gpt-vis-mcp/bin --help

# Alternative: using npx with deno

# Default stdio mode
npx -y deno run --allow-all @yao/gpt-vis-mcp/bin

# SSE server mode
npx -y deno run --allow-all @yao/gpt-vis-mcp/bin --transport sse --port 3000

# Show help
npx -y deno run --allow-all @yao/gpt-vis-mcp/bin --help
```

Transport Modes:

  • stdio: Direct MCP protocol communication via stdin/stdout (default)
  • sse: Server-Sent Events HTTP server for web clients
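
For a quick sanity check of the SSE transport, you can start the server in SSE mode and hold a streaming connection open with curl. This is a sketch that assumes the locally run server exposes the same `/sse` endpoint as the Docker image shown below:

```bash
# Start the server in SSE mode (same command as in the examples above)
deno run --allow-all @yao/gpt-vis-mcp/bin --transport sse --port 3000 --host localhost

# In another terminal, open a streaming connection to the SSE endpoint.
# -N disables curl's output buffering so events are printed as they arrive.
curl -N http://localhost:3000/sse
```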

Docker with SSE Mode

```bash
# Run with SSE server
docker run -p 3000:3000 ghcr.io/yaonyan/gpt-vis-mcp:latest-mcp --transport sse --port 3000 --host 0.0.0.0

# Access the SSE endpoint
curl http://localhost:3000/sse
```

Set environment variables as needed:

| Variable | Description | Default |
| --- | --- | --- |
| `RENDERED_IMAGE_PATH` | Chart images directory | system temp |
| `RENDERED_IMAGE_HOST_PATH` | Base URL for accessing images | (optional) |
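
For example, here is a sketch of the Docker invocation from Option 4 with both variables set explicitly; the host directory and the base URL are placeholders for your own setup:

```bash
# Write charts to a mounted host directory and advertise them under a base URL
# (replace /tmp/gpt-vis-charts and the example.com URL with your own values).
docker run --interactive --rm \
  -e RENDERED_IMAGE_PATH=/tmp/gpt-vis-charts \
  -e RENDERED_IMAGE_HOST_PATH=https://files.example.com/charts \
  -v /tmp/gpt-vis-charts:/tmp/gpt-vis-charts \
  ghcr.io/yaonyan/gpt-vis-mcp:latest-mcp
```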

Docker SSR Server

We also provide an SSR server that follows the private-deployment requirements described at https://github.com/antvis/mcp-server-chart?tab=readme-ov-file#-private-deployment

```bash
# Run SSR API server
docker run -p 3000:3000 \
  -e RENDERED_IMAGE_HOST_PATH=http://localhost:3000/charts \
  ghcr.io/yaonyan/gpt-vis-mcp:latest-http

# Test the SSR API
curl -X POST http://localhost:3000/generate \
  -H "Content-Type: application/json" \
  -d '{"type": "pie", "data": [{"category": "A", "value": 30}, {"category": "B", "value": 70}]}'
# Example response:
# {"success":true,"resultObj":"http://localhost:3000/charts/chart_1750500506056_T6IC0Vtp.png"}
```

You can then use the SSR server with the upstream @antv/mcp-server-chart by specifying the VIS_REQUEST_SERVER environment variable:

{ "mcpServers": { "mcp-server-chart": { "command": "npx", "args": ["-y", "@antv/mcp-server-chart"], "env": { "VIS_REQUEST_SERVER": "http://localhost:3000/generate" } } } }

This allows you to use the original MCP server while leveraging your private SSR endpoint for chart generation.
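
Putting the pieces together, here is a rough end-to-end sketch. In practice your MCP client launches @antv/mcp-server-chart from the JSON config above; the manual launch in step 2 only illustrates how the environment variable ties the two servers together:

```bash
# 1. Start the private SSR endpoint in the background
docker run -d -p 3000:3000 \
  -e RENDERED_IMAGE_HOST_PATH=http://localhost:3000/charts \
  ghcr.io/yaonyan/gpt-vis-mcp:latest-http

# 2. Run the upstream MCP server pointed at the private endpoint
#    (normally started by your MCP client using the JSON config shown above)
VIS_REQUEST_SERVER=http://localhost:3000/generate npx -y @antv/mcp-server-chart
```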


🤝 Contributing

See CONTRIBUTING.md for development setup.

📄 License

MIT License - see LICENSE
