Fastly API Integration
Comprehensive OpenAPI specification for the Fastly API with enhanced documentation and AI optimization.
This repository contains a comprehensive, unofficial OpenAPI 3.0 specification for the Fastly API, created by reverse engineering the publicly available API documentation. It features significantly enhanced documentation compared to the official web docs, with detailed descriptions, examples, and structured schemas optimized for both human developers and AI agents.
This project provides three key resources for working with the Fastly API:
- `fastly-openapi.yaml` - A comprehensive OpenAPI 3.0 schema for all Fastly API endpoints
- `fastly-openapi-mcp.yaml` - A streamlined subset optimized for AI agent consumption
- `fastly-mcp-server/` - An MCP server implementation that lets AI models interact with Fastly via a standardized protocol

The repository includes a full Model Context Protocol (MCP) server for Fastly, available on NPM:
```shell
# Install globally
npm install -g fastly-mcp-server

# Or run directly
npx fastly-mcp-server run
```
This MCP server enables AI assistants and agents to interact with Fastly services through a standardized protocol.
See the fastly-mcp-server directory for detailed usage examples and configuration options.
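Under the hood, MCP clients talk to the server over JSON-RPC 2.0, and a tool invocation is a `tools/call` request. A minimal sketch of the message shape follows; the tool name `list_services` is hypothetical, since the actual tool names come from the server's `tools/list` response:

```python
import json

# A Model Context Protocol client invokes a server tool via a JSON-RPC
# "tools/call" request. The tool name "list_services" is hypothetical;
# real names are discovered through the server's tools/list method.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "list_services",
        "arguments": {},
    },
}

# Serialize the request as it would be written to the server's stdin.
print(json.dumps(request, indent=2))
```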
This is an unofficial specification and is not endorsed, supported, or guaranteed by Fastly. It may be incomplete or contain inaccuracies. The specification is provided "as is" without warranty of any kind.
This repository contains two OpenAPI specifications:
The complete specification of the Fastly API, containing all endpoints, parameters, and schemas.
A streamlined subset of the API specifically optimized for AI agent interaction.
The MCP version is ideal for integration with AI assistants and tools that need to interact with Fastly through natural language interfaces.
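As a quick illustration of how a tool or agent might consume the specification, here is a minimal Python sketch that enumerates operations from an OpenAPI document. The inline fragment stands in for the real spec, which would first need converting from YAML to JSON (the Python standard library has no YAML parser); the two operations shown are illustrative examples:

```python
import json

# Minimal OpenAPI fragment standing in for fastly-openapi-mcp.yaml,
# converted to JSON. Only the fields needed for enumeration are shown.
spec = json.loads("""
{
  "openapi": "3.0.0",
  "paths": {
    "/service": {
      "get": {"operationId": "listServices", "summary": "List services"}
    },
    "/service/{service_id}/purge_all": {
      "post": {"operationId": "purgeAll", "summary": "Purge all content"}
    }
  }
}
""")

# Walk paths -> HTTP verbs -> operations, the core structure of any
# OpenAPI document, and print a one-line summary of each operation.
for path, methods in spec["paths"].items():
    for verb, op in methods.items():
        print(f"{verb.upper():6} {path}  ->  {op['operationId']}")
```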
You can generate interactive documentation from these specifications using tools like:
Examples:
```shell
# For the complete API
npx @redocly/cli preview-docs fastly-openapi.yaml

# For the AI-optimized subset
npx @redocly/cli preview-docs fastly-openapi-mcp.yaml
```
To validate the specifications:
```shell
# For the complete API
npx @stoplight/spectral-cli lint fastly-openapi.yaml

# Or use swagger-cli
npx swagger-cli validate fastly-openapi.yaml

# For the AI-optimized subset
npx @stoplight/spectral-cli lint fastly-openapi-mcp.yaml
```
These specifications can be used with OpenAPI code generators to create client libraries in various programming languages:
```shell
# For the complete API
npx @openapitools/openapi-generator-cli generate -i fastly-openapi.yaml -g javascript -o ./client

# For the AI-optimized subset
npx @openapitools/openapi-generator-cli generate -i fastly-openapi-mcp.yaml -g javascript -o ./client-mcp
```
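Whichever client you generate, authentication against the Fastly API comes down to sending your token in a `Fastly-Key` header. The following Python sketch only builds the request, so it runs without network access; `FASTLY_API_TOKEN` is a placeholder environment-variable name used for illustration:

```python
import os
import urllib.request

# Fastly authenticates requests with a Fastly-Key header carrying an API
# token. FASTLY_API_TOKEN is a placeholder variable name for this sketch.
token = os.environ.get("FASTLY_API_TOKEN", "example-token")

req = urllib.request.Request(
    "https://api.fastly.com/service",
    headers={"Fastly-Key": token, "Accept": "application/json"},
)

# urllib.request.urlopen(req) would list the services visible to this
# token; here we only inspect the prepared request. Note that urllib
# normalizes header names to capitalized form ("Fastly-key").
print(req.full_url)
print(req.get_header("Fastly-key"))
```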
Both specifications are designed to be "agent-ready" - optimized for use with AI agents and tools - and follow best practices for machine readability.
The MCP version takes agent-readiness even further.
For specific use cases for the MCP specification, see subset.md which outlines common conversational tasks and the corresponding API workflows.
To use the MCP server with your AI assistant configuration:
With Bun (preferred):

```json
{
  "mcpServers": {
    "fastly api": {
      "command": "bunx",
      "args": ["fastly-mcp-server@latest", "run"],
      "env": {
        "API_KEY_APIKEYAUTH": "your-fastly-api-key"
      }
    }
  }
}
```
With npm:

```json
{
  "mcpServers": {
    "fastly": {
      "command": "npx",
      "args": ["-y", "fastly-mcp-server@latest", "run"],
      "env": {
        "API_KEY_APIKEYAUTH": "your-fastly-api-key"
      }
    }
  }
}
```
Note: Bun is the preferred runtime for fastly-mcp-server due to its faster startup and better performance.
See the fastly-mcp-server documentation for more details on configuration and usage.