
Fabric MCP
Bridge connecting the Fabric AI framework to Model Context Protocol (MCP) compatible applications.
Connect the power of the Fabric AI framework to any Model Context Protocol (MCP) compatible application.
This project implements a standalone server that bridges the gap between Daniel Miessler's Fabric framework and the Model Context Protocol (MCP). It allows you to use Fabric's patterns, models, and configurations directly within MCP-enabled environments like IDE extensions or chat interfaces.
Imagine seamlessly using Fabric's specialized prompts for code explanation, refactoring, or creative writing right inside your favorite tools!
Under the hood, the `fabric-mcp` server connects to a running Fabric REST API instance (`fabric --serve`). It exposes Fabric's capabilities as MCP tools (such as `fabric_run_pattern`), which MCP clients discover via MCP's `list_tools()` mechanism. When a client invokes one of these tools, the request is forwarded to the `fabric --serve` instance, and that instance executes the pattern (see the client sketch below).

This project is currently in the implementation phase.
The core architecture and proposed tools are outlined in the High-Level Design Document.
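To make that flow concrete, here is a minimal client-side sketch using the official MCP Python SDK (the `mcp` package, an assumed extra dependency): it launches `fabric-mcp` over stdio, lists the exposed tools, and hints at how a pattern run would be invoked. The exact arguments of `fabric_run_pattern` are defined by its tool schema and are not documented here.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch fabric-mcp as a stdio MCP server; a `fabric --serve`
    # instance must be reachable for the tools to do real work.
    server = StdioServerParameters(command="fabric-mcp", args=["--stdio"])

    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the Fabric tools exposed via MCP's list_tools() mechanism.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Running a pattern would then go through session.call_tool(), e.g.
            #   await session.call_tool("fabric_run_pattern", {...})
            # using whatever arguments the tool schema above reports.


asyncio.run(main())
```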
Install Task Master globally:

```bash
# you can also use pnpm if you prefer
npm install -g task-master-ai
```
And occasionally you should upgrade it:
```bash
# or use "pnpm upgrade -g task-master-ai"
npm upgrade -g task-master-ai
```
Read the Task Master docs for how to set up your `.env` file with the appropriate API keys.
These instructions will get you a copy of the project up and running on your local machine for development and testing purposes.
Clone the repository:

```bash
git clone https://github.com/ksylvan/fabric-mcp.git
cd fabric-mcp
```
Install dependencies using uv sync:
```bash
uv sync --dev
```

This command ensures your virtual environment matches the dependencies in `pyproject.toml` and `uv.lock`, creating the environment on the first run if necessary.
Activate the virtual environment (uv will create it if needed):
On macOS/Linux:

```bash
source .venv/bin/activate
```

On Windows:

```
.venv\Scripts\activate
```
Now you have the development environment set up!
If you just want to use the `fabric-mcp` server without developing it, you can install it directly from PyPI:

```bash
# Using pip
pip install fabric-mcp

# Or using uv
uv pip install fabric-mcp
```
This will install the package and its dependencies. You can then run the server using the `fabric-mcp` command.
The `fabric-mcp` server can be configured using the following environment variables:
- `FABRIC_BASE_URL`: The base URL of the running Fabric REST API server (`fabric --serve`). Default: `http://127.0.0.1:8080`.
- `FABRIC_API_KEY`: The API key required to authenticate with the Fabric REST API server, if it's configured to require one.
- `FABRIC_MCP_LOG_LEVEL`: Sets the logging verbosity for the `fabric-mcp` server itself. Valid values: `DEBUG`, `INFO`, `WARNING`, `ERROR`, `CRITICAL` (case-insensitive). Default: `INFO`.
You can set these variables in your shell environment (or put them into a `.env` file in the working directory) before running `fabric-mcp`:
```bash
export FABRIC_BASE_URL="http://your-fabric-host:port"
# This must match the key used by fabric --serve
export FABRIC_API_KEY="your_secret_api_key"
export FABRIC_MCP_LOG_LEVEL="DEBUG"

# Standard I/O transport (default)
fabric-mcp --stdio

# HTTP Streamable transport for HTTP-based MCP clients
fabric-mcp --http-streamable

# Custom host/port for HTTP transport
fabric-mcp --http-streamable --host 0.0.0.0 --port 3000 --mcp-path /message
```
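As an illustration only (not the actual `fabric-mcp` source), the configuration above boils down to environment lookups with the documented defaults:

```python
import os

# Sketch only: mirrors the variables and defaults documented above,
# not the real fabric-mcp implementation.
base_url = os.environ.get("FABRIC_BASE_URL", "http://127.0.0.1:8080")
api_key = os.environ.get("FABRIC_API_KEY")  # optional; only needed if fabric --serve requires one
log_level = os.environ.get("FABRIC_MCP_LOG_LEVEL", "INFO").upper()

print(f"Fabric API: {base_url} | log level: {log_level} | API key set: {api_key is not None}")
```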
The `fabric-mcp` server supports multiple transport methods:

- `--stdio`: Standard I/O transport for direct MCP client integration (default)
- `--http-streamable`: HTTP-based transport that runs a full HTTP server for MCP communication
- `--host`: Server bind address (default: `127.0.0.1`)
- `--port`: Server port (default: `8000`)
- `--mcp-path`: MCP endpoint path (default: `/message`)

For more details on transport configuration, see the Infrastructure and Deployment Overview.
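For HTTP-based clients, a connection sketch might look like the following; it assumes the MCP Python SDK's streamable HTTP client helper (`streamablehttp_client`) and the default host, port, and MCP path listed above:

```python
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client


async def main() -> None:
    # Assumes `fabric-mcp --http-streamable` is running with the
    # default host (127.0.0.1), port (8000), and MCP path (/message).
    url = "http://127.0.0.1:8000/message"

    async with streamablehttp_client(url) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


asyncio.run(main())
```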
Feedback on the design document is highly welcome! Please open an issue to share your thoughts or suggestions.
Read the contribution document here and please follow the guidelines for this repository.
Also refer to the cheat-sheet for contributors which contains a micro-summary of the development workflow.
Copyright (c) 2025, Kayvan Sylvan. Licensed under the MIT License.