
OWL
MCP server for creating and managing OWL ontologies through AI assistants
OWL-MCP is a Model Context Protocol (MCP) server for working with Web Ontology Language (OWL) ontologies.
This guide walks you through using owl-mcp with Goose, but any MCP-enabled AI host will work.
You can use either the Desktop or CLI version of Goose from here:
Follow the instructions for setting up an LLM provider (Anthropic recommended)
You can either install directly from this link:
Or, to do this manually, add a new entry for owl-mcp in the Extensions section of Goose:
uvx owl-mcp
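If you configure Goose via its config file rather than the UI, the entry might look roughly like the sketch below. The field names (`type`, `cmd`, `args`, `enabled`) are assumptions based on Goose's stdio-extension config format; check the current Goose documentation for the exact schema.

```yaml
extensions:
  owl-mcp:
    enabled: true
    type: stdio
    cmd: uvx
    args:
      - owl-mcp
```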
This video shows how to do this manually:
You can ask Goose to create an ontology and to add axioms to it:
The MCP server provides function calls for finding, adding, or removing OWL axioms, expressed in OWL functional syntax. Each call takes the file path of the OWL file on your disk. Any format supported by py-horned-owl is accepted (we follow OBO guidelines and recommend functional syntax for the source format).
The server keeps an instance of the ontology in memory and syncs it with disk: each CRUD operation updates the in-memory model and immediately writes the change back to the file. If you have Protege open on the same file, it will also pick up the changes from disk and show the updates.
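The sync pattern can be sketched as follows. This is a minimal illustration with hypothetical names, not the owl-mcp implementation: the in-memory model here is just a set of axiom strings, and every mutation rewrites the file so that external tools watching the file (such as Protege) see the change.

```python
import os

class OntologySession:
    """Toy in-memory ontology that mirrors every change to disk."""

    def __init__(self, path: str):
        self.path = path
        self.axioms: set[str] = set()
        if os.path.exists(path):
            with open(path) as f:
                self.axioms = {line.strip() for line in f if line.strip()}

    def _sync(self) -> None:
        # Write the full in-memory model back to disk after every mutation.
        with open(self.path, "w") as f:
            for axiom in sorted(self.axioms):
                f.write(axiom + "\n")

    def add_axiom(self, axiom: str) -> None:
        self.axioms.add(axiom)
        self._sync()

    def remove_axiom(self, axiom: str) -> None:
        self.axioms.discard(axiom)
        self._sync()
```

Rewriting the whole file on each operation is the simplest way to guarantee that the on-disk copy never diverges from the in-memory model; a real server would likely batch or debounce writes for large ontologies.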
The server is well adapted to working with OBO-style ontologies: when OWL strings are sent back to the client, labels for opaque IDs are included after `#` as comments, as is common for obo-format.
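For example, a returned axiom might look like the line below, with human-readable labels appended as a comment after the opaque identifiers (the IDs and labels here are real GO terms, but the exact comment formatting may differ from what the server emits):

```
SubClassOf(GO:0005634 GO:0043231) # nucleus SubClassOf intracellular membrane-bounded organelle
```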