PaellaDoc AI Development Framework
AI-First Development framework implementing Model Context Protocol for software creation.
Perfect AI development, like perfect paella: quality ingredients, structure, and expertise.
⭐ If you find PAELLADOC useful, please consider starring the repo! ⭐
Version 0.3.7: Hotfix release restoring core project CRUD tools inadvertently omitted in v0.3.6 build. Check the CHANGELOG for details!
"In the AI era, context isn't supplementary to code—it's the primary creation."
PAELLADOC is an AI-First Development framework that implements the 5 Philosophical Principles of AI-First Development, transforming how we create software in the age of AI.
PAELLADOC implements Anthropic's Model Context Protocol (MCP) (see Anthropic's news). This protocol provides a structured way for Large Language Models (LLMs) to interact with external tools and context, enabling more sophisticated capabilities.
By implementing MCP, PAELLADOC allows LLMs to leverage its specific AI-First development tools and workflows directly through this standard. This approach facilitates functionalities similar to Tool Use or Function Calling seen in other platforms, but specifically utilizes the Anthropic MCP standard for interaction.
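Under the hood, MCP messages are JSON-RPC 2.0 objects exchanged over a transport such as stdio. As a rough illustration of the wire format (a sketch only, not PAELLADOC's internal API; the tool name and arguments are illustrative):

```python
import json

# A minimal JSON-RPC 2.0 request, as used by MCP's "tools/call" method.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "ping",       # illustrative tool name
        "arguments": {},
    },
}

# Over the stdio transport, each message is serialized as JSON text.
wire_message = json.dumps(request)
print(wire_message)
```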
Traditional development treats documentation as an afterthought. AI-First Development inverts this paradigm:
```python
# Traditional Way
write_code() -> document()

# PAELLADOC Way
create_context() -> manifest_as_code()
```
```mermaid
graph TD
    A[Business Intent] --> B[Context Creation]
    B --> C[Architecture Manifestation]
    C --> D[Code Generation]
    D --> E[Living Documentation]
```
```shell
# Knowledge evolves with your system
paella continue my-project
```
```python
# Not just code generation, but true collaboration
with paelladoc.context() as ctx:
    ctx.understand_intent()
    ctx.propose_solutions()
    ctx.implement_with_human()
```
```yaml
decision:
  id: uuid-123
  intent: "Why we chose this path"
  context: "What we knew at the time"
  alternatives: "What we considered"
  implications: "Future impact"
```
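A record like that could be modeled in code as well. A minimal sketch (the class and field names are illustrative, not PAELLADOC's actual schema):

```python
from dataclasses import asdict, dataclass
import uuid

@dataclass
class Decision:
    """An architectural decision captured with its full context."""
    id: str
    intent: str        # Why we chose this path
    context: str       # What we knew at the time
    alternatives: str  # What we considered
    implications: str  # Future impact

decision = Decision(
    id=str(uuid.uuid4()),
    intent="Why we chose this path",
    context="What we knew at the time",
    alternatives="What we considered",
    implications="Future impact",
)
print(asdict(decision))
```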
PAELLADOC is a Python application and should be installed in its own dedicated Python virtual environment. This keeps its dependencies separate and avoids conflicts. You'll need one PAELLADOC environment, regardless of how many different projects (Python, JS, Ruby, etc.) you plan to document.
(Requires Python 3.12 or later)
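If you are unsure which interpreter you have, a quick check might look like this (a sketch; `meets_requirement` is a hypothetical helper, not part of PAELLADOC):

```python
import sys

def meets_requirement(version_info: tuple) -> bool:
    """Return True if the interpreter satisfies the 3.12+ requirement."""
    return version_info >= (3, 12)

if __name__ == "__main__":
    ok = meets_requirement(sys.version_info[:2])
    print("Python", sys.version.split()[0], "- OK" if ok else "- too old, need 3.12+")
```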
To install PAELLADOC for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install @jlcases/paelladoc --client claude
First, choose a permanent location for this environment. Your home directory is often a good choice.
```shell
# Navigate to where you want to store the environment (e.g., your home directory)
# cd ~  # Uncomment and run if you want it in your home directory

# Create the virtual environment (using python3.12 or your installed 3.12+ version)
# We'll name the folder '.paelladoc_venv' (starting with a dot makes it hidden)
python3.12 -m venv .paelladoc_venv

# Activate the environment
# (The command depends on your shell. Use ONE of the following)

# For Bash/Zsh:
source .paelladoc_venv/bin/activate

# For Fish:
# source .paelladoc_venv/bin/activate.fish

# For PowerShell (Windows):
# .\.paelladoc_venv\Scripts\activate.ps1
```
(You should see `(.paelladoc_venv)` at the beginning of your terminal prompt now.)
```shell
# Make sure your (.paelladoc_venv) prompt is visible before running pip
pip install paelladoc
```
PAELLADOC needs to know where to store its memory database (`memory.db`). There are two main ways to configure this:
Option 1: Environment Variable (Less Reliable for LLM Integration)
You can set the `PAELLADOC_DB_PATH` environment variable. This works well if you run PAELLADOC directly from your terminal.
```shell
# Example: Set the variable in your current terminal session
export PAELLADOC_DB_PATH="$HOME/.paelladoc/memory.db"

# Optional: Add the export line to your shell's startup file
# (.bashrc, .zshrc, etc.) for it to persist across sessions.
```
Important: When PAELLADOC is run by an LLM tool (like Cursor via MCP), it might not inherit environment variables set this way. Therefore, this method is less reliable for LLM integration.
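Conceptually, the lookup is just an environment-variable read with a home-directory fallback. A sketch of that logic (not PAELLADOC's actual source; `resolve_db_path` is a hypothetical helper):

```python
import os
from pathlib import Path

def resolve_db_path(env: dict) -> Path:
    """Use PAELLADOC_DB_PATH if set; otherwise fall back to ~/.paelladoc/memory.db."""
    configured = env.get("PAELLADOC_DB_PATH")
    if configured:
        return Path(configured)
    return Path.home() / ".paelladoc" / "memory.db"

print(resolve_db_path(dict(os.environ)))
```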
Option 2: MCP Configuration (Recommended for LLM Integration)
The most reliable way to ensure your LLM tool uses the correct database path is to configure it directly within the tool's MCP JSON file (`.cursor/mcp.json` for Cursor). This injects the variable directly into the server process launched by the LLM.
See the examples in the next section.
Now, tell your LLM tool (like Cursor) how to find and run PAELLADOC.
Key Information Needed: the absolute path to the `python` executable inside your `.paelladoc_venv`.

Edit your `.cursor/mcp.json` file. Add a server configuration for PAELLADOC. Here's a typical example:
```json
{
  "mcpServers": {
    "Paelladoc": {
      "command": "/absolute/path/to/.paelladoc_venv/bin/python",
      "args": [
        "-m",
        "paelladoc.ports.input.mcp_server_adapter",
        "--stdio"
      ],
      "cwd": "/path/to/your/project/directory", // Optional: Set working directory
      "env": {
        // Recommended for local dev: Use a DB in your project folder
        "PAELLADOC_DB_PATH": "/path/to/your/project/directory/paelladoc_memory.db",
        // Optional: Add src to PYTHONPATH if needed for local development imports
        "PYTHONPATH": "/path/to/your/project/directory/src:/path/to/your/project/directory"
      },
      "disabled": false
    }
  },
  "mcp.timeout": 120000
}
```
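Cursor tolerates `//` comments in this file, but strict JSON parsers do not. A rough sanity check you could run on your config (a sketch; `load_jsonc` is a hypothetical helper, and its naive comment stripping would break on `//` inside string values):

```python
import json

def load_jsonc(text: str) -> dict:
    """Parse JSON that may contain //-style line comments (naive stripping)."""
    stripped = "\n".join(line.split("//")[0] for line in text.splitlines())
    return json.loads(stripped)

config_text = """
{
  "mcpServers": {
    "Paelladoc": {
      "command": "/absolute/path/to/.paelladoc_venv/bin/python",
      "args": ["-m", "paelladoc.ports.input.mcp_server_adapter", "--stdio"]  // stdio transport
    }
  }
}
"""

config = load_jsonc(config_text)
server = config["mcpServers"]["Paelladoc"]
assert server["command"].endswith("python"), "command should point at the venv's python"
assert "--stdio" in server["args"]
print("config looks sane")
```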
Important Notes:

- The `command` path must be the absolute path to the Python executable inside your `.paelladoc_venv` (created in Step 1). Replace `/absolute/path/to/` with the actual path on your system (e.g., `/Users/your_username/`).
- If `PAELLADOC_DB_PATH` is not set in `env`, PAELLADOC uses `~/.paelladoc/memory.db`.
- Setting `PAELLADOC_DB_PATH` in the `env` section (as shown in the example) is the recommended and most reliable approach. Replace `/path/to/your/project/directory/` with the actual path to your project.
- Working directory (`cwd`): Setting this to your project directory can be helpful but is often optional.
- Adding `PYTHONPATH` to `env` might be necessary if you are doing local development on PAELLADOC itself and need the server to find your source code.

Once connected, your LLM will have access to all PAELLADOC commands:
- `PAELLA`: Start new documentation projects
- `CONTINUE`: Continue existing documentation
- `VERIFY`: Verify documentation coverage
- `GENERATE`: Generate documentation or code

The LLM will handle all the complexity; you just need to express your intent in natural language!
- Stable (PyPI): Releases installed via `pip install paelladoc` are stable and recommended for general use.
- Development (GitHub): The `main` branch (and other branches) on the GitHub repository contains the latest development code. This version may include new features or changes that are not yet fully tested and should be considered unstable. Use this version if you want to try out cutting-edge features or contribute to development.

Note on Current Development: Active development is currently focused internally on delivering an MVP with significant new capabilities. While the PyPI version remains stable, expect major advancements in future releases as we work towards this goal in a more private setting for now.
Ensure PAELLADOC is installed (`pip install paelladoc`) and configured in your LLM's tool/MCP settings (see examples above).
Start interacting with PAELLADOC through your LLM by issuing a command. The primary command to initiate a new project or list existing ones is `PAELLA`. You can type it directly:

```
PAELLA
```

Or phrase it naturally:

- "Use PAELLADOC to start documenting a new project."
- "Tell PAELLADOC I want to create documentation."
Follow the LLM's lead: PAELLADOC (via the LLM) will then guide you through the process interactively, asking for project details, template choices, etc.
This version provides the following core commands, exposed via MCP for interaction with your LLM:

- `ping`: Basic connectivity check. Takes no required parameters (an optional `random_string` is accepted) and returns `{ "status": "ok", "message": "pong" }`.
- `paella_init`: Initializes a new documentation project. Parameters: `base_path` (str), `documentation_language` (str, e.g., "es-ES"), `interaction_language` (str, e.g., "en-US"), `new_project_name` (str).
- `paella_list`: Lists existing projects (returned under `projects`).
- `paella_select`: Selects an existing project to work on. Parameters: `project_name` (str).
- `core_continue`: Continues work on an existing project. Parameters: `project_name` (str).
- `core_help`: Shows help for the available commands.
- `core_list_projects`: (Similar to `paella_list`) Lists the names of existing PAELLADOC projects. Parameters: `db_path` (str, optional, for testing). Returns project names under `projects`.
- `core_verification`: Verifies documentation coverage.
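The `ping` response, for example, is plain JSON and easy to check programmatically (a sketch; the response string matches the shape documented above):

```python
import json

# The documented ping response shape.
raw_response = '{ "status": "ok", "message": "pong" }'

response = json.loads(raw_response)
if response["status"] == "ok":
    print("server reachable:", response["message"])
```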
Based on the Unified Roadmap, future versions aim to include:

- Documentation generation (`GENERATE-DOC`).
- Context generation (`GENERATE_CONTEXT`).
- Code generation (`code_generation`).
- Coding style and Git workflow support (`styles.coding_styles`, `styles.git_workflows`).
- Project memory records (`DECISION`, `ISSUE`, `ACHIEVEMENT`).

Our AI-First taxonomy ensures complete context preservation:
).Our AI-First taxonomy ensures complete context preservation: