
# llmbasedos

Secure bridge connecting LLMs with local files, mail, sync, and agents via the MCP protocol.

**llmbasedos** is a system designed to expose local capabilities (files, mail, sync, agents) to various "host" applications (LLM frontends, VS Code plugins, etc.) via the Model Context Protocol (MCP). It serves as a secure and standardized bridge between Large Language Models and your personal data and tools.

Primarily deployed via Docker, llmbasedos can also be built as a minimal Arch Linux based ISO for dedicated appliances.
## Architecture

The system is composed of several key Python components, typically running within a single Docker container managed by Supervisord:

### Gateway (`llmbasedos_pkg/gateway/`)

- Handles licence validation (key from `/etc/llmbasedos/lic.key`, tiers from `/etc/llmbasedos/licence_tiers.yaml`), authorization, and rate limiting.
- Discovers server capabilities from `/run/mcp/*.cap.json` files.
- Proxies `mcp.llm.chat` to configured LLMs (OpenAI, llama.cpp, etc., defined in `AVAILABLE_LLM_MODELS` in the gateway config), applying quotas.

### MCP Servers (`llmbasedos_pkg/servers/*/`)

All servers are built on the `llmbasedos.mcp_server_framework.MCPServer` base class and publish a `SERVICE_NAME.cap.json` to `/run/mcp/` for discovery by the gateway.

- **fs** (`servers/fs/`): File system operations (list, read, write, delete, semantic embed/search via SentenceTransformers/FAISS). Path access is confined within a configurable "virtual root" (e.g., `/mnt/user_data` in Docker). The FAISS index is stored in a persistent volume.
- **sync** (`servers/sync/`): Wrapper around `rclone` for file synchronization tasks. Requires `rclone.conf`.
- **mail** (`servers/mail/`): IMAP client for email access and iCalendar parsing. Accounts are configured in `/etc/llmbasedos/mail_accounts.yaml`.
- **agent** (`servers/agent/`): Executes agentic workflows defined in YAML files (from `/etc/llmbasedos/workflows`), potentially interacting with Docker (if Docker-in-Docker setup or socket passthrough) or HTTP services.
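The discovery mechanism amounts to scanning `/run/mcp/` for `*.cap.json` files. A minimal sketch of such a scan (the function name and the cap-file schema are illustrative, not the gateway's actual code):

```python
import glob
import json
import os


def discover_capabilities(run_dir: str = "/run/mcp") -> dict:
    """Load every *.cap.json published by the MCP servers, keyed by service name."""
    caps = {}
    for path in sorted(glob.glob(os.path.join(run_dir, "*.cap.json"))):
        service = os.path.basename(path).removesuffix(".cap.json")
        with open(path, encoding="utf-8") as f:
            caps[service] = json.load(f)
    return caps
```

Servers that crash without cleaning up leave stale cap files behind, which is why publishing and removal are centralized in the `MCPServer` base class.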
### Shell (`llmbasedos_pkg/shell/`)

- `luca-shell`: an interactive Python REPL (using `prompt_toolkit`) that runs on your host machine (or wherever you need a client).
- Sends commands (e.g., `ls`, `cat`, or direct MCP calls) to the gateway.

## Communication

- Client (`luca-shell`) to Gateway: WebSocket (e.g., `ws://localhost:8000/ws`).
- Gateway to MCP Servers: UNIX sockets (e.g., `/run/mcp/fs.sock`) with JSON messages delimited by `\0`.
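A client for that socket framing might look like the sketch below. Only the `\0` delimiter and the socket path come from this README; the JSON-RPC 2.0 envelope and the example method name are assumptions about the MCP wire format:

```python
import json
import socket


def call_mcp(sock_path: str, method: str, params: dict, req_id: int = 1) -> dict:
    """Send one \0-delimited JSON request over a UNIX socket and read the \0-delimited reply."""
    request = {"jsonrpc": "2.0", "id": req_id, "method": method, "params": params}
    with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as s:
        s.connect(sock_path)
        s.sendall(json.dumps(request).encode("utf-8") + b"\0")
        buf = b""
        while not buf.endswith(b"\0"):  # read until the delimiter arrives
            chunk = s.recv(4096)
            if not chunk:
                break
            buf += chunk
        return json.loads(buf.rstrip(b"\0").decode("utf-8"))


# Example (inside the container, assuming the fs server is up and such a method exists):
# call_mcp("/run/mcp/fs.sock", "mcp.fs.list", {"path": "/"})
```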
## Configuration & Secrets

- Secrets and API keys are supplied as environment variables (e.g., via an `.env` file with `docker compose`) and are not part of the image.
- Configuration files (`lic.key`, `mail_accounts.yaml`, `rclone.conf`) are mounted as read-only volumes into the container.
## Deployment with Docker (Recommended)

Requires Docker and the `docker compose` CLI (v2).

**Project layout.** Ensure the llmbasedos source code (`gateway/`, `servers/`, `shell/`, `mcp_server_framework.py`, `common_utils.py`) is inside a top-level directory (e.g., `llmbasedos_src/`) within your project root. This `llmbasedos_src/` directory will be treated as the `llmbasedos` Python package inside the Docker image.

**Configuration.** Prepare the configuration files referenced by `docker-compose.yml`:
- `.env`: defines `OPENAI_API_KEY` and other environment variables (e.g., `LLMBDO_LOG_LEVEL`).
- `lic.key`: example: `FREE:youruser:2025-12-31`
- `mail_accounts.yaml`: mail server accounts.
- `gateway/licence_tiers.yaml`: defines licence tiers (if you want to override defaults that might be in `gateway/config.py`).

Create a `./workflows/` directory and add your agent workflow YAML files, and a `./user_files/` directory with any files you want the FS server to access. Ensure `supervisord.conf` is present and correctly configured (especially the `directory` setting for each program).

**Build and run:**

```bash
docker compose build
docker compose up
```
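Piecing together the mounts and ports mentioned in this README, the `docker-compose.yml` might look roughly like the sketch below. The service/container name, host paths, and internal port come from this document; the `rclone.conf` target path and the `build` context are assumptions, and the image is assumed to launch Supervisord itself:

```yaml
services:
  llmbasedos_instance:
    build: .
    container_name: llmbasedos_instance
    env_file: .env
    ports:
      - "${LLMBDO_GATEWAY_EXPOSED_PORT:-8000}:8000"
    volumes:
      - ./llmbasedos_src:/opt/app/llmbasedos          # live code mount for development
      - ./lic.key:/etc/llmbasedos/lic.key:ro
      - ./mail_accounts.yaml:/etc/llmbasedos/mail_accounts.yaml:ro
      - ./rclone.conf:/etc/llmbasedos/rclone.conf:ro  # target path is an assumption
      - ./workflows:/etc/llmbasedos/workflows:ro
      - ./user_files:/mnt/user_data                   # FS server "virtual root"
```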
The gateway will then be reachable at `ws://localhost:8000/ws` (or the port configured via `LLMBDO_GATEWAY_EXPOSED_PORT` in `.env`).

**Connect a client.** Run `luca-shell` from your host machine (ensure its Python environment has the dependencies from `llmbasedos_src/shell/requirements.txt` installed):

```bash
# From project root, assuming venv is activated
python -m llmbasedos_src.shell.luca
```

In `luca-shell`, type `connect` (if not auto-connected), then `mcp.hello`.
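Besides `luca-shell`, any WebSocket client can talk to the gateway. A sketch using the third-party `websockets` library; the JSON-RPC 2.0 envelope is an assumption about the MCP message format, while `mcp.hello` is the handshake method used above:

```python
import asyncio
import json


def make_request(method: str, params: dict, req_id: int = 1) -> str:
    # JSON-RPC 2.0 envelope; the exact MCP message schema is an assumption here.
    return json.dumps({"jsonrpc": "2.0", "id": req_id, "method": method, "params": params})


async def main() -> None:
    import websockets  # third-party: pip install websockets (imported lazily)

    async with websockets.connect("ws://localhost:8000/ws") as ws:
        await ws.send(make_request("mcp.hello", {}))
        reply = json.loads(await ws.recv())
        print(reply)


if __name__ == "__main__":
    asyncio.run(main())
```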
## Development Workflow

Run `docker compose build` when the `Dockerfile` or `requirements.txt` files change. For code changes, just edit files in the `llmbasedos_src/` directory: `docker-compose.yml` mounts `./llmbasedos_src` into `/opt/app/llmbasedos` in the container, so no rebuild is needed. Then restart the affected processes:

```bash
docker compose restart llmbasedos_instance
# OR, for a specific service:
# docker exec -it llmbasedos_instance supervisorctl restart mcp-gateway
```

For configuration file changes (`supervisord.conf`, `licence_tiers.yaml`, etc.), a `docker compose restart llmbasedos_instance` is also sufficient.

## ISO Build (Advanced)

The `iso/` directory contains scripts for building a bootable Arch Linux ISO. This is a more complex deployment method; Docker is the preferred route for most use cases. (Refer to older README versions or `iso/build.sh` for details if needed.)
## Recent Refactoring Highlights

- Unified `MCPServer` framework in `llmbasedos_pkg/mcp_server_framework.py` for all backend servers (`fs`, `sync`, `mail`, `agent`), standardizing initialization, MCP method registration, socket handling, and capability publishing.
- Source restructured into a single package (`llmbasedos_src/` on host, becoming the `llmbasedos` package in Docker) for cleaner imports and module management.
- Gateway (`gateway/main.py`) updated to use FastAPI's `lifespan` manager for startup/shutdown events.
- Shell (`shell/luca.py`) refactored into a `ShellApp` class for better state and connection management.
- Updated usage of core libraries (`websockets`, `logging.config`).
- Licence tiers (`gateway/licence_tiers.yaml`) and mail accounts (`mail_accounts.yaml`) externalized.
- `HF_HOME` set for `fs_server` to resolve permission issues.
- Added `jsonschema` dependency for MCP parameter validation within the `MCPServer` framework.
- `supervisord.conf` now correctly sets working directories and includes sections for `supervisorctl` interaction.
- `Dockerfile` optimized with multi-stage builds and correct user/permission setup.
- `docker-compose.yml` configured for easy launch, volume mounting (including live code mounting for development), and environment variable setup.