
LLM Based OS
Secure bridge connecting LLMs with local files, mail, sync, and agents via MCP protocol.
A local-first runtime for tool-using AI agents
LLMbasedOS lets you run AI agents on your own machine with explicit, narrow permissions. Everything runs locally by default. No background monitoring. No hidden capabilities. You decide which tools exist, what they can see, and when they run.
Most people want help from AI without giving up privacy or control. LLMbasedOS provides a small, observable runtime that executes on your computer, inside containers, with clear boundaries. If a tool is not enabled, it does not exist. If a folder is not mounted, it is invisible.
Think of MCP as a toolbox with labeled drawers. Each drawer is a tool with a clear contract: a name, inputs, and outputs. An agent cannot invent new drawers. When you type a command, the agent asks the gateway to open a specific drawer. The gateway logs the request, forwards it to the tool process, and returns only the defined result. Since the container mounts only the paths you choose, a file tool can read or write only there. New powers appear only if you add and enable a new tool and mount extra paths on purpose.
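To make that concrete, here is a minimal sketch of what opening one drawer might look like on the wire. It assumes an MCP-style JSON-RPC exchange; the method name, the fs.read_file tool, and the file path are illustrative, not the exact contract shipped with LLMbasedOS.

import json

# Hypothetical request the agent hands to the gateway: one named tool, defined inputs.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "fs.read_file",                    # the "drawer": must be enabled to exist
        "arguments": {"path": "/data/notes.txt"},  # only mounted paths are reachable
    },
}
print(json.dumps(request, indent=2))

# The gateway logs the call, forwards it to the tool process, and returns only the defined result.
response = {"jsonrpc": "2.0", "id": 1, "result": {"content": "...file text..."}}
print(json.dumps(response, indent=2))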
Not included by default: email sender, host shell executor, screen or keyboard access, system installers, network scanners.
Download the release .zip and unzip it
Open a terminal in the unzipped folder
Start the stack:
docker compose -f docker-compose.pc.yml up -d --build
Pull a local model (first time only):
docker exec -it llmbasedos_ollama ollama pull gemma:2b
Open the interactive console:
docker exec -it llmbasedos_pc python -m llmbasedos_src.shell.luca
You should see a prompt like: / [default_session] luca>
Exit the console by typing exit
Stop the stack when you are done:
docker compose -f docker-compose.pc.yml down
You control the scope at the container boundary. The settings below are simple and effective.
In docker-compose.pc.yml, change ./data:/data:rw to ./data:/data:ro.
In .env, set LLM_PROVIDER_BACKEND=ollama and do not add cloud API keys.
Set network_mode: "none" on the paas service if you want a fully offline run after the model is pulled.
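If you want to verify the read-only mount, a small probe like the sketch below can be run inside the container (for example with docker exec -it llmbasedos_pc python). It assumes the /data path from the compose file; the probe filename is made up for the test.

from pathlib import Path

probe = Path("/data/.write_probe")
try:
    probe.write_text("probe")  # fails with a read-only file system error when /data is mounted :ro
    probe.unlink()
    print("WARNING: /data is writable; check the volume flags in docker-compose.pc.yml")
except OSError as err:
    print(f"/data is read-only as expected: {err}")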
Which tools can run? Only those defined in supervisord.conf. If a tool is not listed, it does not run.
Can it see my whole disk? No. It only sees what you mount. The demo mounts the bundled data folder.
Is it watching me in the background? No. There is no background monitoring. Agents act when you ask them to. Scheduled jobs are off by default and must be created explicitly.
Does my data leave the machine? Not unless you opt in to a cloud model or enable a network tool. The default setup uses a local model with no external API calls.
Can it run commands on my host? No. There is no host shell tool in the default setup. Adding such a tool would require an explicit change and is not recommended for sensitive machines.
How do I stop it? Use Docker Desktop or docker compose down. The stack stops and your computer is unchanged outside the mounted folder.
Questions or concerns: open an issue on GitHub or email [email protected].