Saiki

A lightweight runtime for creating and running AI agents that turn natural language into real-world actions.

Table of Contents

  1. Why Saiki?
  2. Installation
  3. Run Modes
  4. Quick Start
  5. Programmatic API
  6. Configuration
  7. Examples & Demos
  8. Capabilities
  9. LLM Providers
  10. Standalone MCP Manager
  11. CLI Reference
  12. Contributing
  13. Community & Support
  14. Contributors
  15. License

Why Saiki?

Saiki is the missing intelligence layer of your stack—perfect for building AI applications, standalone chatbots, or as the reasoning engine inside larger products.

The main Saiki features are:

| 💡 Feature | What it means for you |
| --- | --- |
| Powerful CLI and Web UI | Saiki ships with a powerful CLI and Web UI that let you run AI agents in your terminal and over the web. |
| Single runtime, many interfaces | Run the same agent via CLI, Web, Discord, Telegram, or a REST/WS server. |
| Model-agnostic | Hot-swap LLMs from OpenAI, Anthropic, Gemini, Groq, or local models. |
| Unified Tooling | Connect to remote tool servers (filesystem, browser, web-search) via the Model Context Protocol (MCP). |
| Config-driven | Define agent behavior (prompts, tools, model, memory) in version-controlled YAML. |
| Production-ready Core | Leverage a multi-session chat manager, typed API, pluggable storage, and robust logging. |
| Extensible | Ship your own MCP tool servers or plug in custom services with a few lines of config. |
| Multi-Agent Systems | Enable multi-agent collaboration via MCP and A2A. |

Installation

```bash
# NPM global
npm install -g @truffle-ai/saiki

# or build from source
git clone https://github.com/truffle-ai/saiki.git
cd saiki && npm i && npm run build && npm link
```
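
To confirm the install worked, check the version (the `-v`/`--version` flag is listed in the CLI reference below):

```bash
saiki --version
```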

Run Modes

| Mode | Command | Best for |
| --- | --- | --- |
| Interactive CLI | `saiki` | Everyday automation & quick tasks |
| Web UI | `saiki --mode web` | Friendly chat interface w/ image support |
| Headless Server | `saiki --mode server` | REST & WebSocket APIs for agent interaction |
| MCP Server (Agent) | `saiki --mode mcp` | Exposing your agent as a tool for others via stdio |
| MCP Server (Aggregator) | `saiki mcp --group-servers` | Re-exposing tools from multiple MCP servers via stdio |
| Discord Bot | `saiki --mode discord` | Community servers & channels (requires setup) |
| Telegram Bot | `saiki --mode telegram` | Mobile chat (requires setup) |

Run `saiki --help` for all flags, sub-commands, and environment variables.


Quick Start

Set your API keys first:

```bash
export OPENAI_API_KEY=your_openai_api_key_here
```

Then, give Saiki a multi-step task that combines different tools:

saiki "create a new snake game in html, css, and javascript, then open it in the browser"

Saiki will use its filesystem tools to write the code and its browser tools to open the index.html file—all from a single prompt.

Then start the Web UI:

```bash
saiki --mode web
```

The Web UI loads any previous conversations you've had, and also lets you experiment with different models and MCP servers.


Programmatic API

The SaikiAgent class is the core of the runtime. The following example shows its full lifecycle: initialization, running a single task, holding a conversation, and shutting down.

```typescript
import 'dotenv/config';
import { SaikiAgent, loadConfigFile } from '@truffle-ai/saiki';

const cfg = await loadConfigFile('./agents/agent.yml');
const agent = new SaikiAgent(cfg);
await agent.start();

// Single-shot task
console.log(await agent.run('List the 5 largest files in this repo'));

// Conversation
await agent.run('Write a haiku about TypeScript');
await agent.run('Make it funnier');

agent.resetConversation();
await agent.stop();
```

Everything in the CLI is powered by this same class—so whatever the CLI can do, your code can too.
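
As a quick illustration, here is a minimal terminal chat loop built only from the lifecycle calls shown above (the config path is a placeholder; adjust it to your setup):

```typescript
import 'dotenv/config';
import * as readline from 'node:readline/promises';
import { SaikiAgent, loadConfigFile } from '@truffle-ai/saiki';

const agent = new SaikiAgent(await loadConfigFile('./agents/agent.yml'));
await agent.start();

// Tiny REPL: each run() call continues the same conversation.
const rl = readline.createInterface({ input: process.stdin, output: process.stdout });
for (;;) {
  const prompt = await rl.question('> ');
  if (prompt === 'exit') break;
  console.log(await agent.run(prompt));
}

rl.close();
await agent.stop();
```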

Check out our TypeScript SDK docs for a complete guide.


Configuration

Agents are defined in version-controlled YAML. A minimal example:

```yaml
mcpServers:
  filesystem:
    type: stdio
    command: npx
    args: ['-y', '@modelcontextprotocol/server-filesystem', '.']
  puppeteer:
    type: stdio
    command: npx
    args: ['-y', '@truffle-ai/puppeteer-server']

llm:
  provider: openai
  model: gpt-4o
  apiKey: $OPENAI_API_KEY

systemPrompt: |
  You are Saiki, an expert coding assistant...
```

Change the file, reload the agent, and chat—the conversation state, memory, and tools will update.
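
For example, one way to pick up an edited config in application code is to re-load the file and swap in a fresh agent. This is a sketch using only the `loadConfigFile`, `start`, and `stop` calls from the previous section; the exact hot-reload behavior depends on your app:

```typescript
import { SaikiAgent, loadConfigFile } from '@truffle-ai/saiki';

// Re-read the edited YAML and swap in a fresh agent instance.
// Whether conversation state survives depends on your configured storage backend.
async function reloadAgent(current: SaikiAgent, configPath: string): Promise<SaikiAgent> {
  await current.stop();
  const next = new SaikiAgent(await loadConfigFile(configPath));
  await next.start();
  return next;
}
```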

Check out our Configuration guide for the complete reference.


Examples & Demos

🛒 Amazon Shopping Assistant

Task: Can you go to amazon and add some snacks to my cart? I like trail mix, cheetos and maybe surprise me with something else?

```bash
# Default agent has browser tools
saiki
```

📧 Send Email Summaries to Slack

Task: Summarize emails and send highlights to Slack

```bash
saiki --agent ./agents/examples/email_slack.yml
```

More ready-to-run recipes live in agents/examples and the docs site.


Capabilities

  • Dynamic LLM Switching: Change model, provider, or routing rules mid-conversation.
  • Streaming Responses: Opt-in to receive tokens as they arrive for real-time output.
  • Multi-Session Management: Create isolated, stateful chat sessions (think workspace tabs).
  • Pluggable Memory Backends: Use the in-memory default or connect your own DB via the StorageManager.
  • Lifecycle Event Bus: Subscribe to agent events for metrics, logging, or custom side-effects (see the sketch after this list).
  • Standalone MCP Manager: Use Saiki's core MCPManager in your own projects without the full agent.
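
A rough sketch of how the session and event capabilities above might look in application code. Note: the `createSession` and `agent.events.on` names here are illustrative assumptions, not confirmed Saiki signatures; consult the TypeScript SDK docs for the real API:

```typescript
import { SaikiAgent, loadConfigFile } from '@truffle-ai/saiki';

const agent = new SaikiAgent(await loadConfigFile('./agents/agent.yml'));
await agent.start();

// HYPOTHETICAL API: isolated, stateful sessions (think workspace tabs).
const support = await agent.createSession('support');
await support.run('Summarize the open tickets');

// HYPOTHETICAL API: subscribe to lifecycle events for metrics or logging.
agent.events.on('toolCall', (evt) => {
  console.log(`tool invoked: ${evt.name}`);
});

await agent.stop();
```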

LLM Providers

Saiki supports multiple LLM providers out-of-the-box, plus any OpenAI SDK-compatible provider.

  • OpenAI: gpt-4.1-mini, gpt-4o, o3, o1 and more
  • Anthropic: claude-4-sonnet-20250514, claude-3-7-sonnet-20250219, and more
  • Google: gemini-2.5-pro, gemini-2.0-flash and more
  • Groq: llama-3.3-70b-versatile, gemma-2-9b-it

Quick Setup

Set your API key and run. You can switch providers instantly via the -m flag.

```bash
# OpenAI (default)
export OPENAI_API_KEY=your_openai_api_key_here
export ANTHROPIC_API_KEY=your_anthropic_api_key_here
export GOOGLE_GENERATIVE_AI_API_KEY=your_google_gemini_api_key_here

saiki

# Switch providers via CLI
saiki -m claude-3.5-sonnet-20240620
saiki -m gemini-1.5-flash-latest
```

For comprehensive setup instructions, see our LLM Providers Guide.


Standalone MCP Manager

Need to manage MCP tool servers without the full agent? Use the MCPManager directly in your own applications.

```typescript
import { MCPManager } from '@truffle-ai/saiki';

// Create manager instance
const manager = new MCPManager();

// Connect to MCP servers
await manager.connectServer('filesystem', {
  type: 'stdio',
  command: 'npx',
  args: ['-y', '@modelcontextprotocol/server-filesystem', '.']
});

// Get all available tools across servers
const tools = await manager.getAllTools();
console.log('Available tools:', Object.keys(tools));

// Execute a tool
const result = await manager.executeTool('readFile', { path: './README.md' });
console.log('File contents:', result);

// Disconnect when done
await manager.disconnectAll();
```

See the MCP Manager Documentation for the complete API reference.


CLI Reference

Full CLI reference (`saiki --help`):
```text
Usage: saiki [options] [command] [prompt...]

The Saiki CLI allows you to talk to Saiki, build custom AI Agents, and create complex AI applications. For full documentation, visit https://github.com/truffle-ai/saiki.

Arguments:
  prompt                    Natural-language prompt to run once. If empty, starts interactive CLI.

Options:
  -v, --version             output the current version
  -a, --agent <path>        Path to agent config file (default: "agents/agent.yml")
  -s, --strict              Require all server connections to succeed
  --no-verbose              Disable verbose output
  -m, --model <model>       Specify the LLM model to use.
  -r, --router <router>     Specify the LLM router to use (vercel or in-built)
  --mode <mode>             Runtime mode: cli | web | server | discord | telegram | mcp (default: "cli")
  --web-port <port>         Optional port for the web UI (default: "3000")
  -h, --help                display help for command

Commands:
  create-app                Scaffold a new Saiki Typescript app.
  init-app                  Initialize an existing Typescript app with Saiki.
  mcp                       Run Saiki as an MCP server.
```


Contributing

We welcome contributions! Refer to our Contributing Guide for more details.

Community & Support

Saiki is built by the team at Truffle AI.
Join our Discord to share projects, ask questions, or just say hi!


If you enjoy Saiki, please give us a ⭐ on GitHub—it helps a lot!



Contributors

Thanks to all these amazing people for contributing to Saiki!



License

Elastic License 2.0. See LICENSE for full terms.
