
Bee
MCP server for accessing and managing Bee wearable lifelogging data
Unofficial Model Context Protocol (MCP) server for interacting with your Bee wearable lifelogging data. More context on my blog.
This server acts as a bridge, allowing Large Language Models (LLMs) like Claude or custom AI agents to access and interact with your personal data stored in Bee, including conversations, facts, to-dos, and location history.
Disclaimer: This is an unofficial project and is not affiliated with Bee. Use it at your own risk. Ensure you understand the security implications of granting AI access to your personal data via the API key.
Bee.computer helps you capture moments from your life (conversations, places visited, notes). `beemcp` makes this data available to your AI assistant through the Model Context Protocol, so you can ask your AI questions about your conversations, facts, to-dos, and location history. The AI, using `beemcp`, can securely fetch or modify this information from your Bee.computer account.
You can install and run `beemcp` using `uv` (recommended) or `pip`.

`uv` is a fast Python package installer and resolver. If you have `uv` installed, you don't need to install `beemcp` separately. You can run it directly using `uvx`:
```shell
# Example of running directly (requires API key configured, see below)
uvx beemcp
```
Alternatively, you can install `beemcp` using `pip`:

```shell
pip install beemcp
```
After installation, you can run it as a Python module:

```shell
python -m beemcp.beemcp
```

Or, if the entry point is correctly added to your system's PATH during installation, you might be able to run it directly:

```shell
beemcp
```
`beemcp` requires your Bee API key to function. Never share this key publicly or commit it to version control. Get your API key from the Bee developer website.
If running in Claude or another MCP client, you will likely provide the `BEE_API_TOKEN` environment variable in the client's configuration.

If running directly from the command line, provide the key using a `.env` file in the directory where you run the `beemcp` server:

1. Create a file named `.env` in the same directory you intend to run `beemcp` from.
2. Add the following line to the `.env` file, replacing `your_actual_bee_api_key_here` with your real key:

```shell
BEE_API_TOKEN="your_actual_bee_api_key_here"
```
Alternatively, you can set the `BEE_API_TOKEN` environment variable directly in your system or shell:

```shell
export BEE_API_TOKEN="your_actual_bee_api_key_here"
# Now run the server in the same shell session
uvx beemcp
```
The server will exit with an error if the `BEE_API_TOKEN` is not found.
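As a rough illustration of that startup behavior (a sketch only, not the actual `beemcp` source; the function name `require_token` is invented here):

```python
import os
import sys

def require_token() -> str:
    """Return the Bee API key from the environment, or exit if missing."""
    token = os.environ.get("BEE_API_TOKEN")
    if not token:
        # Mirrors the documented behavior: fail fast with a clear error.
        sys.exit("Error: BEE_API_TOKEN not set. See the configuration section.")
    return token
```

Loading the key from a `.env` file first (as described above) simply populates the same environment variable before this check runs.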
You need to tell your LLM client (like Claude.app or Zed) how to start and communicate with the `beemcp` server.

Add the following to your Claude settings (`settings.json`):
Using uvx (Recommended):

```json
"mcpServers": {
  "beemcp": {
    "command": "uvx",
    "args": ["beemcp"],
    "env": {"BEE_API_TOKEN": "<YOUR API KEY HERE>"}
  }
}
```

Using a pip installation:

```json
"mcpServers": {
  "bee": {
    "command": "python",
    "args": ["-m", "beemcp.beemcp"],
    "env": {"BEE_API_TOKEN": "<YOUR API KEY HERE>"}
  }
}
```
If you go to the Settings window in Claude Desktop and open the Developer tab, you should see the beemcp server listed as running.
Add the following to your Zed `settings.json`:

Using uvx:

```json
"context_servers": [
  {
    "name": "beemcp",
    "command": "uvx",
    "args": ["beemcp"],
    "env": {"BEE_API_TOKEN": "<YOUR API KEY HERE>"}
  }
],
```

Using a pip installation:

```json
"context_servers": [
  {
    "name": "beemcp",
    "command": "python",
    "args": ["-m", "beemcp.beemcp"],
    "env": {"BEE_API_TOKEN": "<YOUR API KEY HERE>"}
  }
],
```
These are the actions the LLM can request from `beemcp`.

- `list-all-conversations`
- `get-conversation`
  - `id` (integer): The ID of the conversation.
- `list-all-user-facts`
- `get-user-fact`
  - `id` (integer): The ID of the fact.
- `record-user-fact`
  - `text` (string): The content of the fact.
- `update-user-fact`
  - `id` (integer): The ID of the fact to update.
  - `text` (string): The new content for the fact.
  - `confirmed` (boolean): Whether the user has confirmed this fact.
- `confirm-user-fact`
  - `id` (integer): The ID of the fact to confirm/unconfirm.
  - `confirmed` (boolean): The new confirmation status.
- `delete-user-fact`
  - `id` (integer): The ID of the fact to delete.
- `list-all-todos`
- `list-incomplete-todos`
- `create-todo`: Set `alarm_at` to an ISO 8601 formatted date-time string if the todo has a specific deadline or reminder time.
  - `text` (string): The content of the todo.
  - `alarm_at` (string, optional): ISO 8601 datetime string (e.g., "2024-12-31T23:59:00Z").
- `update-todo`
  - `id` (integer): The ID of the todo to update.
  - `text` (string, optional): New text for the todo.
  - `completed` (boolean, optional): New completion status.
  - `alarm_at` (string, optional): New ISO 8601 alarm time.
- `delete-todo`
  - `id` (integer): The ID of the todo to delete.
- `mark-todo-completed`
  - `id` (integer): The ID of the todo to mark as complete.

Some of these convenience functions are redundant, but they are included to make usage with current large language models more practical.
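The `alarm_at` parameter (and the `start_time`/`end_time` parameters of the location tools below) expects ISO 8601 date-time strings in UTC. A small Python sketch of producing one (the helper name `iso8601_utc` is my own, not part of beemcp):

```python
from datetime import datetime, timezone

def iso8601_utc(dt: datetime) -> str:
    """Format an aware datetime as an ISO 8601 UTC string, e.g. 2024-12-31T23:59:00Z."""
    return dt.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")

# Example: an alarm at the end of 2024, already in UTC
alarm = iso8601_utc(datetime(2024, 12, 31, 23, 59, tzinfo=timezone.utc))
```

Passing an aware datetime in a local timezone also works, since `astimezone` converts it to UTC before formatting.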
- `list-all-locations`
- `get-locations-today`
- `get-locations-week`
- `get-locations-month`
- `get-locations-by-time`: `start_time` and `end_time` should be ISO 8601 formatted date-time strings, in this format: `2025-12-31T00:00:00Z`
  - `start_time` (string, optional): Start time in ISO 8601 format.
  - `end_time` (string, optional): End time in ISO 8601 format.

MCP Resources provide direct access to data, often used for context or caching by the LLM client. Many LLM clients do not support resources very well, so the "Tools" listed above are provided even when they may be redundant.
- `bee://conversations`: List summaries of all conversations.
- `bee://conversations/{id}`: Get full details for a specific conversation.
- `bee://facts`: List summaries of all confirmed facts.
- `bee://facts/{id}`: Get full details for a specific fact.
- `bee://todos`: List summaries of all todos.
- `bee://todos/incomplete`: List summaries of incomplete todos.
- `bee://todos/{id}`: Get full details for a specific todo.
- `bee://locations`: List summaries of all recorded locations (combined sequentially).
- `bee://locations/today`: List locations from the last 24 hours.
- `bee://locations/week`: List locations from the last 7 days.
- `bee://locations/month`: List locations from the last 30 days.
You can use the MCP inspector tool (`@modelcontextprotocol/inspector`) to interact with and debug the `beemcp` server directly.

If you installed using `uv` and are running with `uvx`:
```shell
npx @modelcontextprotocol/inspector uvx beemcp
```
If you installed using `pip`:

```shell
npx @modelcontextprotocol/inspector python -m beemcp.beemcp
```
If you are developing locally within the project directory:
```shell
# Assuming you are in the root directory of the beemcp project
npx @modelcontextprotocol/inspector python -m beemcp.beemcp
# Or if using uv for development
npx @modelcontextprotocol/inspector uv run beemcp.beemcp
```
Try asking your AI assistant questions that exercise `beemcp`, for example asking it to record something about you (`record-user-fact`) or to revise a stored fact (`update-user-fact`).

`beemcp` is licensed under the MIT License. You are free to use, modify, and distribute this software under the terms of the license. See the LICENSE file (or the standard MIT license text) for details.