InfraNodus
STDIO / HTTP-SSE
A Model Context Protocol (MCP) server that integrates InfraNodus knowledge graph and text network analysis capabilities into LLM workflows and AI assistants like Claude Desktop.
InfraNodus MCP Server enables LLM workflows and AI assistants to analyze text using advanced network science algorithms, generate knowledge graphs, detect content gaps, and identify key topics and concepts. It transforms unstructured text into structured insights using graph theory and network analysis.

generate_knowledge_graph
analyze_existing_graph_by_name
generate_content_gaps
generate_topical_clusters
generate_contextual_hint
generate_research_questions
generate_research_ideas
research_questions_from_graph
generate_responses_from_graph
develop_conceptual_bridges
develop_latent_topics
develop_text_tool
create_knowledge_graph
overlap_between_texts
difference_between_texts
analyze_google_search_results
analyze_related_search_queries
search_queries_vs_search_results
generate_seo_report
memory_add_relations
memory_get_relations
search
fetch
More capabilities coming soon!
InfraNodus represents any text as a network graph in order to identify the main clusters of ideas and gaps between them. This helps generate advanced insights based on the text's structure. The network is effectively a knowledge graph that can also be used to retrieve complex ontological relations between different entities and concepts. This process is automated in InfraNodus using the search and fetch tools along with the other tools that analyze the underlying network.
However, you can also easily use InfraNodus as a more traditional memory server to save and retrieve relations. We use [[wikilinks]] to highlight entities in your text to make your content and graphs compatible with markup syntax and PKM tools such as Obsidian. By default, InfraNodus will generate the name of the memory graph for you based on the context of the conversation. However, you can modify this default behavior by adding a system prompt or project instruction into your LLM client.
Specifically, you can tell it to always use a specific knowledge graph for memories, so everything is stored in one place:
Save all memories in the `my-memories` graph in InfraNodus.
Or you can ask InfraNodus to only save certain entities, e.g. for building social networks:
When generating entities, only extract people, companies, and organizations. Ignore everything else.
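To illustrate the wikilink convention: entities in a saved statement are wrapped in double brackets, e.g. `I met [[Alice]] at [[Acme Corp]]`. Below is a minimal TypeScript sketch of how such markup can be parsed; this is illustrative only (the actual extraction is performed by InfraNodus, and `extractEntities` is a hypothetical helper, not part of the server):

```typescript
// Extract [[wikilink]]-marked entities from a statement.
// Illustrative sketch: InfraNodus performs the real extraction server-side.
function extractEntities(statement: string): string[] {
  return Array.from(statement.matchAll(/\[\[([^\]]+)\]\]/g), (m) => m[1]);
}

const entities = extractEntities("I met [[Alice]] at [[Acme Corp]] in [[Berlin]]");
// entities: ["Alice", "Acme Corp", "Berlin"]
```

Because the same syntax is used by Obsidian and other PKM tools, statements saved this way stay portable across your notes and your InfraNodus graphs.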
The easiest and fastest way to launch the InfraNodus MCP server is to use the external provider, Smithery: simply copy and paste the settings into the tool of your choice (e.g. Claude, Cursor, or ChatGPT).
You can also install the server locally for more control. In this case, you can edit the source files and even create your own tools based on the InfraNodus API.
Below we describe the two different ways to set up your InfraNodus MCP server.
Once you add the URL above to your tool, it will automatically prompt you to authenticate with Smithery (via OAuth) in order to access the InfraNodus MCP server hosted there.
If your client does not support OAuth, you can click the link *Get the URL with keys instead*, which you can use to authenticate without OAuth.
If you use the URL with keys, either Smithery or you yourself will add something like this to your MCP configuration file:
```json
// e.g. Cursor accesses the server directly via Smithery
"mcpServers": {
  "mcp-server-infranodus": {
    "type": "http",
    "url": "https://server.smithery.ai/@infranodus/mcp-server-infranodus/mcp?api_key=YOUR_SMITHERY_KEY&profile=YOUR_SMITHERY_PROFILE",
    "headers": {}
  }
}
```
```json
// Claude uses a slightly different implementation:
// it launches the MCP server on your local machine
"mcpServers": {
  "mcp-server-infranodus": {
    "command": "npx",
    "args": [
      "-y",
      "@smithery/cli@latest",
      "run",
      "@infranodus/mcp-server-infranodus",
      "--key",
      "YOUR_SMITHERY_KEY",
      "--profile",
      "YOUR_SMITHERY_PROFILE"
    ]
  }
}
```
Note: in both cases, you'll automatically get the YOUR_SMITHERY_KEY and YOUR_SMITHERY_PROFILE values from Smithery when you copy the URL with credentials. These are not your InfraNodus API keys. You can use the InfraNodus MCP server without an InfraNodus API key for the first 70 calls. After that, you can add your API key to your Smithery profile, and it will automatically connect to your account using the link above.
To use InfraNodus, see the tools available and simply call them through the chat interface (e.g. "show me the graphs where I talk about this topic" or "get the content gaps from the document I uploaded").
If your client is not using InfraNodus for some actions, add the instruction to use InfraNodus explicitly.
You can deploy the InfraNodus server manually via npx, a tool that lets you execute local and remote Node.js packages on your computer.
The InfraNodus MCP server is also available as an npm package at https://www.npmjs.com/package/infranodus-mcp-server, from where you can launch it on your local computer with npx. It will expose its tools to the MCP client that uses this command to launch the server.
Just add this in your Claude's configuration file (Settings > Developer > Edit Config), inside the "mcpServers" object where the different servers are listed:
```json
"infranodus": {
  "command": "npx",
  "args": ["-y", "infranodus-mcp-server"],
  "env": {
    "INFRANODUS_API_KEY": "YOUR_INFRANODUS_API_KEY"
  }
},
```
Clone and build the server:
```bash
git clone https://github.com/yourusername/mcp-server-infranodus.git
cd mcp-server-infranodus
npm install
npm run build:inspect
```
Note that build:inspect will generate the dist/index.js file which you will then use in your server setup. The standard npm run build command will only build a Smithery file.
Set up your API key:
Create a .env file in the project root:
INFRANODUS_API_KEY=your-api-key-here
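The server reads this key from the environment at startup. As a hedged sketch (illustrative, not the server's actual source; `resolveApiKey` is a hypothetical helper), the lookup in Node.js amounts to:

```typescript
// Resolve the InfraNodus API key from an environment map.
// An empty or missing value still allows a limited number of free calls.
function resolveApiKey(env: Record<string, string | undefined>): string {
  return env.INFRANODUS_API_KEY ?? "";
}

const apiKey = resolveApiKey(process.env);
```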
Inspect the MCP:
npm run inspect
Open your Claude Desktop configuration file:
open ~/Library/Application\ Support/Claude/claude_desktop_config.json
Add the InfraNodus server configuration:
a. remote launch via npx:
```json
"infranodus": {
  "command": "npx",
  "args": ["-y", "infranodus-mcp-server"],
  "env": {
    "INFRANODUS_API_KEY": "YOUR_INFRANODUS_API_KEY"
  }
},
```
b. launch this repo with node:
```json
{
  "mcpServers": {
    "infranodus": {
      "command": "node",
      "args": ["/absolute/path/to/mcp-server-infranodus/dist/index.js"],
      "env": {
        "INFRANODUS_API_KEY": "your-api-key-here"
      }
    }
  }
}
```
Note: you can leave INFRANODUS_API_KEY empty, in which case you can make 70 free requests; after that you will hit the quota and will need to add your API key.
Open your Claude Desktop configuration file:
%APPDATA%\Claude\claude_desktop_config.json
Add the InfraNodus server configuration:
a. remote launch via npx:
```json
"infranodus": {
  "command": "npx",
  "args": ["-y", "infranodus-mcp-server"],
  "env": {
    "INFRANODUS_API_KEY": "YOUR_INFRANODUS_API_KEY"
  }
},
```
b. launch this repo with node:
```json
{
  "mcpServers": {
    "infranodus": {
      "command": "node",
      "args": ["C:\\path\\to\\mcp-server-infranodus\\dist\\index.js"],
      "env": {
        "INFRANODUS_API_KEY": "your-api-key-here"
      }
    }
  }
}
```
For other applications supporting MCP, use the following command to start the server via npx:
```bash
INFRANODUS_API_KEY=your-api-key npx -y infranodus-mcp-server
```

or locally:

```bash
INFRANODUS_API_KEY=your-api-key node /path/to/mcp-server-infranodus/dist/index.js
```
The server communicates via stdio, so configure your application to run this command and communicate through standard input/output.
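To give a sense of what "communicate through standard input/output" means here: MCP stdio transports exchange JSON-RPC 2.0 messages, one JSON object per line. A hedged sketch of the initialize request a client might write to the server's stdin (the field values here are illustrative, not prescriptive):

```typescript
// Sketch of an MCP JSON-RPC 2.0 "initialize" request sent over stdio.
// protocolVersion and clientInfo values are illustrative assumptions.
const initializeRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "initialize",
  params: {
    protocolVersion: "2024-11-05", // example protocol revision
    capabilities: {},
    clientInfo: { name: "my-client", version: "0.1.0" },
  },
};

// Each message is serialized as a single line of JSON on stdin/stdout.
const wireMessage = JSON.stringify(initializeRequest) + "\n";
```

In practice your MCP client library builds and parses these messages for you; the sketch only shows why the launch command above is all the configuration the transport needs.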
Once installed, you can ask Claude to use any of the tools listed above through the chat interface.

For development, run the server in dev mode:

```bash
npm run dev
```

Test the server with the MCP Inspector:

```bash
npm run build:inspect
npm run inspect
```

Build the server:

```bash
npm run build
```

Rebuild automatically on changes:

```bash
npm run watch
```
Analyzes text and generates a knowledge graph.

Parameters:

text (string, required): The text to analyze
includeStatements (boolean): Include original statements in response
modifyAnalyzedText (string): Text modification options ("none", "entities", "lemmatize")

Retrieves and analyzes an existing graph from your InfraNodus account.

Parameters:

graphName (string, required): Name of the existing graph
includeStatements (boolean): Include statements in response
includeGraphSummary (boolean): Include graph summary

Identifies content gaps and missing connections in text.

Parameters:

text (string, required): The text to analyze for gaps

For long-running operations (like SEO analysis), the MCP server supports real-time progress notifications that provide intermediary feedback to AI agents. This allows agents to:
The server implements MCP progress notifications using:
```typescript
import { ProgressReporter } from "../utils/progress.js";
import { ToolHandlerContext } from "../types/index.js";

handler: async (params: ParamType, context: ToolHandlerContext = {}) => {
  const progress = new ProgressReporter(context);

  await progress.report(25, "Fetching data from API...");
  // Do work
  await progress.report(75, "Analyzing results...");
  // More work
  await progress.report(100, "Complete!");

  return results;
};
```
The generate_seo_report tool demonstrates this pattern with 6 major progress checkpoints that provide detailed status updates throughout the multi-step analysis process.
```bash
# Clean install
rm -rf node_modules package-lock.json
npm install
npm run build
```
MIT
For issues related to: