
Website Downloader

MCP server for downloading entire websites with structure preservation and local link conversion.
This MCP server provides a tool to download entire websites using wget. It preserves the website structure and converts links to work locally.
The server requires wget to be installed on your system.
Using Homebrew (macOS):

brew install wget

Using apt (Debian/Ubuntu):

sudo apt-get update
sudo apt-get install wget

Using dnf (Fedora):

sudo dnf install wget

Using Chocolatey (Windows):

choco install wget
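Whichever package manager you use, you can confirm afterwards that wget is on your PATH:

```shell
# Print the installed wget version; a "command not found" error
# means wget is not available and the server's tool will fail.
wget --version | head -n1
```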
The server provides a tool called download_website with the following parameters:
url (required): The URL of the website to download.
outputPath (optional): The directory where the website should be downloaded. Defaults to the current directory.
depth (optional): Maximum depth level for recursive downloading. Defaults to infinite. Set to 0 for just the specified page, 1 for direct links, and so on.

Example arguments:

{
  "url": "https://example.com",
  "outputPath": "/path/to/output",
  "depth": 2
}
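The README does not show exactly which wget flags the server passes, but the described behavior (structure preservation, local link conversion, a depth limit) matches wget's standard mirroring options. The sketch below is an assumption of how the tool's parameters could map onto a wget argument list; `buildWgetArgs` and `DownloadOptions` are hypothetical names, not part of the server's actual code.

```typescript
// Hypothetical sketch: mapping download_website parameters onto wget
// arguments. The flag set is assumed from wget's documented mirroring
// options, not taken from the server's source.
interface DownloadOptions {
  url: string;          // required: site to download
  outputPath?: string;  // optional: defaults to the current directory
  depth?: number;       // optional: defaults to infinite recursion
}

function buildWgetArgs(opts: DownloadOptions): string[] {
  const args = [
    "--recursive",        // follow links on each downloaded page
    "--page-requisites",  // also fetch CSS, images, etc. pages need
    "--convert-links",    // rewrite links so the copy browses locally
    "--no-parent",        // do not ascend above the starting path
  ];
  // wget's --level flag caps recursion depth; "inf" means unlimited,
  // matching the tool's "defaults to infinite" behavior.
  args.push(`--level=${opts.depth ?? "inf"}`);
  args.push(`--directory-prefix=${opts.outputPath ?? "."}`);
  args.push(opts.url);
  return args;
}
```

With the example arguments above, this would produce roughly `wget --recursive --page-requisites --convert-links --no-parent --level=2 --directory-prefix=/path/to/output https://example.com`.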
The website downloader preserves the site's directory structure and converts links so the downloaded copy can be browsed locally.
To build the server:

npm install
npm run build
Then add the server to your MCP client configuration:

{
  "mcpServers": {
    "website-downloader": {
      "command": "node",
      "args": ["/path/to/website-downloader/build/index.js"]
    }
  }
}