
DuckDuckGo Search MCP Server

by nickclyde

A Model Context Protocol (MCP) server that provides web search through DuckDuckGo, along with content fetching and parsing tools whose output is formatted for LLM consumption.


What is DuckDuckGo Search MCP Server?

This is a Model Context Protocol (MCP) server that allows you to perform web searches using DuckDuckGo. It also provides tools to fetch and parse content from webpages, making it easier to integrate web data into applications, especially those using Large Language Models (LLMs).

How to use DuckDuckGo Search MCP Server?

The server can be installed via Smithery or directly with uv. Once installed, it can be integrated with MCP clients such as Claude Desktop by pointing the client's configuration at the server's launch command (see the sketch below). The server exposes tools for web search and content fetching, which can also be invoked programmatically from any MCP client.
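For Claude Desktop, the integration typically amounts to one entry in claude_desktop_config.json. The snippet below is a minimal sketch that assumes the server is published as the duckduckgo-mcp-server package and is launched with uvx; the exact package name and command should be taken from the project's README.

```json
{
  "mcpServers": {
    "ddg-search": {
      "command": "uvx",
      "args": ["duckduckgo-mcp-server"]
    }
  }
}
```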

Key features of DuckDuckGo Search MCP Server

  • Web Search with DuckDuckGo

  • Content Fetching and Parsing

  • Rate Limiting

  • Error Handling and Logging

  • LLM-Friendly Output

Use cases of DuckDuckGo Search MCP Server

  • Providing real-time information to LLMs

  • Augmenting LLM responses with web search results

  • Automated content extraction from websites

  • Building search-enabled applications

  • Integrating web data into conversational AI agents

FAQ from DuckDuckGo Search MCP Server

What is MCP?

MCP stands for Model Context Protocol, a standard for connecting tools and services to Large Language Models.

How does rate limiting work?

The server implements rate limiting to avoid being blocked by DuckDuckGo. It limits search requests to 30 per minute and content fetching to 20 per minute, with automatic queue management.
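To illustrate, a per-tool limit like this can be enforced with a small sliding-window limiter that delays calls once the window is full. The sketch below is illustrative only; the class name and internals are assumptions, not the server's actual code.

```python
import asyncio
import time
from collections import deque


class RateLimiter:
    """Allow at most `max_calls` calls per `period` seconds (sliding window)."""

    def __init__(self, max_calls: int, period: float = 60.0):
        self.max_calls = max_calls
        self.period = period
        self.calls: deque[float] = deque()

    async def acquire(self) -> None:
        now = time.monotonic()
        # Drop timestamps that have fallen outside the window.
        while self.calls and now - self.calls[0] > self.period:
            self.calls.popleft()
        if len(self.calls) >= self.max_calls:
            # Wait until the oldest timestamp leaves the window, then free its slot.
            await asyncio.sleep(self.period - (now - self.calls[0]))
            self.calls.popleft()
        self.calls.append(time.monotonic())


# Separate limits per tool, matching the figures above (names are illustrative).
search_limiter = RateLimiter(max_calls=30)  # 30 searches per minute
fetch_limiter = RateLimiter(max_calls=20)   # 20 fetches per minute
```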

What kind of content is fetched?

The content fetching tool retrieves and parses the main text content from a webpage, removing ads and irrelevant elements.
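In practice, "main text" extraction usually means downloading the page and discarding markup that rarely carries article content. The sketch below shows one common approach using httpx and BeautifulSoup; the function name, tag list, and character limit are assumptions for illustration and may differ from the server's implementation.

```python
import httpx
from bs4 import BeautifulSoup


async def fetch_main_text(url: str, max_chars: int = 8000) -> str:
    """Fetch a page and return its readable text, stripping scripts, styles, and page chrome."""
    async with httpx.AsyncClient(follow_redirects=True, timeout=30.0) as client:
        response = await client.get(url)
        response.raise_for_status()

    soup = BeautifulSoup(response.text, "html.parser")
    # Remove elements that rarely carry the page's main content.
    for tag in soup(["script", "style", "nav", "header", "footer", "aside", "iframe"]):
        tag.decompose()

    # Collapse whitespace and truncate to keep the result prompt-friendly.
    text = " ".join(soup.get_text(separator=" ").split())
    return text[:max_chars]
```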

How is the output formatted for LLMs?

The search results and fetched content are cleaned, formatted, and truncated to be easily consumed by Large Language Models.
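A typical approach is to flatten each result into a short, numbered plain-text block and cap the number of results so the whole thing fits comfortably in a prompt. The sketch below is illustrative; the field names (title, link, snippet) are assumptions about the result shape rather than the server's actual schema.

```python
def format_results_for_llm(results: list[dict], max_results: int = 10) -> str:
    """Render search results as a compact, numbered plain-text list."""
    if not results:
        return "No results were found for this query."

    shown = results[:max_results]
    lines = [f"Found {len(shown)} search results:", ""]
    for i, result in enumerate(shown, start=1):
        lines.append(f"{i}. {result.get('title', '').strip()}")
        lines.append(f"   URL: {result.get('link', '')}")
        lines.append(f"   Summary: {result.get('snippet', '').strip()}")
        lines.append("")
    return "\n".join(lines)
```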

What license is this project under?

This project is licensed under the MIT License.