Deepseek Thinker MCP Server
by ruixingshi
The Deepseek Thinker MCP Server provides Deepseek reasoning content to MCP-enabled AI Clients, such as Claude Desktop. It supports accessing Deepseek's thought processes from the Deepseek API service or a local Ollama server.
What is Deepseek Thinker MCP Server?
The Deepseek Thinker MCP Server is a Model Context Protocol (MCP) provider that allows AI clients to access Deepseek's reasoning capabilities. It acts as a bridge between Deepseek's reasoning engine (either via API or local Ollama server) and AI clients like Claude Desktop, enabling them to leverage Deepseek's thought processes.
How to use Deepseek Thinker MCP Server?
To use the server, configure your AI client (e.g., Claude Desktop) to connect to the MCP server. This involves adding a configuration block specifying the command to run the server, any necessary arguments, and environment variables for API keys or Ollama mode. The README provides example configurations for both OpenAI API mode and Ollama mode.
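Such a configuration block follows the standard Claude Desktop `claude_desktop_config.json` layout. A sketch for OpenAI API mode (the package name `deepseek-thinker-mcp` and the placeholder values are illustrative, not confirmed by this page):

```json
{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "npx",
      "args": ["-y", "deepseek-thinker-mcp"],
      "env": {
        "API_KEY": "<your-api-key>",
        "BASE_URL": "<your-api-base-url>"
      }
    }
  }
}
```

For Ollama mode, the `env` block would instead set `USE_OLLAMA` rather than API credentials.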
Key features of Deepseek Thinker MCP Server
Dual Mode Support (OpenAI API and Ollama)
Focused Reasoning (captures Deepseek's thinking process)
Provides structured reasoning output
Integration with MCP-enabled AI Clients
Use cases of Deepseek Thinker MCP Server
Enhancing AI client reasoning capabilities
Accessing Deepseek's thought processes
Integrating Deepseek's reasoning into AI workflows
Using Deepseek as a reasoning engine for AI applications
FAQ from Deepseek Thinker MCP Server
What does 'MCP error -32001: Request timed out' mean?
This error occurs when the Deepseek API responds too slowly, or when the reasoning content is long enough that generating it exceeds the MCP server's request timeout.
How do I configure the server to use the OpenAI API?
Set the environment variables API_KEY and BASE_URL with your OpenAI API key and base URL, respectively.
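If you launch the server from a shell rather than through a client's config file, the same variables can be exported beforehand (both values below are placeholders, not real credentials or endpoints):

```shell
# API_KEY and BASE_URL are read by the server at startup; replace the placeholders.
export API_KEY="<your-api-key>"
export BASE_URL="<your-openai-compatible-base-url>"
```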
How do I configure the server to use Ollama?
Set the environment variable USE_OLLAMA to 'true'.
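In a Claude Desktop-style config, that amounts to setting the variable in the server's `env` block (the server name and launch command shown are illustrative):

```json
{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "npx",
      "args": ["-y", "deepseek-thinker-mcp"],
      "env": {
        "USE_OLLAMA": "true"
      }
    }
  }
}
```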
What is the purpose of the 'get-deepseek-thinker' tool?
The 'get-deepseek-thinker' tool performs reasoning using the Deepseek model based on the provided originPrompt and returns a structured text response containing the reasoning process.
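Over the wire, an MCP client would invoke the tool with a standard `tools/call` JSON-RPC request; a sketch (the prompt text is illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get-deepseek-thinker",
    "arguments": {
      "originPrompt": "Why does binary search run in O(log n) time?"
    }
  }
}
```

The response carries the model's reasoning process as structured text content.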
What is the tech stack used to build this server?
The server is built using TypeScript, @modelcontextprotocol/sdk, OpenAI API, Ollama, and Zod (for parameter validation).