Deepseek R1 MCP Server
A Model Context Protocol (MCP) server implementation connecting Claude Desktop with DeepSeek's language models (R1/V3)
Overview
What is MCP-server-Deepseek_R1?
MCP-server-Deepseek_R1 is a Model Context Protocol (MCP) server implementation that connects Claude Desktop with DeepSeek's language models, specifically the R1 and V3 versions. The server relays requests between Claude Desktop and DeepSeek's API, exposing DeepSeek's reasoning and chat capabilities as a tool inside the Claude Desktop interface.
Features of MCP-server-Deepseek_R1
- Integration with Claude Desktop: The server is designed to work directly with Claude Desktop, providing a user-friendly interface for accessing DeepSeek's language models.
- Support for Multiple Models: It supports both R1 and V3 versions of DeepSeek's language models, offering flexibility and options for users.
- Open Source: The project is publicly available, allowing developers to contribute, modify, and enhance the server according to their needs.
- Active Community: With a growing number of stars and forks, the project has an active community that contributes to its development and improvement.
How to Use MCP-server-Deepseek_R1
- Installation: Clone the repository from GitHub using the command:
git clone https://github.com/66julienmartin/MCP-server-Deepseek_R1.git
- Setup: Follow the instructions in the README file to set up the server environment and dependencies.
- Running the Server: Start the server using the provided scripts or commands, ensuring that all configurations are set correctly for your environment.
- Connecting to Claude Desktop: Once the server is running, connect it with Claude Desktop to begin utilizing the language models for various applications.
Frequently Asked Questions
What is the purpose of MCP in this context?
The Model Context Protocol (MCP) serves as a communication framework that allows different applications to interact with language models effectively, ensuring that the context is maintained throughout the interaction.
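MCP messages are framed as JSON-RPC 2.0. As a rough sketch (field values are illustrative), a tool invocation sent from Claude Desktop to this server looks like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "deepseek_r1",
    "arguments": { "prompt": "Hello" }
  }
}
```

The server answers with a matching JSON-RPC response containing the model's output, which Claude Desktop then renders in the conversation.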
Is MCP-server-Deepseek_R1 suitable for production use?
Yes, the server is designed for both development and production environments, but users should thoroughly test it in their specific use cases to ensure stability and performance.
How can I contribute to the project?
You can contribute by forking the repository, making your changes, and submitting a pull request. Additionally, reporting issues or suggesting features is highly encouraged.
What are the system requirements for running MCP-server-Deepseek_R1?
The system requirements may vary based on the specific configurations and models used. Generally, a modern server with adequate RAM and processing power is recommended to handle the language processing tasks efficiently.
Where can I find more information about the project?
For more details, documentation, and updates, you can visit the GitHub repository.
Details
Deepseek R1 MCP Server
A Model Context Protocol (MCP) server implementation for the Deepseek R1 language model. Deepseek R1 is a powerful language model optimized for reasoning tasks with a context window of 8192 tokens.
Why Node.js? This implementation uses Node.js/TypeScript as it provides the most stable integration with MCP servers. The Node.js SDK offers better type safety, error handling, and compatibility with Claude Desktop.
<a href="https://glama.ai/mcp/servers/qui5thpyvu"><img width="380" height="200" src="https://glama.ai/mcp/servers/qui5thpyvu/badge" alt="Deepseek R1 Server MCP server" /></a>
Quick Start
Installing manually
### Clone and install
git clone https://github.com/66julienmartin/MCP-server-Deepseek_R1.git
cd deepseek-r1-mcp
npm install
### Set up environment
cp .env.example .env # Then add your API key
### Build and run
npm run build
Prerequisites
- Node.js (v18 or higher)
- npm
- Claude Desktop
- Deepseek API key
Model Selection
By default, this server uses the DeepSeek-R1 model. To use DeepSeek-V3 instead, modify the model name in src/index.ts:
// For DeepSeek-R1 (default)
model: "deepseek-reasoner"
// For DeepSeek-V3
model: "deepseek-chat"
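A hypothetical alternative to editing src/index.ts is to map a short name to the model identifier at startup; the helper below is a sketch (the function and the "r1"/"v3" keys are illustrative, not part of the actual source), but the two model identifiers are the documented ones.

```typescript
// Map short names to DeepSeek model identifiers.
const MODELS: Record<string, string> = {
  r1: "deepseek-reasoner", // DeepSeek-R1 (default)
  v3: "deepseek-chat",     // DeepSeek-V3
};

function resolveModel(name?: string): string {
  // Fall back to DeepSeek-R1 when the name is missing or unknown.
  return MODELS[name ?? "r1"] ?? MODELS["r1"];
}
```

With this in place, the model could be chosen from configuration rather than by recompiling.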
Project Structure
deepseek-r1-mcp/
├── src/
│   └── index.ts        # Main server implementation
├── build/              # Compiled files
│   └── index.js
├── LICENSE
├── README.md
├── package.json
├── package-lock.json
└── tsconfig.json
Configuration
- Create a .env file:
DEEPSEEK_API_KEY=your-api-key-here
- Update Claude Desktop configuration:
{
"mcpServers": {
"deepseek_r1": {
"command": "node",
"args": ["/path/to/deepseek-r1-mcp/build/index.js"],
"env": {
"DEEPSEEK_API_KEY": "your-api-key"
}
}
}
}
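Whichever way the key is supplied, the server needs it before it can reach the DeepSeek API. A minimal fail-fast check might look like the sketch below; requireApiKey is a hypothetical helper, not a function from the actual source.

```typescript
// Return the configured API key, or fail with an actionable message.
function requireApiKey(env: Record<string, string | undefined>): string {
  const key = env["DEEPSEEK_API_KEY"];
  if (!key) {
    throw new Error(
      "DEEPSEEK_API_KEY is not set; add it to .env or the Claude Desktop config"
    );
  }
  return key;
}
```

Failing at startup like this surfaces a missing key immediately, instead of as an authentication error on the first request.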
Development
npm run dev # Watch mode
npm run build # Build for production
Features
- Advanced text generation with Deepseek R1 (8192 token context window)
- Configurable parameters (max_tokens, temperature)
- Robust error handling with detailed error messages
- Full MCP protocol support
- Claude Desktop integration
- Support for both DeepSeek-R1 and DeepSeek-V3 models
API Usage
{
"name": "deepseek_r1",
"arguments": {
"prompt": "Your prompt here",
"max_tokens": 8192, // Maximum tokens to generate
"temperature": 0.2 // Controls randomness
}
}
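The optional arguments above can be normalized before the request is sent. The sketch below mirrors the documented defaults (8192 max tokens, temperature 0.2); the function itself is illustrative, not the server's actual code.

```typescript
// Shape of the tool arguments shown above.
interface DeepseekArgs {
  prompt: string;
  max_tokens?: number;
  temperature?: number;
}

// Fill in defaults and clamp max_tokens to the documented 8192 limit.
function normalizeArgs(args: DeepseekArgs): Required<DeepseekArgs> {
  return {
    prompt: args.prompt,
    max_tokens: Math.min(args.max_tokens ?? 8192, 8192),
    temperature: args.temperature ?? 0.2,
  };
}
```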
The Temperature Parameter
The default value of temperature is 0.2. Deepseek recommends setting the temperature according to your specific use case:
| USE CASE | TEMPERATURE | EXAMPLE |
|----------|-------------|---------|
| Coding / Math | 0.0 | Code generation, mathematical calculations |
| Data Cleaning / Data Analysis | 1.0 | Data processing tasks |
| General Conversation | 1.3 | Chat and dialogue |
| Translation | 1.3 | Language translation |
| Creative Writing / Poetry | 1.5 | Story writing, poetry generation |
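The table above can be captured as a lookup if you drive the temperature from configuration; the preset names below are illustrative, the values come from the table.

```typescript
// Recommended temperatures per use case (per the DeepSeek guidance above).
const TEMPERATURE_PRESETS: Record<string, number> = {
  coding: 0.0,       // Coding / Math
  dataAnalysis: 1.0, // Data Cleaning / Data Analysis
  conversation: 1.3, // General Conversation
  translation: 1.3,  // Translation
  creative: 1.5,     // Creative Writing / Poetry
};
```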
Error Handling
The server provides detailed error messages for common issues:
- API authentication errors
- Invalid parameters
- Rate limiting
- Network issues
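As an illustration of how those categories might map onto HTTP status codes from the DeepSeek API, consider the sketch below; the codes and messages are assumptions for this example, not the server's exact output.

```typescript
// Translate an upstream HTTP status into a user-facing error category.
function describeError(status: number): string {
  switch (status) {
    case 401:
      return "API authentication error: check DEEPSEEK_API_KEY";
    case 400:
      return "Invalid parameters: check prompt, max_tokens, and temperature";
    case 429:
      return "Rate limited: wait before retrying";
    default:
      return "Network or upstream server error";
  }
}
```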
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
MIT
Server Config
{
"mcpServers": {
"mcp-server-deepseek-r-1": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"ghcr.io/metorial/mcp-container--66julienmartin--mcp-server-deepseek_r1--mcp-server-deepseek-r-1",
"npm run start"
],
"env": {
"DEEPSEEK_API_KEY": "deepseek-api-key"
}
}
}
}