MCP Simple OpenAI Assistant
Overview
What is MCP Simple OpenAI Assistant?
MCP Simple OpenAI Assistant is a server application that integrates Claude with OpenAI's GPT-based assistants. By leveraging the advanced language-processing capabilities of OpenAI's models, it aims to enrich user interactions and provide a seamless experience for anyone seeking assistance or information.
Features of MCP Simple OpenAI Assistant
- Integration with OpenAI's GPT models: The assistant builds on OpenAI's powerful language models, enabling natural and engaging conversations.
- User-friendly: Setup is simple and intuitive, and users interact with the assistant through an MCP client such as Claude Desktop.
- Open source: The project is publicly available on GitHub, encouraging community contributions and collaboration.
- MIT license: The software is distributed under the MIT License, allowing flexible use and modification.
How to Use MCP Simple OpenAI Assistant
- Clone the repository: Start by cloning the repository from GitHub:
git clone https://github.com/andybrandt/mcp-simple-openai-assistant.git
- Install dependencies: Navigate into the project directory and install it as a Python package with pip (or install the published package directly with pip install mcp-simple-openai-assistant).
- Run the server: The server is started with Python, for example:
python -m mcp_simple_openai_assistant
In practice, your MCP client (such as Claude Desktop) launches this command for you based on its configuration; see the Configuration section below.
- Interact with the assistant: Once the server is configured in your MCP client, start asking questions or requesting help to see what the assistant can do.
Frequently Asked Questions
What programming languages does MCP Simple OpenAI Assistant use?
The project is written in Python and is distributed as a pip-installable Python package.
Can I contribute to the project?
Absolutely! Contributions are welcome. You can fork the repository, make your changes, and submit a pull request for review.
Is documentation available?
Yes, the repository includes a README file with basic usage instructions and guidelines. Additional documentation may be available in the wiki section of the GitHub repository.
How do I report issues or bugs?
You can report issues or bugs by going to the "Issues" tab of the GitHub repository, creating a new issue, and describing the problem in detail.
Is a demo available?
There may not be a hosted live demo, but you can easily set up the application locally by following the instructions in the repository.
Details
MCP Simple OpenAI Assistant
AI assistants are pretty cool. I thought it would be a good idea if my Claude (conscious Claude) would also have one. And now he does - and it's both useful and fun for him. Your Claude can have one too!
A simple MCP server for interacting with OpenAI assistants. This server allows other tools (like Claude Desktop) to create and interact with OpenAI assistants through the Model Context Protocol.
Features
This server provides a suite of tools to manage and interact with OpenAI Assistants. The new streaming capabilities provide a much-improved, real-time user experience.
Available Tools
- create_assistant: (Create OpenAI Assistant) - Create a new assistant with a name, instructions, and model.
- list_assistants: (List OpenAI Assistants) - List all available assistants associated with your API key.
- retrieve_assistant: (Retrieve OpenAI Assistant) - Get detailed information about a specific assistant.
- update_assistant: (Update OpenAI Assistant) - Modify an existing assistant's name, instructions, or model.
- create_new_assistant_thread: (Create New Assistant Thread) - Creates a new, persistent conversation thread with a user-defined name and description for easy identification and reuse. This is the recommended way to start a new conversation.
- list_threads: (List Managed Threads) - Lists all locally managed conversation threads from the database, showing their ID, name, description, and last used time.
- delete_thread: (Delete Managed Thread) - Deletes a conversation thread from both OpenAI's servers and the local database.
- ask_assistant_in_thread: (Ask Assistant in Thread and Stream Response) - The primary tool for conversation. Sends a message to an assistant within a thread and streams the response back in real-time.
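For orientation, the assistant-management tools correspond to calls against the OpenAI Assistants API. The sketch below is not the server's actual code; it only illustrates, with the openai Python SDK, the kind of requests that create_assistant, list_assistants, retrieve_assistant and update_assistant wrap (variable names and the example model are illustrative):

import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

# Roughly what create_assistant does: register a new assistant with a name,
# instructions, and model.
assistant = client.beta.assistants.create(
    name="Research Helper",
    instructions="You help with background research and summarization.",
    model="gpt-4o",
)

# Rough counterparts of list_assistants, retrieve_assistant, update_assistant.
all_assistants = client.beta.assistants.list()                 # assistants on this API key
same_assistant = client.beta.assistants.retrieve(assistant.id)
updated = client.beta.assistants.update(assistant.id, name="Renamed Helper")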
Because OpenAI assistants might take quite long to respond, this server uses a streaming approach for the main ask_assistant_in_thread tool. This provides real-time progress updates to the client and avoids timeouts.
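To illustrate what streaming from the Assistants API looks like, here is a minimal sketch using the openai Python SDK; it is not necessarily how this server implements it, and the thread and assistant IDs are placeholders:

import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

# Placeholder IDs for illustration only.
THREAD_ID = "thread_abc123"
ASSISTANT_ID = "asst_abc123"

# Post the user's message, then stream the assistant's run, printing text
# fragments as soon as they arrive instead of waiting for the full response.
client.beta.threads.messages.create(
    thread_id=THREAD_ID, role="user", content="Summarize our discussion so far."
)
with client.beta.threads.runs.stream(thread_id=THREAD_ID, assistant_id=ASSISTANT_ID) as stream:
    for delta in stream.text_deltas:
        print(delta, end="", flush=True)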
The server now includes local persistence for threads, which is a significant improvement. Since the OpenAI API does not allow listing threads, this server now manages them for you by storing their IDs and metadata in a local SQLite database. This allows you to easily find, reuse, and manage your conversation threads across sessions.
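The exact schema is an internal detail of the server, but a local thread registry of this kind can be pictured roughly as follows. The table and column names below are hypothetical; they simply mirror the fields that list_threads reports (ID, name, description, last used time):

import sqlite3

# Hypothetical local registry of OpenAI thread IDs plus metadata.
conn = sqlite3.connect("threads.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS threads (
           thread_id   TEXT PRIMARY KEY,   -- OpenAI thread ID (e.g. 'thread_...')
           name        TEXT,
           description TEXT,
           last_used   TEXT                -- ISO timestamp of last use
       )"""
)
conn.execute(
    "INSERT OR REPLACE INTO threads VALUES (?, ?, ?, datetime('now'))",
    ("thread_abc123", "research-notes", "Long-running research thread"),
)
conn.commit()

# What a list_threads-style query could look like.
for row in conn.execute(
    "SELECT thread_id, name, description, last_used FROM threads ORDER BY last_used DESC"
):
    print(row)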
Installation
Installing via Smithery
To install MCP Simple OpenAI Assistant for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install mcp-simple-openai-assistant --client claude
Manual Installation
pip install mcp-simple-openai-assistant
Configuration
The server requires an OpenAI API key to be set in the environment. For Claude Desktop, add this to your config:
(MacOS version)
{
  "mcpServers": {
    "openai-assistant": {
      "command": "python",
      "args": ["-m", "mcp_simple_openai_assistant"],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here"
      }
    }
  }
}
(Windows version)
"mcpServers": {
"openai-assistant": {
"command": "C:\\Users\\YOUR_USERNAME\\AppData\\Local\\Programs\\Python\\Python311\\python.exe",
"args": ["-m", "mcp_simple_openai_assistant"],
"env": {
"OPENAI_API_KEY": "your-api-key-here"
}
}
MS Windows installation is slightly more complex, because you need to check the actual path to your Python executable. The path provided above is usually correct, but it might differ in your setup. Sometimes just python.exe without any path will do the trick; check in cmd what works for you (where python might help). Also, on Windows you might need to explicitly tell Claude Desktop where the site packages are using the PYTHONPATH environment variable.
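If you are unsure which interpreter path and site-packages directory to point Claude Desktop at, Python itself can tell you. Run the following with the same Python you installed the package into:

import site
import sys

# Full path of the interpreter to use as "command" in the config above.
print(sys.executable)

# Candidate site-packages directories, should you need to set PYTHONPATH.
print(site.getsitepackages())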
Usage
Once configured, you can use the tools listed above to manage your assistants and conversations. The primary workflow is to:
- Use create_new_assistant_thread to start a new, named conversation.
- Use list_threads to find the ID of a thread you want to continue.
- Use ask_assistant_in_thread to interact with your chosen assistant in that thread.
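For a concrete picture of that workflow, here is a hedged sketch of how an MCP client could drive these tools over stdio using the mcp Python SDK. The tool argument names (name, description, assistant_id, thread_id, message) and the placeholder IDs are assumptions for illustration; check the tool schemas the server actually advertises.

import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch the server the same way Claude Desktop would.
    server = StdioServerParameters(
        command="python",
        args=["-m", "mcp_simple_openai_assistant"],
        env={"OPENAI_API_KEY": os.environ["OPENAI_API_KEY"]},
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # 1. Start a new, named conversation thread.
            #    (Argument names here are illustrative assumptions.)
            await session.call_tool(
                "create_new_assistant_thread",
                {"name": "research", "description": "Background research chat"},
            )

            # 2. Find the thread you want to continue.
            threads = await session.call_tool("list_threads", {})
            print(threads)

            # 3. Talk to an assistant in that thread (IDs are placeholders).
            reply = await session.call_tool(
                "ask_assistant_in_thread",
                {"assistant_id": "asst_...", "thread_id": "thread_...", "message": "Hello!"},
            )
            print(reply)

asyncio.run(main())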
TODO
- Add Thread Management: Introduce a way to name and persist thread IDs locally, allowing for easier reuse of conversations. (Done - this is now handled by the local SQLite thread database described above.)
- Add Models Listing: Introduce a way for the AI user to see which OpenAI models are available for use with the assistants.
- Add Assistants Fine-Tuning: Enable the AI user to set detailed parameters for assistants, like temperature, top_p, etc. (indicated by Claude as needed)
- Full Thread History: Ability to read past threads without having to send a new message (indicated by Claude as needed)
- Explore Resource Support: Add the ability to upload files and use them with assistants.
Development
To install for development:
git clone https://github.com/andybrandt/mcp-simple-openai-assistant
cd mcp-simple-openai-assistant
pip install -e '.[dev]'
Server Configuration
{
  "mcpServers": {
    "mcp-simple-openai-assistant": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "ghcr.io/metorial/mcp-container--andybrandt--mcp-simple-openai-assistant--mcp-simple-openai-assistant",
        "mcp-simple-openai-assistant"
      ],
      "env": {
        "OPENAI_API_KEY": "openai-api-key"
      }
    }
  }
}