MCP Chat Desktop App

Created by AI-QL

A desktop chat application that leverages MCP (Model Context Protocol) to interface with various large language models (LLMs).

Overview

What is chat-mcp?

chat-mcp is a desktop chat application that uses the Model Context Protocol (MCP) to facilitate communication with various large language models (LLMs). It lets users interact seamlessly with multiple AI models, enhancing the chat experience by drawing on each model's unique capabilities.

Features of chat-mcp

  • Multi-Model Support: chat-mcp can connect to a variety of LLMs, letting users choose the model that best fits their needs.
  • User-Friendly Interface: The application is designed to be clean and intuitive, so users can easily navigate and use its features.
  • Real-Time Communication: Users can hold real-time conversations and receive immediate responses from the connected LLMs.
  • Customizable Settings: Users can adjust settings to tailor the chat experience to their preferences.
  • Open Source: As an open-source project, chat-mcp encourages community contributions and development transparency.

How to Use chat-mcp

  1. Download and Install: Visit the chat-mcp repository to download the latest version of the application.
  2. Set Up Your Account: Follow the on-screen instructions to create an account or log in.
  3. Connect to an LLM: Choose a model to connect to from the available LLMs. You can switch between models as needed.
  4. Start Chatting: Type in the chat window to begin a conversation. The LLM will respond based on the context provided.
  5. Explore Features: Use the customizable settings to enhance your chat experience.

FAQ

What is the Model Context Protocol (MCP)?

The Model Context Protocol (MCP) is a framework that allows different AI models to communicate and share context, making conversations more coherent and contextually relevant.

Is chat-mcp free to use?

Yes. chat-mcp is an open-source application, which means it is free to download and use. You can also contribute to its development if you wish.

Can I contribute to the chat-mcp project?

Absolutely! Contributions are welcome. You can submit issues, feature requests, or even pull requests on the GitHub repository.

Which platforms does chat-mcp support?

chat-mcp is designed to be cross-platform and supports major operating systems such as Windows, macOS, and Linux.

How do I report bugs or issues?

If you encounter any bugs or problems, please report them on the repository's issues page. Your feedback is essential for improving the application.

By using chat-mcp, users can enrich their interactions with AI models, making conversations more engaging and informative.

Details

MCP Chat Desktop App

A Cross-Platform Interface for LLMs

This desktop application utilizes the MCP (Model Context Protocol) to seamlessly connect and interact with various Large Language Models (LLMs). Built on Electron, the app ensures full cross-platform compatibility, enabling smooth operation across different operating systems.

The primary objective of this project is to deliver a clean, minimalistic codebase that simplifies understanding the core principles of MCP. Additionally, it provides a quick and efficient way to test multiple servers and LLMs, making it an ideal tool for developers and researchers alike.

News

This project originated as a modified version of Chat-UI, initially adopting a minimalist code approach to implement core MCP functionality for educational purposes.

Through iterative updates to MCP, I received community feedback advocating for a completely new architecture - one that eliminates third-party CDN dependencies and establishes clearer modular structure to better support derivative development and debugging workflows.

This led to the creation of Tool Unitary User Interface (TUUI), a restructured desktop application optimized for AI-powered development. Building upon the original foundation, TUUI serves as a practical AI-assisted development paradigm; if you're interested, you can also leverage AI to develop new features for TUUI. The platform employs a strict linting and formatting system to ensure AI-generated code adheres to coding standards.

📢 Update: June 2025
The current project refactoring has been largely completed, and a pre-release version is now available. Please refer to the following documentation for details:

Features

  • Cross-Platform Compatibility: Supports Linux, macOS, and Windows.

  • Flexible Apache-2.0 License: Allows easy modification and building of your own desktop applications.

  • Dynamic LLM Configuration: Compatible with all OpenAI SDK-supported LLMs, enabling quick testing of multiple backends through manual or preset configurations.

  • Multi-Client Management: Configure and manage multiple clients to connect to multiple servers using MCP config.

  • UI Adaptability: The UI can be directly extracted for web use, ensuring consistent ecosystem and interaction logic across web and desktop versions.

Architecture

The app adopts a straightforward architecture, consistent with the MCP documentation, to facilitate a clear understanding of MCP principles:

DeepWiki

How to use

After cloning or downloading this repository:

  1. Please modify the config.json file located in src/main.
    Ensure that the command and path specified in the args are valid.

  2. Please ensure that Node.js is installed on your system.
    You can verify this by running node -v and npm -v in your terminal to check their respective versions.

  3. npm install

  4. npm start

Configuration

Create a .json file and paste the following content into it. This file can then be provided as the interface configuration for the Chat UI.

  • gtp-api.json

    {
        "chatbotStore": {
            "apiKey": "",
            "url": "https://api.aiql.com",
            "path": "/v1/chat/completions",
            "model": "gpt-4o-mini",
            "max_tokens_value": "",
            "mcp": true
        },
        "defaultChoiceStore": {
            "model": [
                "gpt-4o-mini",
                "gpt-4o",
                "gpt-4",
                "gpt-4-turbo"
            ]
        }
    }
    

You can replace the 'url' if you have direct access to the OpenAI API.
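For illustration, the `url` and `path` fields are assumed to concatenate into the request endpoint in the usual OpenAI-compatible style. This is a sketch of that composition, not the app's actual request code:

```javascript
// Sketch: how the chatbotStore fields presumably map onto an
// OpenAI-compatible chat completion request.
const chatbotStore = {
  apiKey: "",                       // your key goes here
  url: "https://api.aiql.com",
  path: "/v1/chat/completions",
  model: "gpt-4o-mini",
};

const endpoint = chatbotStore.url + chatbotStore.path;
const request = {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${chatbotStore.apiKey}`,
  },
  body: JSON.stringify({
    model: chatbotStore.model,
    messages: [{ role: "user", content: "Hello" }],
  }),
};
// fetch(endpoint, request) would then return the model's reply.
```

Because any OpenAI-compatible backend follows this shape, swapping providers is just a matter of changing `url`, `path`, and `model`.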

Alternatively, you can use another API endpoint that supports function calls:

  • qwen-api.json

    {
        "chatbotStore": {
            "apiKey": "",
            "url": "https://dashscope.aliyuncs.com/compatible-mode",
            "path": "/v1/chat/completions",
            "model": "qwen-turbo",
            "max_tokens_value": "",
            "mcp": true
        },
        "defaultChoiceStore": {
            "model": [
                "qwen-turbo",
                "qwen-plus",
                "qwen-max"
            ]
        }
    }
    
  • deepinfra.json

    {
        "chatbotStore": {
            "apiKey": "",
            "url": "https://api.deepinfra.com",
            "path": "/v1/openai/chat/completions",
            "model": "meta-llama/Meta-Llama-3.1-70B-Instruct",
            "max_tokens_value": "32000",
            "mcp": true
        },
        "defaultChoiceStore": {
            "model": [
                "meta-llama/Meta-Llama-3.1-70B-Instruct",
                "meta-llama/Meta-Llama-3.1-405B-Instruct",
                "meta-llama/Meta-Llama-3.1-8B-Instruct"
            ]
        }
    }
    

Build Application

You can build your own desktop application by:

npm run build-app

This CLI helps you build and package your application for your current OS, with artifacts stored in the /artifacts directory.

For Debian/Ubuntu users experiencing RPM build issues, try one of the following solutions:

  • Edit package.json to skip the RPM build step, or

  • Install rpm with sudo apt-get install rpm (you may need to run sudo apt update first to ensure your package list is up to date)
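One way to skip the RPM step is to narrow electron-builder's Linux targets in package.json. This is a sketch assuming the standard electron-builder `build.linux.target` field; your existing build section may differ:

```json
{
  "build": {
    "linux": {
      "target": ["deb", "AppImage"]
    }
  }
}
```

With `rpm` removed from the target list, electron-builder no longer needs the rpm tooling at all.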

Troubleshooting

Error: spawn npx ENOENT - ISSUE 40

Modify the config.json in src/main.

On Windows, npx may not work; please refer to my workaround: ISSUE 101

  • Or you can use node in config.json:
    {
        "mcpServers": {
            "filesystem": {
            "command": "node",
            "args": [
                "node_modules/@modelcontextprotocol/server-filesystem/dist/index.js",
                "D:/Github/mcp-test"
            ]
            }
        }
    }
    

Please ensure that the provided path is valid, especially if you are using a relative path. It is highly recommended to provide an absolute path for better clarity and accuracy.

By default, I will install server-everything, server-filesystem, and server-puppeteer for testing purposes. However, you can install additional server libraries or use npx to run other server libraries as needed.

Installation timeout

After running npm install for the entire project, the total size of the node_modules directory typically exceeds 500 MB.

If the installation stalls below 300 MB and the progress bar remains static, the likely cause is a timeout during the latter part of the installation, specifically Electron.

This issue often arises because the download speed from Electron's default server is excessively slow or even inaccessible in certain regions. To resolve this, you can modify the environment or global variable ELECTRON_MIRROR to switch to an Electron mirror site that is accessible from your location.
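Setting the mirror can be done via an environment variable before installing. The npmmirror URL below is one publicly known mirror and is an assumption; substitute any mirror reachable from your region:

```shell
# Point Electron's download at a mirror, then reinstall.
export ELECTRON_MIRROR="https://npmmirror.com/mirrors/electron/"
# npm install   # re-run the install so Electron fetches from the mirror
```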

Electron builder timeout

When using electron-builder to package files, it automatically downloads several large release packages from GitHub. If the network connection is unstable, this process may be interrupted or timeout.

On Windows, you may need to clear the cache located under the electron and electron-builder directories within C:\Users\YOURUSERNAME\AppData\Local before attempting to retry.

Due to potential terminal permission issues, it is recommended to use the default shell terminal instead of VSCode's built-in terminal.

Demo

Multimodal Support

Reasoning and Latex Support

MCP Tools Visualization

MCP Toolcall Process Overview

MCP Prompts Template

Dynamic LLM Config

DevTool Troubleshooting

Server Configuration

{
  "mcpServers": {
    "chat-mcp": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "ghcr.io/metorial/mcp-container--ai-ql--chat-mcp--chat-mcp",
        "npm run start"
      ],
      "env": {}
    }
  }
}

Project Information

Author
AI-QL
Category
Communication
Created
Jul 14, 2025
Stars
229
Language
HTML
