Pinecone Model Context Protocol Server for Claude Desktop.

Created by sirmews

A Model Context Protocol server that allows reading from and writing to Pinecone. Basic RAG.

Overview

What is MCP-Pinecone?

MCP-Pinecone is a Model Context Protocol server designed to facilitate read and write operations against Pinecone, a vector database service. The server provides basic Retrieval-Augmented Generation (RAG) capabilities, enabling developers to integrate advanced AI features into their applications. By leveraging MCP-Pinecone, users can streamline their data retrieval process, making it easier to access and use the information stored in Pinecone.

Features of MCP-Pinecone

  • Pinecone integration: Connects directly to Pinecone for efficient data management and retrieval.
  • RAG capabilities: Supports Retrieval-Augmented Generation, improving the quality of generated responses by drawing on external data.
  • Open source: Publicly available, so developers can contribute to and modify the code as needed.
  • User-friendly: Designed for ease of use by both newcomers and experienced developers.
  • Active community: Backed by a community of contributors, ensuring ongoing improvement and support.

How to Use MCP-Pinecone

  1. Install: Clone the repository from GitHub and install the necessary dependencies. Note that this is a Python project managed with uv, not an npm package.

    git clone https://github.com/sirmews/mcp-pinecone.git
    cd mcp-pinecone
    uv sync
    
  2. Configure: Set your Pinecone API key and other settings via environment variables or a configuration file.

  3. Run the server: Start the server to begin interacting with Pinecone.

    uv run mcp-pinecone
    
  4. Make requests: Use the MCP tools the server exposes to read and write data in Pinecone. See the documentation for details on the available tools and their usage.

  5. Contribute: If you would like to contribute to the project, feel free to fork the repository, make your changes, and submit a pull request.

FAQ

What is Pinecone?

Pinecone is a fully managed vector database designed for machine learning applications. It lets users store and query high-dimensional vectors efficiently, making it well suited to AI and data science workloads.
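Similarity between high-dimensional vectors is typically measured with a metric such as cosine similarity. The following is a generic sketch of that idea, not Pinecone's internal implementation; the toy 4-dimensional vectors stand in for real embeddings, which usually have hundreds or thousands of dimensions:

```python
import math

def cosine_similarity(a, b):
    # Dot product divided by the product of vector magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings": doc_a points in nearly the same direction as the
# query, doc_b in a very different one.
query = [0.1, 0.9, 0.2, 0.0]
doc_a = [0.1, 0.8, 0.3, 0.1]
doc_b = [0.9, 0.1, 0.0, 0.7]

print(cosine_similarity(query, doc_a) > cosine_similarity(query, doc_b))  # True
```

A vector database like Pinecone answers this kind of "which stored vectors are closest to the query vector" question at scale, using approximate nearest-neighbor indexes rather than a brute-force scan.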

How does MCP-Pinecone enhance data retrieval?

By implementing Retrieval-Augmented Generation (RAG), MCP-Pinecone lets applications generate responses based on both the model's training and live data from Pinecone, producing more accurate and contextually relevant output.
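The RAG flow can be sketched roughly as follows. Here `fake_search` (a word-overlap ranker over an in-memory dictionary) and the prompt template are illustrative assumptions standing in for Pinecone's semantic search; they are not part of mcp-pinecone's actual API:

```python
# Minimal RAG sketch with an in-memory stand-in for Pinecone.
DOCS = {
    "doc-1": "Pinecone is a managed vector database.",
    "doc-2": "RAG combines retrieval with text generation.",
    "doc-3": "MCP lets clients like Claude Desktop call external tools.",
}

def fake_search(query: str, top_k: int = 2) -> list[str]:
    # Stand-in for semantic search: rank documents by word overlap.
    q_words = set(query.lower().split())
    scored = sorted(
        DOCS.values(),
        key=lambda text: len(q_words & set(text.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(query: str) -> str:
    # Retrieved passages are prepended so the model can ground its answer.
    context = "\n".join(f"- {passage}" for passage in fake_search(query))
    return f"Context:\n{context}\n\nQuestion: {query}"

print(build_prompt("What is a vector database?"))
```

In the real server, retrieval is the semantic-search tool backed by a Pinecone index, and the assembled prompt is consumed by the model on the client side (e.g. Claude Desktop).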

Is MCP-Pinecone free to use?

Yes, MCP-Pinecone is an open-source project and free to use. Note, however, that Pinecone itself may charge you depending on how you use its service.

Can I contribute to the MCP-Pinecone project?

Absolutely! Contributions are welcome. You can fork the repository, make improvements, and submit a pull request to share your enhancements with the community.

Where can I find the documentation for MCP-Pinecone?

The documentation is included in the repository. Check the README file for setup instructions and usage guides.

Details

Pinecone Model Context Protocol Server for Claude Desktop.


Read and write to a Pinecone index.

Components

flowchart TB
    subgraph Client["MCP Client (e.g., Claude Desktop)"]
        UI[User Interface]
    end

    subgraph MCPServer["MCP Server (pinecone-mcp)"]
        Server[Server Class]
        
        subgraph Handlers["Request Handlers"]
            ListRes[list_resources]
            ReadRes[read_resource]
            ListTools[list_tools]
            CallTool[call_tool]
            GetPrompt[get_prompt]
            ListPrompts[list_prompts]
        end
        
        subgraph Tools["Implemented Tools"]
            SemSearch[semantic-search]
            ReadDoc[read-document]
            ListDocs[list-documents]
            PineconeStats[pinecone-stats]
            ProcessDoc[process-document]
        end
    end

    subgraph PineconeService["Pinecone Service"]
        PC[Pinecone Client]
        subgraph PineconeFunctions["Pinecone Operations"]
            Search[search_records]
            Upsert[upsert_records]
            Fetch[fetch_records]
            List[list_records]
            Embed[generate_embeddings]
        end
        Index[(Pinecone Index)]
    end

    %% Connections
    UI --> Server
    Server --> Handlers
    
    ListTools --> Tools
    CallTool --> Tools
    
    Tools --> PC
    PC --> PineconeFunctions
    PineconeFunctions --> Index
    
    %% Data flow for semantic search
    SemSearch --> Search
    Search --> Embed
    Embed --> Index
    
    %% Data flow for document operations
    ProcessDoc --> Upsert
    ReadDoc --> Fetch
    ListRes --> List

    classDef primary fill:#2563eb,stroke:#1d4ed8,color:white
    classDef secondary fill:#4b5563,stroke:#374151,color:white
    classDef storage fill:#059669,stroke:#047857,color:white
    
    class Server,PC primary
    class Tools,Handlers secondary
    class Index storage

Resources

The server implements the ability to read and write to a Pinecone index.

Tools

  • semantic-search: Search for records in the Pinecone index.
  • read-document: Read a document from the Pinecone index.
  • list-documents: List all documents in the Pinecone index.
  • pinecone-stats: Get stats about the Pinecone index, including the number of records, dimensions, and namespaces.
  • process-document: Process a document into chunks and upsert them into the Pinecone index. This performs the overall steps of chunking, embedding, and upserting.

Note: embeddings are generated via Pinecone's inference API, and chunking is done with a token-based chunker. Written by copying a lot from langchain and debugging with Claude.
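As a rough sketch of what token-based chunking with overlap looks like: the real chunker counts model tokens, but here whitespace-separated words stand in for tokens, and `chunk_tokens` is a hypothetical helper, not the project's actual implementation:

```python
def chunk_tokens(text: str, chunk_size: int = 5, overlap: int = 2) -> list[str]:
    # Slide a fixed-size window over the token stream; consecutive
    # chunks share `overlap` tokens so context isn't cut mid-thought.
    assert overlap < chunk_size
    tokens = text.split()
    step = chunk_size - overlap
    chunks = []
    for start in range(0, len(tokens), step):
        window = tokens[start:start + chunk_size]
        chunks.append(" ".join(window))
        if start + chunk_size >= len(tokens):
            break
    return chunks

for chunk in chunk_tokens("one two three four five six seven eight"):
    print(chunk)
# one two three four five
# four five six seven eight
```

Each chunk would then be embedded (via Pinecone's inference API in this project) and upserted into the index as its own record, which is what the process-document tool bundles into one step.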

Quickstart

Installing via Smithery

To install Pinecone MCP Server for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install mcp-pinecone --client claude

Install the server

Recommend using uv to install the server locally for Claude.

uv tool install mcp-pinecone

OR

uv pip install mcp-pinecone

Add your config as described below.

Claude Desktop

On MacOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json

Note: You might need to use the direct path to uv. Use which uv to find the path.

Development/Unpublished Servers Configuration

"mcpServers": {
  "mcp-pinecone": {
    "command": "uv",
    "args": [
      "--directory",
      "{project_dir}",
      "run",
      "mcp-pinecone"
    ]
  }
}

Published Servers Configuration

"mcpServers": {
  "mcp-pinecone": {
    "command": "uvx",
    "args": [
      "--index-name",
      "{your-index-name}",
      "--api-key",
      "{your-secret-api-key}",
      "mcp-pinecone"
    ]
  }
}

Sign up to Pinecone

You can sign up for a Pinecone account on the Pinecone website.

Get an API key

Create a new index in Pinecone, replacing {your-index-name} and get an API key from the Pinecone dashboard, replacing {your-secret-api-key} in the config.

Development

Building and Publishing

To prepare the package for distribution:

  1. Sync dependencies and update lockfile:

    uv sync

  2. Build package distributions:

    uv build

This will create source and wheel distributions in the dist/ directory.

  3. Publish to PyPI:

    uv publish

Note: You'll need to set PyPI credentials via environment variables or command flags:

  • Token: --token or UV_PUBLISH_TOKEN
  • Or username/password: --username/UV_PUBLISH_USERNAME and --password/UV_PUBLISH_PASSWORD

Debugging

Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.

You can launch the MCP Inspector via npm with this command:

npx @modelcontextprotocol/inspector uv --directory {project_dir} run mcp-pinecone

Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.

License

This project is licensed under the MIT License. See the LICENSE file for details.

Source Code

The source code is available on GitHub.

Contributing

Send your ideas and feedback to me on Bluesky or by opening an issue.

Server Configuration

{
  "mcpServers": {
    "mcp-pinecone": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "ghcr.io/metorial/mcp-container--sirmews--mcp-pinecone--mcp-pinecone",
        "mcp-pinecone --index-name index-name --api-key api-key"
      ],
      "env": {}
    }
  }
}

Project Information

Author
sirmews
Created
Nov 5, 2025
Stars
148
Language
Python
