MCP Server for Intercom
Overview
What is the MCP Server for Intercom?
The MCP Server for Intercom is a solution designed to enhance the communication and integration capabilities of businesses that use the Intercom platform. The server acts as a bridge, enabling seamless data exchange between applications and Intercom, improving customer engagement and support workflows.
Features of the MCP Server for Intercom
- Real-time data synchronization: keeps all customer interactions and data up to date across platforms.
- Customizable integrations: offers the flexibility to integrate with a range of third-party applications to meet specific business needs.
- User-friendly interface: an intuitive interface that simplifies managing customer interactions.
- Scalability: scales easily to accommodate growing business needs and increasing volumes of customer interactions.
- Strong security: implements advanced security measures to protect sensitive customer data in transit.
How to Use the MCP Server for Intercom
- Installation: download the MCP Server for Intercom from the official repository and follow the installation instructions in the documentation.
- Configuration: configure the server settings to connect your Intercom account and any other applications you want to integrate.
- Testing: run tests to verify that data synchronizes correctly and all integrations work as expected.
- Deployment: once testing is complete, deploy the server in your production environment.
- Monitoring and maintenance: monitor server performance regularly and update the configuration as business needs evolve.
Frequently Asked Questions
Which platforms does the MCP server support?
The MCP server supports a variety of platforms, including CRM systems, e-commerce platforms, and other customer engagement tools.
Does the MCP server cost anything to use?
The MCP Server for Intercom is open source and free to use, though third-party integrations or hosting services may incur additional costs.
How can I contribute to the MCP server project?
Contributions are welcome! You can contribute by reporting issues, suggesting features, or submitting code improvements through the GitHub repository.
What kind of support is available for the MCP server?
Support is available through community forums, the GitHub issues page, and the documentation. Users can also ask other experienced developers for help.
Can I customize the MCP server?
Yes, the server is designed to be customizable. You can modify the code to fit specific business needs and integrate additional functionality as required.
Details
MCP Server for Intercom
<a href="https://glama.ai/mcp/servers/@raoulbia-ai/mcp-server-for-intercom"> <img width="380" height="200" src="https://glama.ai/mcp/servers/@raoulbia-ai/mcp-server-for-intercom/badge" /> </a>

An MCP-compliant server that enables AI assistants to access and analyze customer support data from Intercom.
Features
- Search conversations and tickets with advanced filtering
- Filter by customer, status, date range, and keywords
- Search by email content even when no contact exists
- Efficient server-side filtering via Intercom's search API
- Seamless integration with MCP-compliant AI assistants
Installation
Prerequisites
- Node.js 18.0.0 or higher
- An Intercom account with API access
- Your Intercom API token (available in your Intercom account settings)
Quick Setup
Using NPM
```shell
# Install the package globally
npm install -g mcp-server-for-intercom

# Set your Intercom API token
export INTERCOM_ACCESS_TOKEN="your_token_here"

# Run the server
intercom-mcp
```
Using Docker
The default Docker configuration is optimized for Glama compatibility:
```shell
# Start Docker (if not already running)
# On Windows: start the Docker Desktop application
# On Linux: sudo systemctl start docker

# Build the image
docker build -t mcp-intercom .

# Run the container with your API token and port mappings
docker run --rm -it -p 3000:3000 -p 8080:8080 -e INTERCOM_ACCESS_TOKEN="your_token_here" mcp-intercom:latest
```
Validation Steps:
```shell
# Test the server status
curl -v http://localhost:8080/.well-known/glama.json

# Test the MCP endpoint
curl -X POST -H "Content-Type: application/json" -d '{"jsonrpc":"2.0","id":1,"method":"mcp.capabilities"}' http://localhost:3000
```
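The same endpoint probe can be scripted. A minimal Python sketch, assuming the Docker setup above with the JSON-RPC endpoint on port 3000 (the helper name is illustrative, not part of this project):

```python
import json

def build_capabilities_request(request_id: int = 1) -> dict:
    """Build the same JSON-RPC payload used by the curl check above."""
    return {"jsonrpc": "2.0", "id": request_id, "method": "mcp.capabilities"}

if __name__ == "__main__":
    import urllib.request
    payload = json.dumps(build_capabilities_request()).encode()
    req = urllib.request.Request(
        "http://localhost:3000",  # MCP endpoint exposed by the container above
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        print(resp.read().decode())
```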
Alternative Standard Version
If you prefer a lighter version without Glama-specific dependencies:
```shell
# Build the standard image
docker build -t mcp-intercom-standard -f Dockerfile.standard .

# Run the standard container
docker run --rm -it -p 3000:3000 -p 8080:8080 -e INTERCOM_ACCESS_TOKEN="your_token_here" mcp-intercom-standard:latest
```
The default version includes specific dependencies and configurations required for integration with the Glama platform, while the standard version is more lightweight.
Available MCP Tools
1. list_conversations
Retrieves all conversations within a date range with content filtering.
Parameters:
- `startDate` (DD/MM/YYYY) – Start date (required)
- `endDate` (DD/MM/YYYY) – End date (required)
- `keyword` (string) – Filter to include conversations containing this text
- `exclude` (string) – Filter to exclude conversations containing this text
Notes:
- Date range must not exceed 7 days
- Uses efficient server-side filtering via Intercom's search API
Example:
```json
{
  "startDate": "15/01/2025",
  "endDate": "21/01/2025",
  "keyword": "billing"
}
```
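Because the server enforces the DD/MM/YYYY format and the 7-day limit, it can help to validate arguments client-side before invoking the tool. A minimal Python sketch (the helper names are illustrative and not part of this server's API):

```python
from datetime import datetime, timedelta

def validate_date_range(start_date: str, end_date: str) -> None:
    """Check list_conversations dates: DD/MM/YYYY, range of at most 7 days."""
    start = datetime.strptime(start_date, "%d/%m/%Y")
    end = datetime.strptime(end_date, "%d/%m/%Y")
    if end < start:
        raise ValueError("endDate must not precede startDate")
    if end - start > timedelta(days=7):
        raise ValueError("date range must not exceed 7 days")

def build_list_conversations_args(start_date, end_date, keyword=None, exclude=None):
    """Assemble the arguments object shown in the example above."""
    validate_date_range(start_date, end_date)
    args = {"startDate": start_date, "endDate": end_date}
    if keyword:
        args["keyword"] = keyword
    if exclude:
        args["exclude"] = exclude
    return args
```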
2. search_conversations_by_customer
Finds conversations for a specific customer.
Parameters:
- `customerIdentifier` (string) – Customer email or Intercom ID (required)
- `startDate` (DD/MM/YYYY) – Optional start date
- `endDate` (DD/MM/YYYY) – Optional end date
- `keywords` (array) – Optional keywords to filter by content
Notes:
- Can find conversations by email content even if no contact exists
- Resolves emails to contact IDs for efficient searching
Example:
```json
{
  "customerIdentifier": "customer@example.com",
  "startDate": "15/01/2025",
  "endDate": "21/01/2025",
  "keywords": ["billing", "refund"]
}
```
3. search_tickets_by_status
Retrieves tickets by their status.
Parameters:
- `status` (string) – "open", "pending", or "resolved" (required)
- `startDate` (DD/MM/YYYY) – Optional start date
- `endDate` (DD/MM/YYYY) – Optional end date
Example:
```json
{
  "status": "open",
  "startDate": "15/01/2025",
  "endDate": "21/01/2025"
}
```
4. search_tickets_by_customer
Finds tickets associated with a specific customer.
Parameters:
- `customerIdentifier` (string) – Customer email or Intercom ID (required)
- `startDate` (DD/MM/YYYY) – Optional start date
- `endDate` (DD/MM/YYYY) – Optional end date
Example:
```json
{
  "customerIdentifier": "customer@example.com",
  "startDate": "15/01/2025",
  "endDate": "21/01/2025"
}
```
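When calling any of these tools from your own client rather than through an AI assistant, the arguments shown in the examples are wrapped in a JSON-RPC 2.0 envelope. A sketch in Python, assuming the standard MCP `tools/call` method (the helper name is illustrative; check this server's implementation notes for the exact method names it accepts):

```python
import json

def build_tool_call(tool_name: str, arguments: dict, request_id: int = 1) -> str:
    """Wrap a tool invocation in a JSON-RPC 2.0 envelope."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Example: the search_tickets_by_customer request from above
payload = build_tool_call("search_tickets_by_customer", {
    "customerIdentifier": "customer@example.com",
    "startDate": "15/01/2025",
    "endDate": "21/01/2025",
})
```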
Configuration with Claude Desktop
Add to your `claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "intercom-mcp": {
      "command": "intercom-mcp",
      "args": [],
      "env": {
        "INTERCOM_ACCESS_TOKEN": "your_intercom_api_token"
      }
    }
  }
}
```
Implementation Notes
For detailed technical information about how this server integrates with Intercom's API, see `src/services/INTERCOM_API_NOTES.md`. This document explains our parameter mapping, Intercom endpoint usage, and implementation details for developers.
Development
```shell
# Clone and install dependencies
git clone https://github.com/raoulbia-ai/mcp-server-for-intercom.git
cd mcp-server-for-intercom
npm install

# Build and run for development
npm run build
npm run dev

# Run tests
npm test
```
Disclaimer
This project is an independent integration and is not affiliated with, officially connected to, or endorsed by Intercom Inc. "Intercom" is a registered trademark of Intercom Inc.
License
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
Server Configuration
```json
{
  "mcpServers": {
    "mcp-server-for-intercom": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "ghcr.io/metorial/mcp-container--raoulbia-ai--mcp-server-for-intercom--mcp-server-for-intercom",
        "npm run start"
      ],
      "env": {
        "INTERCOM_ACCESS_TOKEN": "intercom-access-token"
      }
    }
  }
}
```