MCP Server for Intercom
Overview
What is the MCP Server?
The MCP Server is a solution designed to enhance the communication and integration capabilities of businesses using the Intercom platform. It acts as a bridge, enabling seamless data exchange between various applications and the Intercom system, improving customer engagement and support workflows.
Features of the MCP Server
- Real-time data synchronization: keeps all customer interactions and data up to date across platforms.
- Customizable integrations: offers the flexibility to integrate a variety of third-party applications, tailored to specific business needs.
- User-friendly interface: an intuitive interface that simplifies managing customer interactions.
- Scalability: scales easily to meet growing business demands and increasing volumes of customer interactions.
- Strong security: implements advanced security measures to protect sensitive customer data in transit.
How to Use the MCP Server
- Installation: download the MCP Server from the official repository and follow the installation instructions in the documentation.
- Configuration: configure the server settings to connect to your Intercom account and the other applications you want to integrate.
- Testing: run tests to verify that data syncs correctly and that all integrations work as expected.
- Deployment: once testing is complete, deploy the server to your production environment.
- Monitoring and maintenance: monitor server performance regularly and update the configuration as business needs evolve.
FAQ
Which platforms does the MCP Server support?
The MCP Server supports a variety of platforms, including CRM systems, e-commerce platforms, and other customer engagement tools.
Does the MCP Server cost anything to use?
The MCP Server is open source and free to use, though third-party integrations or hosting services may incur additional costs.
How can I contribute to the MCP Server project?
Contributions are welcome! You can contribute by reporting issues, suggesting features, or submitting code improvements through the GitHub repository.
What kind of support is available for the MCP Server?
Support is available through the community forum, the GitHub issues page, and the documentation. Users can also seek help from other experienced developers.
Can I customize the MCP Server?
Yes, the server is designed to be customizable. You can modify the code to fit specific business needs and integrate additional functionality as required.
Details
MCP Server for Intercom
<a href="https://glama.ai/mcp/servers/@raoulbia-ai/mcp-server-for-intercom"> <img width="380" height="200" src="https://glama.ai/mcp/servers/@raoulbia-ai/mcp-server-for-intercom/badge" /> </a>

An MCP-compliant server that enables AI assistants to access and analyze customer support data from Intercom.
Features
- Search conversations and tickets with advanced filtering
- Filter by customer, status, date range, and keywords
- Search by email content even when no contact exists
- Efficient server-side filtering via Intercom's search API
- Seamless integration with MCP-compliant AI assistants
Installation
Prerequisites
- Node.js 18.0.0 or higher
- An Intercom account with API access
- Your Intercom API token (available in your Intercom account settings)
Quick Setup
Using NPM
```shell
# Install the package globally
npm install -g mcp-server-for-intercom

# Set your Intercom API token
export INTERCOM_ACCESS_TOKEN="your_token_here"

# Run the server
intercom-mcp
```
Using Docker
The default Docker configuration is optimized for Glama compatibility:
```shell
# Start Docker (if not already running)
# On Windows: start the Docker Desktop application
# On Linux: sudo systemctl start docker

# Build the image
docker build -t mcp-intercom .

# Run the container with your API token and port mappings
docker run --rm -it -p 3000:3000 -p 8080:8080 -e INTERCOM_ACCESS_TOKEN="your_token_here" mcp-intercom:latest
```
Validation Steps:
```shell
# Test the server status
curl -v http://localhost:8080/.well-known/glama.json

# Test the MCP endpoint
curl -X POST -H "Content-Type: application/json" -d '{"jsonrpc":"2.0","id":1,"method":"mcp.capabilities"}' http://localhost:3000
```
Alternative Standard Version
If you prefer a lighter version without Glama-specific dependencies:
```shell
# Build the standard image
docker build -t mcp-intercom-standard -f Dockerfile.standard .

# Run the standard container
docker run --rm -it -p 3000:3000 -p 8080:8080 -e INTERCOM_ACCESS_TOKEN="your_token_here" mcp-intercom-standard:latest
```
The default version includes specific dependencies and configurations required for integration with the Glama platform, while the standard version is more lightweight.
Available MCP Tools
1. list_conversations
Retrieves all conversations within a date range with content filtering.
Parameters:
- `startDate` (DD/MM/YYYY) – Start date (required)
- `endDate` (DD/MM/YYYY) – End date (required)
- `keyword` (string) – Filter to include conversations with this text
- `exclude` (string) – Filter to exclude conversations with this text
Notes:
- Date range must not exceed 7 days
- Uses efficient server-side filtering via Intercom's search API
Example:
```json
{
  "startDate": "15/01/2025",
  "endDate": "21/01/2025",
  "keyword": "billing"
}
```
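The DD/MM/YYYY format and the 7-day limit noted above can be sketched as a small validation helper. This is a hypothetical illustration of the constraint, not the server's actual code; the function names are assumptions:

```typescript
// Parse a DD/MM/YYYY string into a UTC Date; returns null when malformed.
function parseDdMmYyyy(s: string): Date | null {
  const m = /^(\d{2})\/(\d{2})\/(\d{4})$/.exec(s);
  if (!m) return null;
  const [, dd, mm, yyyy] = m;
  const date = new Date(Date.UTC(Number(yyyy), Number(mm) - 1, Number(dd)));
  // Reject impossible dates (e.g. 31/02/2025) that Date would silently roll over.
  if (date.getUTCDate() !== Number(dd) || date.getUTCMonth() !== Number(mm) - 1) {
    return null;
  }
  return date;
}

// Enforce the tool's constraint: both dates valid, in order, and at most 7 days apart.
function isValidRange(startDate: string, endDate: string): boolean {
  const start = parseDdMmYyyy(startDate);
  const end = parseDdMmYyyy(endDate);
  if (!start || !end || end < start) return false;
  const days = (end.getTime() - start.getTime()) / 86_400_000;
  return days <= 7;
}
```

For instance, `isValidRange("15/01/2025", "21/01/2025")` accepts the example request above, while a range such as 15/01/2025–23/01/2025 would be rejected for exceeding seven days.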
2. search_conversations_by_customer
Finds conversations for a specific customer.
Parameters:
- `customerIdentifier` (string) – Customer email or Intercom ID (required)
- `startDate` (DD/MM/YYYY) – Optional start date
- `endDate` (DD/MM/YYYY) – Optional end date
- `keywords` (array) – Optional keywords to filter by content
Notes:
- Can find conversations by email content even if no contact exists
- Resolves emails to contact IDs for efficient searching
Example:
```json
{
  "customerIdentifier": "customer@example.com",
  "startDate": "15/01/2025",
  "endDate": "21/01/2025",
  "keywords": ["billing", "refund"]
}
```
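The email-to-contact resolution described in the notes above could look roughly like the following sketch against Intercom's contact search endpoint. This is a hypothetical illustration assuming Intercom's v2 REST API (`POST /contacts/search`) and Node 18's built-in `fetch`; the server's real implementation may differ:

```typescript
// Build the Intercom contact-search query for an email address.
function buildContactQuery(email: string) {
  return {
    query: { field: "email", operator: "=", value: email },
  };
}

// Resolve an email to an Intercom contact ID, or null if no contact exists.
async function resolveContactId(email: string, token: string): Promise<string | null> {
  const res = await fetch("https://api.intercom.io/contacts/search", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(buildContactQuery(email)),
  });
  if (!res.ok) return null;
  const data = await res.json();
  return data.data?.[0]?.id ?? null;
}
```

When no contact resolves, the tool can still fall back to searching conversation bodies for the email text, which is how conversations are found "even if no contact exists".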
3. search_tickets_by_status
Retrieves tickets by their status.
Parameters:
- `status` (string) – "open", "pending", or "resolved" (required)
- `startDate` (DD/MM/YYYY) – Optional start date
- `endDate` (DD/MM/YYYY) – Optional end date
Example:
```json
{
  "status": "open",
  "startDate": "15/01/2025",
  "endDate": "21/01/2025"
}
```
4. search_tickets_by_customer
Finds tickets associated with a specific customer.
Parameters:
- `customerIdentifier` (string) – Customer email or Intercom ID (required)
- `startDate` (DD/MM/YYYY) – Optional start date
- `endDate` (DD/MM/YYYY) – Optional end date
Example:
```json
{
  "customerIdentifier": "customer@example.com",
  "startDate": "15/01/2025",
  "endDate": "21/01/2025"
}
```
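From a client's perspective, each of the tools above is invoked through a standard MCP `tools/call` JSON-RPC request, with the example objects above passed as the tool arguments. A minimal sketch of building that envelope (the builder function itself is a hypothetical illustration):

```typescript
// Build a JSON-RPC 2.0 envelope for an MCP tools/call request.
function buildToolCall(id: number, name: string, args: Record<string, unknown>) {
  return {
    jsonrpc: "2.0" as const,
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

// The search_tickets_by_customer example above, wrapped for transport.
const request = buildToolCall(1, "search_tickets_by_customer", {
  customerIdentifier: "customer@example.com",
  startDate: "15/01/2025",
  endDate: "21/01/2025",
});
// JSON.stringify(request) is what an MCP client would send over stdio.
```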
Configuration with Claude Desktop
Add to your `claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "intercom-mcp": {
      "command": "intercom-mcp",
      "args": [],
      "env": {
        "INTERCOM_ACCESS_TOKEN": "your_intercom_api_token"
      }
    }
  }
}
```
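The `env` block above is how the server receives its credentials. A server reading that variable would typically fail fast when it is missing rather than erroring on the first API call; a minimal sketch, assuming only the `INTERCOM_ACCESS_TOKEN` name from the config (the function is a hypothetical illustration):

```typescript
// Read the Intercom token from an environment map (e.g. process.env),
// throwing a descriptive error when it is absent.
function getIntercomToken(env: Record<string, string | undefined>): string {
  const token = env["INTERCOM_ACCESS_TOKEN"];
  if (!token) {
    throw new Error(
      "INTERCOM_ACCESS_TOKEN is not set; add it to the MCP server's env block",
    );
  }
  return token;
}
```

In the server itself the map would be `process.env`, which Claude Desktop populates from the `env` entry in the config above.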
Implementation Notes
For detailed technical information about how this server integrates with Intercom's API, see `src/services/INTERCOM_API_NOTES.md`. This document explains our parameter mapping, Intercom endpoint usage, and implementation details for developers.
Development
```shell
# Clone and install dependencies
git clone https://github.com/raoulbia-ai/mcp-server-for-intercom.git
cd mcp-server-for-intercom
npm install

# Build and run for development
npm run build
npm run dev

# Run tests
npm test
```
Disclaimer
This project is an independent integration and is not affiliated with, officially connected to, or endorsed by Intercom Inc. "Intercom" is a registered trademark of Intercom Inc.
License
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
Server Configuration
```json
{
  "mcpServers": {
    "mcp-server-for-intercom": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "ghcr.io/metorial/mcp-container--raoulbia-ai--mcp-server-for-intercom--mcp-server-for-intercom",
        "npm run start"
      ],
      "env": {
        "INTERCOM_ACCESS_TOKEN": "intercom-access-token"
      }
    }
  }
}
```