Lara Translate MCP Server
Overview
What is Lara-MCP?
Lara-MCP is an open-source project hosted on GitHub under the "translated" organization. It is a Model Context Protocol (MCP) server for the Lara Translate API, built to give AI applications and MCP-compatible clients access to professional-grade translation. The project's goal is to simplify translation and localization workflows, making it easier for developers to build applications that serve a global audience.
Features of Lara-MCP
- Multilingual support: Lara-MCP lets you translate text between many language pairs, with language detection and context-aware translation.
- Easy integration: the server implements the standard Model Context Protocol, so any MCP-compatible client can use it without custom API code.
- Translation memories: tools are provided to create, update, and delete translation memories, manage translation units, and import TMX files.
- Community-driven: as an open-source project, Lara-MCP benefits from contributions by developers worldwide, keeping it current and relevant.
- Documentation: comprehensive documentation helps users understand how to install, configure, and use Lara-MCP effectively.
How to Use Lara-MCP
- Installation: add the server to your MCP client's configuration, for example via NPX:
npx -y @translated/lara-mcp@latest
- Configuration: set your Lara Translate API credentials (LARA_ACCESS_KEY_ID and LARA_ACCESS_KEY_SECRET) in the env block of the server entry; a complete configuration snippet is shown in the Installation section below.
- Restart your client: restart the MCP client so it discovers the Lara Translate tools.
- Translate: ask your AI client to translate text with Lara; it will call the translate tool with the text, language pair, and any optional context or instructions.
- Manage translation memories: use the memory tools to create, update, or delete memories, add or remove translation units, and import TMX files.
FAQ
What is the purpose of Lara-MCP?
Lara-MCP is designed to expose the Lara Translate API to AI applications through the Model Context Protocol, making it easier for developers to add high-quality translation and manage translation memories.
Is Lara-MCP free to use?
Yes, Lara-MCP is an open-source project released under the MIT license, which means it is free to use and modify; using the Lara Translate API itself requires API credentials.
How can I contribute to Lara-MCP?
You can contribute by submitting pull requests, reporting issues, or suggesting features on the GitHub repository.
Which clients is Lara-MCP compatible with?
Lara-MCP works with any client that supports the Model Context Protocol, such as Claude Desktop, Cursor, Cline, GitHub Copilot, and Windsurf. Always check the documentation for the latest compatibility information.
Where can I find documentation for Lara-MCP?
Documentation is available in the GitHub repository, with detailed instructions on installation, configuration, and usage.
Details
Lara Translate MCP Server
A Model Context Protocol (MCP) Server for Lara Translate API, enabling powerful translation capabilities with support for language detection, context-aware translations and translation memories.
📚 Table of Contents
- 📖 Introduction
- 🛠 Available Tools
- 🚀 Getting Started
- 🧩 Installation Engines
- 💻 Popular Clients that Support MCPs
- 🆘 Support
📖 Introduction
What is MCP?
Model Context Protocol (MCP) is an open standardized communication protocol that enables AI applications to connect with external tools, data sources, and services. Think of MCP like a USB-C port for AI applications: just as USB-C provides a standardized way to connect devices to various peripherals, MCP provides a standardized way to connect AI models to different data sources and tools.
Lara Translate MCP Server enables AI applications to access Lara Translate's powerful translation capabilities through this standardized protocol.
How Lara Translate MCP Works
More info about the Model Context Protocol: https://modelcontextprotocol.io/
Lara Translate MCP Server implements the Model Context Protocol to provide seamless translation capabilities to AI applications. The integration follows this flow:
- Connection Establishment: When an MCP-compatible AI application starts, it connects to configured MCP servers, including the Lara Translate MCP Server
- Tool & Resource Discovery: The AI application discovers available translation tools and resources provided by the Lara Translate MCP Server
- Request Processing: When translation needs are identified:
  - The AI application formats a structured request with text to translate, language pairs, and optional context
  - The MCP server validates the request and transforms it into Lara Translate API calls
  - The request is securely sent to Lara Translate's API using your credentials
- Translation & Response: Lara Translate processes the translation using advanced AI models
- Result Integration: The translation results are returned to the AI application, which can then incorporate them into its response
This integration architecture allows AI applications to access professional-grade translations without implementing the API directly, while maintaining the security of your API credentials and offering flexibility to adjust translation parameters through natural language instructions.
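As a sketch of the request step above, an MCP client issues a standard tools/call message and the server maps it onto the Lara Translate API. The argument values here are hypothetical, and in practice your MCP client builds this envelope for you:
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "translate",
    "arguments": {
      "text": [{ "text": "Hello world", "translatable": true }],
      "source": "en-EN",
      "target": "it-IT",
      "context": "Greeting shown on a product landing page"
    }
  }
}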
Why Use Lara Inside an LLM
Integrating Lara with LLMs creates a powerful synergy that significantly enhances translation quality for non-English languages.
Why General LLMs Fall Short in Translation
While large language models possess broad linguistic capabilities, they often lack the specialized expertise and up-to-date terminology required for accurate translations in specific domains and languages.
Lara’s Domain-Specific Advantage
Lara overcomes this limitation by leveraging Translation Language Models (T-LMs) trained on billions of professionally translated segments. These models provide domain-specific machine translation that captures cultural nuances and industry terminology that generic LLMs may miss. The result: translations that are contextually accurate and sound natural to native speakers.
Designed for Non-English Strength
Lara has a strong focus on non-English languages, addressing the performance gap found in models such as GPT-4. The dominance of English in datasets such as Common Crawl and Wikipedia results in lower quality output in other languages. Lara helps close this gap by providing higher quality understanding, generation, and restructuring in a multilingual context.
Faster, Smarter Multilingual Performance
By offloading complex translation tasks to specialized T-LMs, Lara reduces computational overhead and minimizes latency, a common issue for LLMs handling non-English input. Its architecture processes translations in parallel with the LLM, enabling real-time, high-quality output without compromising speed or efficiency.
Cost-Efficient Translation at Scale
Lara also lowers the cost of using models like GPT-4 in non-English workflows. Since tokenization (and pricing) is optimized for English, using Lara allows translation to take place before hitting the LLM, meaning that only the translated English content is processed. This improves cost efficiency and supports competitive scalability for global enterprises.
🛠 Available Tools
Translation Tools
translate - Translate text between languages
Inputs:
- text (array): An array of text blocks to translate, each with:
  - text (string): The text content
  - translatable (boolean): Whether this block should be translated
- source (optional string): Source language code (e.g., 'en-EN')
- target (string): Target language code (e.g., 'it-IT')
- context (optional string): Additional context to improve translation quality
- instructions (optional string[]): Instructions to adjust translation behavior
- source_hint (optional string): Guidance for language detection
Returns: Translated text blocks maintaining the original structure
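As an illustration, the arguments for a translate call could mix translatable and non-translatable blocks (for example to protect placeholders) and pass style instructions. This is a hypothetical sketch of the tool's input, not output from a real session:
{
  "text": [
    { "text": "Welcome back, ", "translatable": true },
    { "text": "{{username}}", "translatable": false },
    { "text": "! Your order has shipped.", "translatable": true }
  ],
  "target": "it-IT",
  "context": "Transactional e-commerce notification",
  "instructions": ["Use a formal tone"]
}
Since source is omitted here, the server relies on language detection, optionally guided by source_hint.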
Translation Memories Tools
list_memories - List saved translation memories
Returns: Array of memories and their details
create_memory - Create a new translation memory
Inputs:
- name (string): Name of the new memory
- external_id (optional string): ID of the memory to import from MyMemory (e.g., 'ext_my_[MyMemory ID]')
Returns: Created memory data
update_memory - Update translation memory name
Inputs:
- id (string): ID of the memory to update
- name (string): The new name for the memory
Returns: Updated memory data
delete_memory - Delete a translation memory
Inputs:
- id (string): ID of the memory to delete
Returns: Deleted memory data
add_translation - Add a translation unit to memory
Inputs:
- id (string | string[]): ID or IDs of memories where to add the translation unit
- source (string): Source language code
- target (string): Target language code
- sentence (string): The source sentence
- translation (string): The translated sentence
- tuid (optional string): Translation Unit unique identifier
- sentence_before (optional string): Context sentence before
- sentence_after (optional string): Context sentence after
Returns: Added translation details
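For example, adding a single unit to one memory might use arguments like the following; the memory ID and sentences are hypothetical placeholders:
{
  "id": "mem_xyz123",
  "source": "en-EN",
  "target": "it-IT",
  "sentence": "The order has been shipped.",
  "translation": "L'ordine è stato spedito.",
  "sentence_before": "Thank you for your purchase.",
  "sentence_after": "You will receive a tracking number shortly."
}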
delete_translation - Delete a translation unit from memory
Inputs:
- id (string): ID of the memory
- source (string): Source language code
- target (string): Target language code
- sentence (string): The source sentence
- translation (string): The translated sentence
- tuid (optional string): Translation Unit unique identifier
- sentence_before (optional string): Context sentence before
- sentence_after (optional string): Context sentence after
Returns: Removed translation details
import_tmx - Import a TMX file into a memory
Inputs:
- id (string): ID of the memory to update
- tmx (file path): The path of the TMX file to upload
- gzip (boolean): Indicates if the file is compressed (.gz)
Returns: Import details
check_import_status - Checks the status of a TMX file import
Inputs:
- id (string): The ID of the import job
Returns: Import details
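A hypothetical sketch of the import flow: call import_tmx with a local file path, then poll check_import_status with the import job ID returned by that call (all values below are placeholders).
import_tmx arguments:
{ "id": "mem_xyz123", "tmx": "/path/to/glossary.tmx", "gzip": false }
check_import_status arguments:
{ "id": "import_job_456" }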
🚀 Getting Started
📋 Requirements
- Lara Translate API Credentials
- To get them you can refer to the Official Documentation
- An LLM client that supports Model Context Protocol (MCP), such as Claude Desktop, Cursor, or GitHub Copilot
- NPX or Docker (depending on your preferred installation method)
🔌 Installation
Introduction
The installation process is standardized across all MCP clients. It involves manually adding a configuration object to your client's MCP configuration JSON file.
If you're unsure how to configure an MCP with your client, please refer to your MCP client's official documentation.
Lara Translate MCP supports multiple installation methods, including NPX and Docker.
Below, we'll use NPX as an example.
Installation & Configuration
Step 1: Open your client's MCP configuration JSON file with a text editor, then copy and paste the following snippet:
{
"mcpServers": {
"lara-translate": {
"command": "npx",
"args": [
"-y",
"@translated/lara-mcp@latest"
],
"env": {
"LARA_ACCESS_KEY_ID": "<YOUR_ACCESS_KEY_ID>",
"LARA_ACCESS_KEY_SECRET": "<YOUR_ACCESS_KEY_SECRET>"
}
}
}
}
Step 2: Replace <YOUR_ACCESS_KEY_ID> and <YOUR_ACCESS_KEY_SECRET> with your Lara Translate API credentials (refer to the Official Documentation for details).
Step 3: Restart your MCP client.
Verify Installation
After restarting your MCP client, you should see Lara Translate MCP in the list of available MCPs.
The method for viewing installed MCPs varies by client. Please consult your MCP client's documentation.
To verify that Lara Translate MCP is working correctly, try translating with a simple prompt:
Translate with Lara "Hello world" to Spanish
Your MCP client will begin generating a response. If Lara Translate MCP is properly installed and configured, your client will either request approval for the action or display a notification that Lara Translate is being used.
🧩 Installation Engines
Option 1: Using NPX
This option requires Node.js to be installed on your system.
- Add the following to your MCP configuration file:
{
"mcpServers": {
"lara-translate": {
"command": "npx",
"args": ["-y", "@translated/lara-mcp@latest"],
"env": {
"LARA_ACCESS_KEY_ID": "<YOUR_ACCESS_KEY_ID>",
"LARA_ACCESS_KEY_SECRET": "<YOUR_ACCESS_KEY_SECRET>"
}
}
}
}
- Replace <YOUR_ACCESS_KEY_ID> and <YOUR_ACCESS_KEY_SECRET> with your actual Lara API credentials.
Option 2: Using Docker
This option requires Docker to be installed on your system.
- Add the following to your MCP configuration file:
{
"mcpServers": {
"lara-translate": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"-e",
"LARA_ACCESS_KEY_ID",
"-e",
"LARA_ACCESS_KEY_SECRET",
"translatednet/lara-mcp:latest"
],
"env": {
"LARA_ACCESS_KEY_ID": "<YOUR_ACCESS_KEY_ID>",
"LARA_ACCESS_KEY_SECRET": "<YOUR_ACCESS_KEY_SECRET>"
}
}
}
}
- Replace <YOUR_ACCESS_KEY_ID> and <YOUR_ACCESS_KEY_SECRET> with your actual Lara API credentials.
Option 3: Building from Source
Using Node.js
- Clone the repository:
git clone https://github.com/translated/lara-mcp.git
cd lara-mcp
- Install dependencies and build (a local run sketch follows this list):
# Install dependencies
pnpm install
# Build
pnpm run build
- Add the following to your MCP configuration file:
{
"mcpServers": {
"lara-translate": {
"command": "node",
"args": ["<FULL_PATH_TO_PROJECT_FOLDER>/dist/index.js"],
"env": {
"LARA_ACCESS_KEY_ID": "<YOUR_ACCESS_KEY_ID>",
"LARA_ACCESS_KEY_SECRET": "<YOUR_ACCESS_KEY_SECRET>"
}
}
}
}
- Replace:
  - <FULL_PATH_TO_PROJECT_FOLDER> with the absolute path to your project folder
  - <YOUR_ACCESS_KEY_ID> and <YOUR_ACCESS_KEY_SECRET> with your actual Lara API credentials.
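As an optional sanity check before pointing a client at the build, you can start the server directly from a shell. This is a rough sketch assuming the stdio transport used by the configurations above; the server will simply wait for MCP messages on stdin.
# Provide credentials via environment variables (placeholder values)
export LARA_ACCESS_KEY_ID="<YOUR_ACCESS_KEY_ID>"
export LARA_ACCESS_KEY_SECRET="<YOUR_ACCESS_KEY_SECRET>"
# Start the built server; press Ctrl+C to stop it
node dist/index.js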
Building a Docker Image
- Clone the repository:
git clone https://github.com/translated/lara-mcp.git
cd lara-mcp
- Build the Docker image:
docker build -t lara-mcp .
- Add the following to your MCP configuration file:
{
"mcpServers": {
"lara-translate": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"-e",
"LARA_ACCESS_KEY_ID",
"-e",
"LARA_ACCESS_KEY_SECRET",
"lara-mcp"
],
"env": {
"LARA_ACCESS_KEY_ID": "<YOUR_ACCESS_KEY_ID>",
"LARA_ACCESS_KEY_SECRET": "<YOUR_ACCESS_KEY_SECRET>"
}
}
}
}
- Replace <YOUR_ACCESS_KEY_ID> and <YOUR_ACCESS_KEY_SECRET> with your actual credentials.
💻 Popular Clients that Support MCPs
For a complete list of MCP clients and their feature support, visit the official MCP clients page.
| Client | Description |
| ------ | ----------- |
| Claude Desktop | Desktop application for Claude AI |
| Aixplain | Production-ready AI Agents |
| Cursor | AI-first code editor |
| Cline for VS Code | VS Code extension for AI assistance |
| GitHub Copilot MCP | VS Code extension for GitHub Copilot MCP integration |
| Windsurf | AI-powered code editor and development environment |
🆘 Support
- For issues with Lara Translate API: Visit Lara Translate API and Integrations Support
- For issues with this MCP Server: Open an issue on GitHub
Server Configuration
{
"mcpServers": {
"lara-mcp": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"ghcr.io/metorial/mcp-container--translated--lara-mcp--lara-mcp",
"pnpm run start"
],
"env": {
"LARA_ACCESS_KEY_ID": "lara-access-key-id",
"LARA_ACCESS_KEY_SECRET": "lara-access-key-secret"
}
}
}
}