🚀 ⚡️ Locust MCP Server
A Model Context Protocol (MCP) server implementation for running Locust load tests. This server enables seamless integration of Locust load testing capabilities with AI-powered development environments.
Overview
What is Locust MCP Server?
The Locust MCP Server is a specialized implementation designed to facilitate load testing with Locust, a popular open-source load testing tool. It lets users run load tests efficiently by integrating Locust's capabilities with AI-powered development environments, and it provides a robust framework for simulating user traffic and measuring system performance under various conditions, making it a valuable tool for developers and testers who need to verify that their applications can handle expected loads.
Features of Locust MCP Server
- Seamless Integration: The Locust MCP Server integrates smoothly with existing development environments, allowing for easy setup and execution of load tests.
- AI-Powered Capabilities: Leverage AI technologies to enhance load testing strategies, providing insights and optimizations that traditional methods may overlook.
- Scalability: The server is designed to handle a large number of concurrent users, making it suitable for testing applications of all sizes.
- Real-Time Monitoring: Users can monitor the performance of their applications in real-time during load tests, enabling immediate feedback and adjustments.
- Customizable Scenarios: Create tailored load testing scenarios that mimic real-world user behavior, ensuring that tests are relevant and effective.
How to Use Locust MCP Server
- Installation: Begin by installing the Locust MCP Server on your machine or server. Follow the installation instructions provided in the documentation.
- Configuration: Configure the server settings to match your testing requirements. This includes setting up the number of users, test duration, and any specific scenarios you wish to simulate.
- Run Tests: Start the load tests through the server interface. Monitor the progress and results in real-time to gain insights into your application's performance.
- Analyze Results: After the tests are complete, analyze the results to identify bottlenecks, performance issues, and areas for improvement.
- Iterate: Based on the findings, make necessary adjustments to your application and repeat the testing process to ensure optimal performance.
Frequently Asked Questions
Q: What is the primary purpose of the Locust MCP Server?
A: The primary purpose is to facilitate load testing for applications, allowing developers to simulate user traffic and assess performance under various conditions.
Q: Can I integrate Locust MCP Server with other tools?
A: Yes, the Locust MCP Server is designed to integrate with various development and testing tools, enhancing its functionality and usability.
Q: Is there a cost associated with using Locust MCP Server?
A: The Locust MCP Server is open-source and free to use, making it accessible for developers and organizations of all sizes.
Q: How can I contribute to the Locust MCP Server project?
A: Contributions are welcome! You can participate by reporting issues, suggesting features, or submitting code improvements through the project's repository.
Q: Where can I find more information about Locust MCP Server?
A: For more detailed information, documentation, and support, visit the official website at qainsights.com.
Details
✨ Features
- Simple integration with Model Context Protocol framework
- Support for headless and UI modes
- Configurable test parameters (users, spawn rate, runtime)
- Easy-to-use API for running Locust load tests
- Real-time test execution output
- HTTP/HTTPS protocol support out of the box
- Custom task scenarios support
🔧 Prerequisites
Before you begin, ensure you have the following installed:
- Python 3.13 or higher
- uv package manager (Installation guide)
📦 Installation
- Clone the repository:
git clone https://github.com/qainsights/locust-mcp-server.git
- Install the required dependencies:
uv pip install -r requirements.txt
- Set up environment variables (optional):
Create a .env file in the project root:
LOCUST_HOST=http://localhost:8089 # Default host for your tests
LOCUST_USERS=3 # Default number of users
LOCUST_SPAWN_RATE=1 # Default user spawn rate
LOCUST_RUN_TIME=10s # Default test duration
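To illustrate how those optional defaults could be consumed, here is a minimal sketch of a .env-style parser. This helper (`load_env_defaults`) is hypothetical, written only to show the KEY=VALUE format the file uses; it is not the server's actual loading code, which may rely on a library such as python-dotenv.

```python
# Hypothetical sketch: parse simple KEY=VALUE lines like the .env file
# above. Not the server's actual implementation.
def load_env_defaults(text):
    """Return a dict of settings, skipping blanks and '#' comments."""
    settings = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop inline comments
        if not line or "=" not in line:
            continue
        key, value = line.split("=", 1)
        settings[key.strip()] = value.strip()
    return settings

env_text = """
LOCUST_HOST=http://localhost:8089  # Default host for your tests
LOCUST_USERS=3
LOCUST_SPAWN_RATE=1
LOCUST_RUN_TIME=10s
"""
defaults = load_env_defaults(env_text)
print(defaults["LOCUST_HOST"])  # http://localhost:8089
```

Note that values are kept as strings; whatever reads them is responsible for converting LOCUST_USERS and LOCUST_SPAWN_RATE to integers.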
🚀 Getting Started
- Create a Locust test script (e.g., hello.py):
import time

from locust import HttpUser, task, between


class QuickstartUser(HttpUser):
    wait_time = between(1, 5)

    @task
    def hello_world(self):
        self.client.get("/hello")
        self.client.get("/world")

    @task(3)
    def view_items(self):
        for item_id in range(10):
            self.client.get(f"/item?id={item_id}", name="/item")
            time.sleep(1)

    def on_start(self):
        self.client.post("/login", json={"username": "foo", "password": "bar"})
- Configure the MCP server in your favorite MCP client (Claude Desktop, Cursor, Windsurf, and more) using the spec below:
{
  "mcpServers": {
    "locust": {
      "command": "/Users/naveenkumar/.local/bin/uv",
      "args": [
        "--directory",
        "/Users/naveenkumar/Gits/locust-mcp-server",
        "run",
        "locust_server.py"
      ]
    }
  }
}
- Now ask the LLM to run the test, e.g. "run locust test for hello.py". The Locust MCP server will use the following tool to start the test:
- run_locust: Run a test with configurable options for headless mode, host, runtime, users, and spawn rate
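As a quick sanity check, the client spec above is ordinary JSON and can be validated before pointing a client at it. The snippet below is not part of the server; it simply parses the config and confirms the fields MCP clients expect (command and args) are present.

```python
import json

# Sanity-check the MCP client config shown above (not part of the server).
config_text = """
{
  "mcpServers": {
    "locust": {
      "command": "/Users/naveenkumar/.local/bin/uv",
      "args": [
        "--directory",
        "/Users/naveenkumar/Gits/locust-mcp-server",
        "run",
        "locust_server.py"
      ]
    }
  }
}
"""
config = json.loads(config_text)
server = config["mcpServers"]["locust"]
print(server["command"])   # the uv binary the client will launch
print(server["args"][-1])  # the server entry point passed to uv run
```

Remember to replace the absolute paths with the locations of uv and the cloned repository on your own machine.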
📝 API Reference
Run Locust Test
run_locust(
    test_file: str,
    headless: bool = True,
    host: str = "http://localhost:8089",
    runtime: str = "10s",
    users: int = 3,
    spawn_rate: int = 1
)
Parameters:
- test_file: Path to your Locust test script
- headless: Run in headless mode (True) or with UI (False)
- host: Target host to load test
- runtime: Test duration (e.g., "30s", "1m", "5m")
- users: Number of concurrent users to simulate
- spawn_rate: Rate at which users are spawned
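To make the parameter semantics concrete, here is a sketch of how these options map onto Locust's documented command-line flags (-f, --host, --users, --spawn-rate, --headless, --run-time). The `build_locust_command` helper is hypothetical, written for illustration; the server's actual invocation logic may differ.

```python
# Hypothetical sketch: map run_locust's parameters onto Locust's
# documented CLI flags. Not the server's actual implementation.
def build_locust_command(test_file, headless=True,
                         host="http://localhost:8089",
                         runtime="10s", users=3, spawn_rate=1):
    cmd = ["locust", "-f", test_file, "--host", host,
           "--users", str(users), "--spawn-rate", str(spawn_rate)]
    if headless:
        # Headless runs need an explicit duration so they terminate.
        cmd += ["--headless", "--run-time", runtime]
    return cmd

print(build_locust_command("hello.py", users=10, spawn_rate=2,
                           runtime="30s", host="https://example.com"))
```

With headless=False, the --headless and --run-time flags are omitted and Locust serves its web UI instead, leaving you to start and stop the test interactively.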
✨ Use Cases
- LLM-powered results analysis
- Effective debugging with the help of an LLM
🤝 Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
Server Config
{
  "mcpServers": {
    "locust-mcp-server": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "ghcr.io/metorial/mcp-container--qainsights--locust-mcp-server--locust-mcp-server",
        "python main.py"
      ],
      "env": {}
    }
  }
}