Prometheus MCP Server
A Model Context Protocol (MCP) server that allows AI assistants to request and examine Prometheus metrics using standardized interfaces.
Overview
What is Prometheus MCP Server?
The Prometheus MCP Server is a Model Context Protocol (MCP) server designed to help AI assistants query and analyze Prometheus metrics through standardized interfaces. It acts as a bridge between AI systems and the data exposed by Prometheus, a powerful monitoring and alerting toolkit commonly used in cloud-native environments.
Features of Prometheus MCP Server
- Standardized Interfaces: The server offers a set of standardized APIs that allow AI assistants to easily access and manipulate Prometheus metrics.
- Real-time Data Access: Users can query metrics in real-time, providing immediate insights and facilitating decision-making.
- Integration Friendly: Built to integrate seamlessly with existing AI systems and Prometheus setups, enhancing overall functionality without needing extensive changes.
- Open Source: As a public repository, it encourages community contributions and improvements, promoting a collaborative development environment.
- Scalability: The architecture supports scaling, making it suitable for both small and large deployments.
How to Use Prometheus MCP Server
- Installation: Clone the repository from GitHub and follow the installation instructions provided in the documentation.
- Configuration: Set up the server by configuring the necessary parameters to connect to your Prometheus instance.
- API Access: Use the provided APIs to send queries from your AI assistant to the Prometheus MCP Server.
- Data Analysis: Analyze the returned metrics and incorporate them into your AI workflows for improved decision-making (a small parsing sketch follows this list).
- Community Support: Engage with the community for support, feature requests, and contributions to the project.
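As a rough illustration of the Data Analysis step, the sketch below flattens the JSON that Prometheus returns for an instant query into simple (labels, value) pairs that downstream code or an assistant could reason over. The response shape (`status`, `data.resultType`, `data.result`) is standard Prometheus HTTP API output; the sample payload itself is invented for illustration.

```python
# Minimal sketch: flatten a Prometheus instant-query response into
# (metric labels, value) pairs. The sample payload is invented; the
# field names match the standard Prometheus HTTP API response format.
sample_response = {
    "status": "success",
    "data": {
        "resultType": "vector",
        "result": [
            {
                "metric": {"__name__": "up", "job": "prometheus", "instance": "localhost:9090"},
                "value": [1715000000.0, "1"],
            }
        ],
    },
}


def flatten_instant_result(response: dict) -> list[tuple[dict, float]]:
    """Return (labels, value) pairs from an instant-query response."""
    if response.get("status") != "success":
        raise ValueError(f"query failed: {response}")
    return [
        (series["metric"], float(series["value"][1]))
        for series in response["data"]["result"]
    ]


for labels, value in flatten_instant_result(sample_response):
    print(labels.get("__name__", "?"), labels.get("instance"), value)
```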
Frequently Asked Questions
What is the purpose of the Prometheus MCP Server?
The main purpose of the Prometheus MCP Server is to enable AI assistants to query and analyze Prometheus metrics through standardized interfaces, enhancing AI capabilities in data-driven environments.
Is the Prometheus MCP Server open source?
Yes, the Prometheus MCP Server is an open-source project, allowing users to contribute to its development and enhancement.
How can I contribute to the Prometheus MCP Server?
You can contribute by forking the repository, making improvements, and submitting pull requests. Additionally, you can report issues or suggest features through the GitHub issues page.
What are the system requirements for running the Prometheus MCP Server?
The server requires a compatible environment with access to a Prometheus instance. Specific requirements can be found in the documentation within the repository.
Can I use the Prometheus MCP Server with other monitoring tools?
While it is primarily designed for Prometheus, the architecture allows for potential integration with other monitoring tools, depending on the specific use case and requirements.
Details
Prometheus MCP Server
A Model Context Protocol (MCP) server for Prometheus.
This provides access to your Prometheus metrics and queries through standardized MCP interfaces, allowing AI assistants to execute PromQL queries and analyze your metrics data.
<a href="https://glama.ai/mcp/servers/@pab1it0/prometheus-mcp-server"><img width="380" height="200" src="https://glama.ai/mcp/servers/@pab1it0/prometheus-mcp-server/badge" alt="Prometheus Server MCP server" /></a>

Features
- Execute PromQL queries against Prometheus
- Discover and explore metrics
  - List available metrics
  - Get metadata for specific metrics
  - View instant query results
  - View range query results with different step intervals
- Authentication support
  - Basic auth from environment variables
  - Bearer token auth from environment variables
- Docker containerization support
- Provide interactive tools for AI assistants
The list of tools is configurable, so you can choose which tools you want to make available to the MCP client. This is useful if you don't use certain functionality or if you don't want to take up too much of the context window.
Usage
- Ensure your Prometheus server is accessible from the environment where you'll run this MCP server.
- Configure the environment variables for your Prometheus server, either through a `.env` file or system environment variables:

```env
# Required: Prometheus configuration
PROMETHEUS_URL=http://your-prometheus-server:9090

# Optional: Authentication credentials (if needed)
# Choose one of the following authentication methods if required:

# For basic auth
PROMETHEUS_USERNAME=your_username
PROMETHEUS_PASSWORD=your_password

# For bearer token auth
PROMETHEUS_TOKEN=your_token

# Optional: For multi-tenant setups like Cortex, Mimir or Thanos
ORG_ID=your_organization_id
```
- Add the server configuration to your client configuration file. For example, for Claude Desktop:
```json
{
  "mcpServers": {
    "prometheus": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "PROMETHEUS_URL",
        "ghcr.io/pab1it0/prometheus-mcp-server:latest"
      ],
      "env": {
        "PROMETHEUS_URL": "<url>"
      }
    }
  }
}
```
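With this configuration, the client launches the container itself: `-i` keeps stdin open because MCP servers started via a `command` talk to the client over stdio, and `--rm` removes the container when the session ends. Replace `<url>` with your Prometheus address as reachable from inside the container (`localhost` inside a container does not point at the host machine).

If the assistant cannot retrieve metrics, it helps to rule out Prometheus-side problems first. The optional sketch below (not part of this project) reads the same environment variables as the `.env` example above and runs a trivial `up` query against Prometheus's standard `/api/v1/query` endpoint. It assumes the third-party `requests` package, and the `X-Scope-OrgID` header used for `ORG_ID` is a common Cortex/Mimir convention rather than something this README specifies.

```python
# Optional sanity check (not part of this project): verify that the
# configured Prometheus URL and credentials actually work.
import os

import requests  # third-party: pip install requests

url = os.environ["PROMETHEUS_URL"].rstrip("/") + "/api/v1/query"
auth = None
headers = {}

# Mirror the auth options from the .env example above.
if os.environ.get("PROMETHEUS_USERNAME") and os.environ.get("PROMETHEUS_PASSWORD"):
    auth = (os.environ["PROMETHEUS_USERNAME"], os.environ["PROMETHEUS_PASSWORD"])
elif os.environ.get("PROMETHEUS_TOKEN"):
    headers["Authorization"] = f"Bearer {os.environ['PROMETHEUS_TOKEN']}"
if os.environ.get("ORG_ID"):
    # Common multi-tenancy header for Cortex/Mimir/Thanos; an assumption here.
    headers["X-Scope-OrgID"] = os.environ["ORG_ID"]

resp = requests.get(url, params={"query": "up"}, auth=auth, headers=headers, timeout=10)
resp.raise_for_status()
body = resp.json()
print(body["status"], "-", len(body["data"]["result"]), "series returned")
```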
Development
Contributions are welcome! Please open an issue or submit a pull request if you have any suggestions or improvements.
This project uses `uv` to manage dependencies. Install `uv` following the instructions for your platform:

```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```

You can then create a virtual environment and install the dependencies with:

```bash
uv venv
source .venv/bin/activate  # On Unix/macOS
.venv\Scripts\activate     # On Windows
uv pip install -e .
```
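After the editable install, export the `PROMETHEUS_URL` (and any auth) variables described above and start the server's entry point to test locally. The entry-point name isn't stated in this section; the container configuration at the end of this document invokes a `prometheus-mcp-server` command, which suggests the package installs a console script of that name, but treat that as an assumption and check `pyproject.toml`.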
Project Structure
The project has been organized with a `src` directory structure:

```
prometheus-mcp-server/
├── src/
│   └── prometheus_mcp_server/
│       ├── __init__.py      # Package initialization
│       ├── server.py        # MCP server implementation
│       └── main.py          # Main application logic
├── Dockerfile               # Docker configuration
├── docker-compose.yml       # Docker Compose configuration
├── .dockerignore            # Docker ignore file
├── pyproject.toml           # Project configuration
└── README.md                # This file
```
Testing
The project includes a comprehensive test suite that ensures functionality and helps prevent regressions.
Run the tests with pytest:

```bash
# Install development dependencies
uv pip install -e ".[dev]"

# Run the tests
pytest

# Run with coverage report
pytest --cov=src --cov-report=term-missing
```
Tests are organized into:
- Configuration validation tests
- Server functionality tests
- Error handling tests
- Main application tests
When adding new features, please also add corresponding tests.
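As a purely hypothetical sketch of the expected shape of such a test, the snippet below defines a stand-in helper inline and checks both the happy path and an error case. A real test would instead import and exercise the corresponding code from `src/prometheus_mcp_server/`.

```python
# Hypothetical test sketch: build_range_params is a stand-in defined here
# for illustration only; real tests would import from prometheus_mcp_server.
import pytest


def build_range_params(query: str, start: str, end: str, step: str) -> dict:
    """Stand-in for a helper that assembles range-query parameters."""
    if not query:
        raise ValueError("query must not be empty")
    return {"query": query, "start": start, "end": end, "step": step}


def test_range_params_include_step():
    params = build_range_params("up", "2024-01-01T00:00:00Z", "2024-01-01T01:00:00Z", "60s")
    assert params["step"] == "60s"


def test_empty_query_is_rejected():
    with pytest.raises(ValueError):
        build_range_params("", "0", "1", "15s")
```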
Tools
| Tool | Category | Description |
| --- | --- | --- |
| `execute_query` | Query | Execute a PromQL instant query against Prometheus |
| `execute_range_query` | Query | Execute a PromQL range query with start time, end time, and step interval |
| `list_metrics` | Discovery | List all available metrics in Prometheus |
| `get_metric_metadata` | Discovery | Get metadata for a specific metric |
| `get_targets` | Discovery | Get information about all scrape targets |
License
MIT
Server Config
```json
{
  "mcpServers": {
    "prometheus-mcp-server": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "ghcr.io/metorial/mcp-container--pab1it0--prometheus-mcp-server--prometheus-mcp-server",
        "prometheus-mcp-server"
      ],
      "env": {
        "PROMETHEUS_URL": "prometheus-url",
        "PROMETHEUS_USERNAME": "prometheus-username",
        "PROMETHEUS_PASSWORD": "prometheus-password"
      }
    }
  }
}
```