# locust-mcp-server

A Model Context Protocol (MCP) server implementation for running Locust load tests. This server enables seamless integration of Locust load testing capabilities with AI-powered development environments.

## Features

- Simple integration with the Model Context Protocol framework
- Support for headless and UI modes
- Configurable test parameters (users, spawn rate, runtime)
- Easy-to-use API for running Locust load tests
- Real-time test execution output
- HTTP/HTTPS protocol support out of the box
- Custom task scenarios support

## Prerequisites

Before you begin, ensure you have the following installed:

- Python 3.13 or higher
- uv package manager ([Installation guide](https://github.com/astral-sh/uv))

## Installation

1. Clone the repository:

```bash
git clone https://github.com/qainsights/locust-mcp-server.git
```

2. Install the required dependencies:

```bash
uv pip install -r requirements.txt
```

3. Set up environment variables (optional):
   Create a `.env` file in the project root:

```bash
LOCUST_HOST=http://localhost:8089  # Default host for your tests
LOCUST_USERS=3                     # Default number of users
LOCUST_SPAWN_RATE=1                # Default user spawn rate
LOCUST_RUN_TIME=10s                # Default test duration
```

## Getting Started

1. Create a Locust test script (e.g., `hello.py`):

```python
import time

from locust import HttpUser, task, between

class QuickstartUser(HttpUser):
    wait_time = between(1, 5)

    @task
    def hello_world(self):
        self.client.get("/hello")
        self.client.get("/world")

    @task(3)
    def view_items(self):
        for item_id in range(10):
            self.client.get(f"/item?id={item_id}", name="/item")
            time.sleep(1)

    def on_start(self):
        self.client.post("/login", json={"username": "foo", "password": "bar"})
```

2. Configure the MCP server using the specs below in your favorite MCP client (Claude Desktop, Cursor, Windsurf, and more):

```json
{
  "mcpServers": {
    "locust": {
      "command": "/Users/naveenkumar/.local/bin/uv",
      "args": [
        "--directory",
        "/Users/naveenkumar/Gits/locust-mcp-server",
        "run",
        "locust_server.py"
      ]
    }
  }
}
```

3. Now ask the LLM to run the test, e.g. `run locust test for hello.py`. The Locust MCP server will use the following tool to start the test:

- `run_locust`: Run a test with configurable options for headless mode, host, runtime, users, and spawn rate

## API Reference

### Run Locust Test

```python
run_locust(
    test_file: str,
    headless: bool = True,
    host: str = "http://localhost:8089",
    runtime: str = "10s",
    users: int = 3,
    spawn_rate: int = 1
)
```

Parameters:

- `test_file`: Path to your Locust test script
- `headless`: Run in headless mode (True) or with the web UI (False)
- `host`: Target host to load test
- `runtime`: Test duration (e.g., "30s", "1m", "5m")
- `users`: Number of concurrent users to simulate
- `spawn_rate`: Rate at which users are spawned

## Use Cases

- LLM-powered analysis of load test results
- Effective debugging of performance issues with the help of an LLM

## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

## License

This project is licensed under the MIT License - see the LICENSE file for details.
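The optional `.env` defaults shown in the Installation section can be resolved with plain environment lookups. A minimal sketch, assuming a `.env` loader (such as python-dotenv) has already populated the environment; the helper name `get_setting` is illustrative and not part of this server's code:

```python
import os


def get_setting(name: str, fallback: str) -> str:
    """Return an environment override (e.g. loaded from a .env file) or the fallback."""
    return os.environ.get(name, fallback)


# Defaults mirroring the .env example in the Installation section (illustrative only)
host = get_setting("LOCUST_HOST", "http://localhost:8089")
users = int(get_setting("LOCUST_USERS", "3"))
spawn_rate = int(get_setting("LOCUST_SPAWN_RATE", "1"))
runtime = get_setting("LOCUST_RUN_TIME", "10s")
```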
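The `run_locust` parameters in the API Reference map closely onto Locust's own command-line flags (`-f`, `--host`, `--users`, `--spawn-rate`, `--headless`, `--run-time`). A hedged sketch of that mapping; `build_locust_command` is a hypothetical helper for illustration, not the server's actual implementation:

```python
def build_locust_command(
    test_file: str,
    headless: bool = True,
    host: str = "http://localhost:8089",
    runtime: str = "10s",
    users: int = 3,
    spawn_rate: int = 1,
) -> list[str]:
    """Translate run_locust-style parameters into a Locust CLI invocation."""
    cmd = [
        "locust", "-f", test_file,
        "--host", host,
        "--users", str(users),
        "--spawn-rate", str(spawn_rate),
    ]
    if headless:
        # --run-time only takes effect without the web UI
        cmd += ["--headless", "--run-time", runtime]
    return cmd
```

For example, `build_locust_command("hello.py")` yields the same invocation you would type by hand to run the Getting Started script headless for 10 seconds with 3 users.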