[](https://mseep.ai/app/yangkyeongmo-mcp-server-apache-airflow)

# mcp-server-apache-airflow

[](https://smithery.ai/server/@yangkyeongmo/mcp-server-apache-airflow)

A Model Context Protocol (MCP) server implementation for Apache Airflow, enabling seamless integration with MCP clients. This project provides a standardized way to interact with Apache Airflow through the Model Context Protocol.

<a href="https://glama.ai/mcp/servers/e99b6vx9lw">
  <img width="380" height="200" src="https://glama.ai/mcp/servers/e99b6vx9lw/badge" alt="Server for Apache Airflow MCP server" />
</a>

## About

This project implements a [Model Context Protocol](https://modelcontextprotocol.io/introduction) server that wraps Apache Airflow's REST API, allowing MCP clients to interact with Airflow in a standardized way. It uses the official Apache Airflow client library to ensure compatibility and maintainability.

## Feature Implementation Status

| Feature | API Path | Status |
| ------- | -------- | ------ |
| **DAG Management** | | |
| List DAGs | `/api/v1/dags` | ✅ |
| Get DAG Details | `/api/v1/dags/{dag_id}` | ✅ |
| Pause DAG | `/api/v1/dags/{dag_id}` | ✅ |
| Unpause DAG | `/api/v1/dags/{dag_id}` | ✅ |
| Update DAG | `/api/v1/dags/{dag_id}` | ✅ |
| Delete DAG | `/api/v1/dags/{dag_id}` | ✅ |
| Get DAG Source | `/api/v1/dagSources/{file_token}` | ✅ |
| Patch Multiple DAGs | `/api/v1/dags` | ✅ |
| Reparse DAG File | `/api/v1/dagSources/{file_token}/reparse` | ✅ |
| **DAG Runs** | | |
| List DAG Runs | `/api/v1/dags/{dag_id}/dagRuns` | ✅ |
| Create DAG Run | `/api/v1/dags/{dag_id}/dagRuns` | ✅ |
| Get DAG Run Details | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}` | ✅ |
| Update DAG Run | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}` | ✅ |
| Delete DAG Run | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}` | ✅ |
| Get DAG Runs Batch | `/api/v1/dags/~/dagRuns/list` | ✅ |
| Clear DAG Run | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/clear` | ✅ |
| Set DAG Run Note | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/setNote` | ✅ |
| Get Upstream Dataset Events | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/upstreamDatasetEvents` | ✅ |
| **Tasks** | | |
| List DAG Tasks | `/api/v1/dags/{dag_id}/tasks` | ✅ |
| Get Task Details | `/api/v1/dags/{dag_id}/tasks/{task_id}` | ✅ |
| Get Task Instance | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}` | ✅ |
| List Task Instances | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances` | ✅ |
| Update Task Instance | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}` | ✅ |
| Get Task Instance Log | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/logs/{task_try_number}` | ✅ |
| Clear Task Instances | `/api/v1/dags/{dag_id}/clearTaskInstances` | ✅ |
| Set Task Instances State | `/api/v1/dags/{dag_id}/updateTaskInstancesState` | ✅ |
| List Task Instance Tries | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/tries` | ✅ |
| **Variables** | | |
| List Variables | `/api/v1/variables` | ✅ |
| Create Variable | `/api/v1/variables` | ✅ |
| Get Variable | `/api/v1/variables/{variable_key}` | ✅ |
| Update Variable | `/api/v1/variables/{variable_key}` | ✅ |
| Delete Variable | `/api/v1/variables/{variable_key}` | ✅ |
| **Connections** | | |
| List Connections | `/api/v1/connections` | ✅ |
| Create Connection | `/api/v1/connections` | ✅ |
| Get Connection | `/api/v1/connections/{connection_id}` | ✅ |
| Update Connection | `/api/v1/connections/{connection_id}` | ✅ |
| Delete Connection | `/api/v1/connections/{connection_id}` | ✅ |
| Test Connection | `/api/v1/connections/test` | ✅ |
| **Pools** | | |
| List Pools | `/api/v1/pools` | ✅ |
| Create Pool | `/api/v1/pools` | ✅ |
| Get Pool | `/api/v1/pools/{pool_name}` | ✅ |
| Update Pool | `/api/v1/pools/{pool_name}` | ✅ |
| Delete Pool | `/api/v1/pools/{pool_name}` | ✅ |
| **XComs** | | |
| List XComs | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries` | ✅ |
| Get XCom Entry | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries/{xcom_key}` | ✅ |
| **Datasets** | | |
| List Datasets | `/api/v1/datasets` | ✅ |
| Get Dataset | `/api/v1/datasets/{uri}` | ✅ |
| Get Dataset Events | `/api/v1/datasetEvents` | ✅ |
| Create Dataset Event | `/api/v1/datasetEvents` | ✅ |
| Get DAG Dataset Queued Event | `/api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents/{uri}` | ✅ |
| Get DAG Dataset Queued Events | `/api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents` | ✅ |
| Delete DAG Dataset Queued Event | `/api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents/{uri}` | ✅ |
| Delete DAG Dataset Queued Events | `/api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents` | ✅ |
| Get Dataset Queued Events | `/api/v1/datasets/{uri}/dagRuns/queued/datasetEvents` | ✅ |
| Delete Dataset Queued Events | `/api/v1/datasets/{uri}/dagRuns/queued/datasetEvents` | ✅ |
| **Monitoring** | | |
| Get Health | `/api/v1/health` | ✅ |
| **DAG Stats** | | |
| Get DAG Stats | `/api/v1/dags/statistics` | ✅ |
| **Config** | | |
| Get Config | `/api/v1/config` | ✅ |
| **Plugins** | | |
| Get Plugins | `/api/v1/plugins` | ✅ |
| **Providers** | | |
| List Providers | `/api/v1/providers` | ✅ |
| **Event Logs** | | |
| List Event Logs | `/api/v1/eventLogs` | ✅ |
| Get Event Log | `/api/v1/eventLogs/{event_log_id}` | ✅ |
| **System** | | |
| Get Import Errors | `/api/v1/importErrors` | ✅ |
| Get Import Error Details | `/api/v1/importErrors/{import_error_id}` | ✅ |
| Get Health Status | `/api/v1/health` | ✅ |
| Get Version | `/api/v1/version` | ✅ |

## Setup

### Dependencies

This project depends on the official Apache Airflow client library (`apache-airflow-client`).
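Each tool listed in the feature table ultimately maps to one of these REST endpoints. As a rough, stdlib-only sketch of what a single wrapped call amounts to (the host and credentials below are placeholder assumptions; the server itself goes through `apache-airflow-client` rather than raw HTTP):

```python
import base64
import json
import urllib.request

# Placeholder values for illustration only; substitute your deployment's settings.
AIRFLOW_HOST = "http://localhost:8080"
AIRFLOW_USERNAME = "admin"
AIRFLOW_PASSWORD = "admin"


def build_request(path: str) -> urllib.request.Request:
    """Build an authenticated GET request against the Airflow v1 REST API."""
    credentials = f"{AIRFLOW_USERNAME}:{AIRFLOW_PASSWORD}".encode()
    token = base64.b64encode(credentials).decode()
    return urllib.request.Request(
        f"{AIRFLOW_HOST}/api/v1{path}",
        headers={"Authorization": f"Basic {token}"},
    )


# Example: list DAGs (requires a running Airflow instance, so it is left commented out):
# with urllib.request.urlopen(build_request("/dags")) as response:
#     print(json.load(response)["dags"])
```

The MCP server performs the equivalent of this through `apache-airflow-client`.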
It will be automatically installed when you install this package.

### Environment Variables

Set the following environment variables:

```
AIRFLOW_HOST=<your-airflow-host>  # Optional, defaults to http://localhost:8080
AIRFLOW_API_VERSION=v1            # Optional, defaults to v1
READ_ONLY=true                    # Optional, enables read-only mode (true/false, defaults to false)
```

#### Authentication

Choose one of the following authentication methods:

**Basic Authentication (default):**

```
AIRFLOW_USERNAME=<your-airflow-username>
AIRFLOW_PASSWORD=<your-airflow-password>
```

**JWT Token Authentication:**

```
AIRFLOW_JWT_TOKEN=<your-jwt-token>
```

To obtain a JWT token, you can use Airflow's authentication endpoint:

```bash
ENDPOINT_URL="http://localhost:8080" # Replace with your Airflow endpoint
curl -X 'POST' \
  "${ENDPOINT_URL}/auth/token" \
  -H 'Content-Type: application/json' \
  -d '{ "username": "<your-username>", "password": "<your-password>" }'
```

> **Note**: If both a JWT token and basic authentication credentials are provided, the JWT token takes precedence.

### Usage with Claude Desktop

Add to your `claude_desktop_config.json`:

**Basic Authentication:**

```json
{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uvx",
      "args": ["mcp-server-apache-airflow"],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_USERNAME": "your-username",
        "AIRFLOW_PASSWORD": "your-password"
      }
    }
  }
}
```

**JWT Token Authentication:**

```json
{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uvx",
      "args": ["mcp-server-apache-airflow"],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_JWT_TOKEN": "your-jwt-token"
      }
    }
  }
}
```

For read-only mode (recommended for safety):

**Basic Authentication:**

```json
{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uvx",
      "args": ["mcp-server-apache-airflow"],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_USERNAME": "your-username",
        "AIRFLOW_PASSWORD": "your-password",
        "READ_ONLY": "true"
      }
    }
  }
}
```

**JWT Token Authentication:**

```json
{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uvx",
      "args": ["mcp-server-apache-airflow", "--read-only"],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_JWT_TOKEN": "your-jwt-token"
      }
    }
  }
}
```

Alternative configuration using `uv`:

**Basic Authentication:**

```json
{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/mcp-server-apache-airflow",
        "run",
        "mcp-server-apache-airflow"
      ],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_USERNAME": "your-username",
        "AIRFLOW_PASSWORD": "your-password"
      }
    }
  }
}
```

**JWT Token Authentication:**

```json
{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/mcp-server-apache-airflow",
        "run",
        "mcp-server-apache-airflow"
      ],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_JWT_TOKEN": "your-jwt-token"
      }
    }
  }
}
```

Replace `/path/to/mcp-server-apache-airflow` with the actual path where you've cloned the repository.

### Selecting the API groups

You can select the API groups you want to use by setting the `--apis` flag:

```bash
uv run mcp-server-apache-airflow --apis dag --apis dagrun
```

The default is to use all APIs.

Allowed values are:

- config
- connections
- dag
- dagrun
- dagstats
- dataset
- eventlog
- importerror
- monitoring
- plugin
- pool
- provider
- taskinstance
- variable
- xcom

### Read-Only Mode

You can run the server in read-only mode by using the `--read-only` flag or by setting the `READ_ONLY=true` environment variable. This exposes only tools that perform read operations (GET requests) and excludes any tools that create, update, or delete resources.

Using the command-line flag:

```bash
uv run mcp-server-apache-airflow --read-only
```

Using the environment variable:

```bash
READ_ONLY=true uv run mcp-server-apache-airflow
```

In read-only mode, the server only exposes tools such as:

- Listing DAGs, DAG runs, tasks, variables, connections, etc.
- Getting details of specific resources
- Reading configurations and monitoring information
- Testing connections (non-destructive)

Write operations, such as creating, updating, or deleting DAGs, variables, and connections, or triggering DAG runs, are not available in read-only mode.

You can combine read-only mode with API group selection:

```bash
uv run mcp-server-apache-airflow --read-only --apis dag --apis variable
```

### Manual Execution

You can also run the server manually:

```bash
make run
```

`make run` accepts the following options:

- `--port`: Port to listen on for SSE (default: 8000)
- `--transport`: Transport type (stdio/sse/http, default: stdio)

Or you can run the SSE server directly, which accepts the same parameters:

```bash
make run-sse
```

You can also start the service directly using `uv`:

```bash
uv run src --transport http --port 8080
```

### Installing via Smithery

To install Apache Airflow MCP Server for Claude Desktop automatically via [Smithery](https://smithery.ai/server/@yangkyeongmo/mcp-server-apache-airflow):

```bash
npx -y @smithery/cli install @yangkyeongmo/mcp-server-apache-airflow --client claude
```

## Development

### Setting up Development Environment

1. Clone the repository:

```bash
git clone https://github.com/yangkyeongmo/mcp-server-apache-airflow.git
cd mcp-server-apache-airflow
```

2. Install development dependencies:

```bash
uv sync --dev
```

3. Create a `.env` file for environment variables (optional for development):

```bash
touch .env
```

> **Note**: No environment variables are required for running tests. The `AIRFLOW_HOST` defaults to `http://localhost:8080` for development and testing purposes.

### Running Tests

The project uses pytest for testing, with the following command available:

```bash
# Run all tests
make test
```

### Code Quality

```bash
# Run linting
make lint

# Run code formatting
make format
```

### Continuous Integration

The project includes a GitHub Actions workflow (`.github/workflows/test.yml`) that automatically:

- Runs tests on Python 3.10, 3.11, and 3.12
- Executes linting checks using ruff
- Runs on every push and pull request to the `main` branch

The CI pipeline ensures code quality and compatibility across supported Python versions before any changes are merged.

## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

The package is deployed automatically to PyPI when `project.version` is updated in `pyproject.toml`.
Follow semver for versioning.

Please include a version update in your PR so that changes to core logic are released.

## License

[MIT License](LICENSE)