An MCP server that executes SQL via ConnectorX and streams the result to CSV or Parquet in
PyArrow RecordBatch chunks.
Key features:

- Output formats: `csv` or `parquet`
- Results are streamed in PyArrow RecordBatch chunks
- Return value: the string `"OK"` on success, or a message beginning `"Error: "` on failure
- Optional CSV token counting via tiktoken (`o200k_base`) with a warning threshold

ConnectorX supports many databases; common examples include SQLite, PostgreSQL, and BigQuery. For the complete and up-to-date list of supported databases and connection-token (`conn`) formats, see the official ConnectorX docs.
To run the server directly:

```shell
uvx run-sql-connectorx \
  --conn "" \
  --csv-token-threshold 500000
```

The value passed to `--conn` is the connection token (`conn`) used by ConnectorX to identify the target database: SQLite, PostgreSQL, BigQuery, and more.
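The `conn` token format depends on the database driver. A couple of illustrative examples (treat the exact scheme strings as assumptions and check the ConnectorX docs for your database):

```
sqlite:///path/to/data.db
postgresql://username:password@host:5432/dbname
```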
Command-line options:

- `--conn` (required): ConnectorX connection token (`conn`)
- `--csv-token-threshold` (default 0): when > 0, enables CSV per-line token counting using tiktoken (`o200k_base`); the value is a warning threshold

To launch the server from an MCP-aware client such as Cursor, add the following snippet to `.cursor/mcp.json` at the project root:
```json
{
  "mcpServers": {
    "run-sql-connectorx": {
      "command": "uvx",
      "args": [
        "--from", "git+https://github.com/gigamori/mcp-run-sql-connectorx",
        "run-sql-connectorx",
        "--conn", ""
      ]
    }
  }
}
```
The default `batch_size` is 100,000 rows.

CSV token counting (when `--csv-token-threshold` > 0):

- Tokens are counted over exactly what `csv.writer` writes (including the header row when present, delimiters, quotes, and newlines), encoded as UTF-8
- Counting uses tiktoken (`o200k_base`) per written CSV line

The tool returns a single text message:

- With `--csv-token-threshold` = 0: `OK`
- With `--csv-token-threshold` > 0: `OK N tokens`, or `OK N tokens. Too many tokens may impair processing. Handle appropriately` when N >= threshold
- An empty result returns `OK 0 tokens`
- On failure: a message beginning `Error: ` (any partially written output file is deleted)

The server exposes a single MCP tool, `run_sql`.
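A minimal sketch of the per-line counting described above, with the tokenizer abstracted behind a `count_tokens` callable. This is an assumed outline, not the server's actual code; the function and parameter names here are hypothetical.

```python
import csv
import io

def write_csv_with_token_count(header, rows, path, count_tokens):
    """Write header + rows as UTF-8 CSV and return the total token count.

    Each line is rendered with csv.writer first so that delimiters,
    quoting, and the line terminator are all included in the count.
    """
    total = 0
    with open(path, "w", newline="", encoding="utf-8") as f:
        for record in [header, *rows]:
            buf = io.StringIO()
            csv.writer(buf).writerow(record)
            line = buf.getvalue()  # exact text csv.writer produced
            f.write(line)
            total += count_tokens(line)
    return total
```

With tiktoken installed, `count_tokens` could be `lambda s: len(tiktoken.get_encoding("o200k_base").encode(s))`, matching the encoding named above.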
| Argument | Type | Required | Description |
|---|---|---|---|
| `sql_file` | string | yes | Path to a file that contains the SQL text to execute |
| `output_path` | string | yes | Destination file for the query result |
| `output_format` | enum | yes | One of `"csv"` or `"parquet"` |
| `batch_size` | int | no | RecordBatch size (default 100000) |
Example call:

```json
{
  "tool": "run_sql",
  "arguments": {
    "sql_file": "sql/queries/sales.sql",
    "output_path": "output/sales.parquet",
    "output_format": "parquet",
    "batch_size": 200000
  }
}
```
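The failure semantics above (partial output deleted, a message beginning `Error: ` returned) can be sketched as follows. This is an assumed outline rather than the server's actual code; `fetch_batches` and `write_batch` are hypothetical stand-ins for ConnectorX's streamed RecordBatch iteration and the CSV/Parquet writer.

```python
import os

def run_sql_to_file(fetch_batches, write_batch, output_path):
    """Stream query batches into output_path.

    On any error, remove the partially written file and return the
    failure in the tool's single-text-message format.
    """
    try:
        with open(output_path, "wb") as f:
            for batch in fetch_batches():
                write_batch(f, batch)
        return "OK"
    except Exception as exc:
        if os.path.exists(output_path):
            os.remove(output_path)
        return f"Error: {exc}"
```

Streaming batch by batch keeps memory bounded regardless of result size, which is why a schema mismatch between batches has to be surfaced as an error rather than reconciled after the fact.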
Distributed under the MIT License. See LICENSE for details.
Run SQL Connectorx is a free, open-source AI agent skill. Install it via CLI:

```shell
npx mdskills install gigamori/mcp-run-sql-connectorx
```

This downloads the skill files into your project, and your AI agent picks them up automatically.

Run SQL Connectorx works with Claude Code, Claude Desktop, Cursor, VS Code Copilot, Windsurf, Continue.dev, Gemini CLI, Amp, Roo Code, and Goose. Skills use the open SKILL.md format, which is compatible with any AI coding agent that reads markdown instructions.