MCP Databricks Tiny


MCP Databricks Tiny connects Codex and other MCP-aware AI tools to Azure Databricks SQL Warehouses with safer, simpler read-only access and practical built-in guardrails.

Description

MCP Databricks Tiny is a small, local-first MCP server that helps you connect Codex and other MCP-aware AI tools to Azure Databricks SQL Warehouses with a safer, simpler setup. The project is built for practical local development, planning, and testing.

It provides controlled schema discovery and read-only SQL access for inspecting catalogs, schemas, and tables, with guardrails like row caps, identifier validation, environment-based configuration, optional whitelists, and a warmup path for cold warehouses.
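For example, identifier validation in lookup-style tools can be as simple as a strict allowlist pattern. The sketch below uses a hypothetical `validate_identifier` helper and may differ from the project's actual implementation:

```python
import re

# Hypothetical helper; the project's actual validation may differ.
_IDENTIFIER_RE = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*$")

def validate_identifier(name: str) -> str:
    """Accept only plain catalog/schema/table names, rejecting anything
    that could smuggle SQL fragments into a lookup-style query."""
    if not _IDENTIFIER_RE.fullmatch(name):
        raise ValueError(f"invalid identifier: {name!r}")
    return name
```

Because validated names can only contain letters, digits, and underscores, they are safe to interpolate into `SHOW TABLES IN ...`-style statements.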

This project is built for developers who want a simple, practical bridge between AI coding agents and Azure Databricks SQL Warehouses. It keeps the workflow simple for local development while focusing on the features that matter most: safe read-only querying, schema and table inspection, configurable guardrails, lightweight setup, and fast diagnostics when a warehouse is cold or configuration is incomplete.

Features

* Local-first MCP server over **stdio** for Codex and other MCP-aware tools
* Azure Databricks SQL Warehouse connectivity via **databricks-sql-connector**
* Safe, **read-only SQL execution** with prefix checks and blocked write/admin keywords
* Schema exploration tools for namespaces, schemas, tables, and table descriptions
* Configurable row caps for controlled previews and query results
* Identifier validation to reduce injection risk in lookup-style tools
* Optional catalog and schema whitelists to narrow accessible data
* Repo-local `.env` configuration with lightweight smoke tests
* `healthcheck()` and `warmup()` helpers for startup diagnostics and cold warehouses
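As a rough illustration of the read-only guardrails listed above, a prefix check combined with a blocked-keyword scan and a row cap might look like this (function names and the keyword list are hypothetical, not the project's actual code):

```python
import re

# Hypothetical sketch of read-only enforcement; the project's actual
# checks may differ in detail.
READ_ONLY_VERBS = {"select", "show", "describe", "with", "explain"}
BLOCKED_KEYWORDS = {
    "insert", "update", "delete", "merge", "drop", "alter",
    "create", "truncate", "grant", "revoke",
}

def ensure_read_only(sql: str) -> str:
    """Reject statements that do not start with a read-only verb or
    that mention a write/admin keyword anywhere in the text."""
    stripped = sql.strip().rstrip(";").strip()
    first = stripped.split(None, 1)[0].lower() if stripped else ""
    if first not in READ_ONLY_VERBS:
        raise ValueError(f"query must start with a read-only verb, got {first!r}")
    tokens = set(re.findall(r"[a-z_]+", stripped.lower()))
    if tokens & BLOCKED_KEYWORDS:
        raise ValueError("query contains blocked write/admin keywords")
    return stripped

def apply_row_cap(sql: str, max_rows: int = 100) -> str:
    # Naive row cap: wrap the query so previews never return more than
    # max_rows rows (keyword false positives are accepted for simplicity).
    return f"SELECT * FROM ({sql}) AS q LIMIT {max_rows}"
```

Keyword scanning of this kind is deliberately conservative: a column literally named `delete` would be rejected, which is an acceptable trade-off for a read-only preview tool.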

What This Is NOT

* Not a full Databricks admin tool
* Not a replacement for Databricks permissions or governance
* Does not manage jobs, clusters, permissions, or writes
* Does not implement OAuth flows internally

Get It

View the project and clone it from GitHub here:
https://github.com/compkick/mcp-databricks-tiny

Requirements

* **Python 3.11+**
* Developed against Python 3.14.2
* Windows (tested); macOS and Linux should also work
* Access to an **Azure Databricks SQL Warehouse**
* Codex (or other MCP-aware AI coding agent) with MCP support enabled

Quickstart

For quickstart, setup, and testing instructions, see the README.md in the project files.
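Under the hood, connectivity goes through **databricks-sql-connector**. As a rough illustration, a helper like the hypothetical `connection_kwargs` below could pull the settings from the environment (the variable names are assumptions; check the repo's `.env.example` for the exact keys):

```python
import os

# Sketch only: environment variable names are assumptions; consult the
# repo's .env.example for the keys the server actually expects.
def connection_kwargs(env=os.environ) -> dict:
    return {
        "server_hostname": env["DATABRICKS_SERVER_HOSTNAME"],
        "http_path": env["DATABRICKS_HTTP_PATH"],
        "access_token": env["DATABRICKS_TOKEN"],
    }

# With a reachable warehouse, the connection itself looks like:
# from databricks import sql
# with sql.connect(**connection_kwargs()) as conn:
#     with conn.cursor() as cur:
#         cur.execute("SELECT 1")
#         print(cur.fetchall())
```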

Codex MCP Configuration

Add this to your Codex config file:

Location

~/.codex/config.toml

Entry

```toml
[mcp_servers.databricks_tiny]
command = 'PROJECT_DIR\mcp-databricks-tiny\.venv\Scripts\python.exe'
args = ['PROJECT_DIR\mcp-databricks-tiny\databricks_mcp_server.py']
```

Replace `PROJECT_DIR` with the full path to the parent folder where this repo lives on your machine. For example, if the repo is in `C:\Dev\Projects`, the paths above should point to `C:\Dev\Projects\mcp-databricks-tiny\...`.

The Databricks connection values live in the MCP server repo's local `.env` file, not in the Codex `config.toml`.
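A `.env` file along these lines is typical for databricks-sql-connector setups; the variable names below are assumptions, so check the repo's `.env.example` for the exact keys:

```ini
# Hypothetical keys; consult the repo's .env.example for the real ones.
DATABRICKS_SERVER_HOSTNAME=adb-1234567890123456.7.azuredatabricks.net
DATABRICKS_HTTP_PATH=/sql/1.0/warehouses/abcdef1234567890
DATABRICKS_TOKEN=dapi-your-personal-access-token
```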

If you prefer the system Python, set `command` to `python` and ensure the dependencies are installed globally.

Restart Codex after editing.

Changelog

This project uses semantic versioning. `0.1.1` is the current public release.

### 0.1.1

Documentation-focused update for the Databricks Tiny MCP Server.

Included in this release:

* README documentation refinements for public-facing usage and setup guidance
* License URL correction in the README
* Changelog, highlights, features, and Codex MCP configuration wording improvements
 
### 0.1.0
 
First public release of the Databricks Tiny MCP Server.

Included in this release:

* Core MCP tools for `ping`, `discover_namespace`, `list_schemas`, `list_tables`, `describe_table`, `run_query_preview`, and `run_query_readonly`
* Read-only SQL enforcement with row caps and optional catalog/schema allowlists
* `warmup()` and `healthcheck()` helpers for cold warehouses and startup diagnostics
* Local `.env` loading via `python-dotenv` and a matching `.env.example`
* Smoke test script via `python test_server.py`
* Pytest coverage for local safety and configuration logic
* README cleanup for public usage, setup, and Codex MCP configuration
* MIT licensing and versioned dependency ranges in `requirements.txt`
* Identifier validation hardening for schema/table lookup tools

License

This project is licensed under the MIT License. You are free to use, modify, and distribute the software under the terms of the included license.

The software is provided “as is”, without warranty of any kind, and the authors are not liable for any claims, damages, or other liability arising from its use.

Future Roadmap

Possible future enhancements include:

* `export_query_jsonl(sql, path)`
* `export_query_parquet(sql, path)`
* Python package artifacts (`pyproject.toml`, wheel, source distribution)
* Structured logging
* Containerization (Docker)
* OAuth token refresh helper

Support

See the project documentation in the repository README.md, which covers installation, setup, and usage.

For implementation questions, issue reporting, or enhancement requests, use the GitHub issue tracker:
GitHub: https://github.com/compkick/mcp-databricks-tiny/issues

For other questions, use our contact form to get in touch.