
Overview: Connect AI agents to your Databricks workspace through Adopt AI's Databricks MCP integration.
Common Use Cases:
1. Automated Data Pipeline Orchestration
AI agents create, schedule, and monitor Databricks jobs and workflows, ensuring data pipelines run on time and alerting teams to failures or delays.
2. Notebook-Based Analysis Automation
AI agents execute Databricks notebooks on demand, passing parameters and collecting results to power automated reporting and ad-hoc analytics workflows.
3. ML Model Training & Deployment
AI agents manage end-to-end ML workflows in Databricks, from data prep and feature engineering to model training, evaluation, and deployment via MLflow.
4. Data Quality Monitoring
AI agents run data quality checks on Databricks tables, flag anomalies, missing values, and schema drift, and generate data health reports for engineering teams.
5. Cost & Cluster Management
AI agents monitor Databricks cluster usage and costs, auto-scale or terminate idle clusters, and generate spend reports to optimize cloud compute budgets.

Supported Actions:

Query history.

Gets run metadata (Jobs API 2.1).

Get table.

Triggers a run.

Stops warehouse.

Posts a SQL statement, then polls until SUCCEEDED/FAILED/CANCELED or timeout.
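The post-then-poll flow above can be sketched as follows. This is a minimal illustration of the Databricks SQL Statement Execution API pattern (POST /api/2.0/sql/statements, then GET by statement ID); the `post` and `get` HTTP callables are injected here so the loop can be shown without a live workspace, and the timeout/interval defaults are arbitrary.

```python
import time

# Terminal states reported by the SQL Statement Execution API.
TERMINAL_STATES = {"SUCCEEDED", "FAILED", "CANCELED"}

def execute_and_poll(post, get, warehouse_id, sql, timeout_s=300, interval_s=2):
    # Submit the statement; the response carries a statement_id and an
    # initial status (already terminal for very fast queries).
    resp = post("/api/2.0/sql/statements",
                {"warehouse_id": warehouse_id, "statement": sql})
    statement_id = resp["statement_id"]
    state = resp["status"]["state"]
    deadline = time.monotonic() + timeout_s
    while state not in TERMINAL_STATES:
        if time.monotonic() > deadline:
            raise TimeoutError(f"statement {statement_id} still {state}")
        time.sleep(interval_s)
        # Re-fetch status until the statement reaches a terminal state.
        resp = get(f"/api/2.0/sql/statements/{statement_id}")
        state = resp["status"]["state"]
    return resp
```

In practice `post`/`get` would wrap authenticated HTTP calls against the workspace URL; injecting them also makes the loop easy to unit-test with fakes.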

Delete run data.

List volumes.

Lists runs for job.

Lists instance pools.

Stop pipeline.

Delete secret.

List tables.

Put secret.

Lists jobs.

Repairs a failed run.

Creates pool.

Delete schema.

Lists secret scopes.

Lists node types.

Deletes pipeline.

Updates pipeline.

Creates cluster (pass body_json for full spec).
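A minimal sketch of the `body_json` a cluster-create call might take. The field names follow the Databricks Clusters API, but the runtime version, node type, and cluster name below are placeholders; valid values come from the Spark-versions and node-types actions listed elsewhere on this page.

```python
import json

# Illustrative cluster spec; spark_version and node_type_id are
# placeholders and must match what your workspace actually offers.
cluster_spec = {
    "cluster_name": "agent-etl-cluster",          # placeholder name
    "spark_version": "14.3.x-scala2.12",          # placeholder runtime
    "node_type_id": "i3.xlarge",                  # placeholder node type
    "num_workers": 2,
    "autotermination_minutes": 30,                # avoid idle spend
}

# Serialize to pass as body_json to the create-cluster action.
body_json = json.dumps(cluster_spec)
```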

Cancel SQL statement.

Lists workspace path.

Export notebook.

Metrics query preview (body_json).

Creates job (use body_json).

Starts cluster.

File status.

Deletes SQL warehouse.

Pin cluster.

List functions.

Triggers a job, then polls the run until it reaches a terminal state.
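The trigger-then-poll flow can be sketched like this against the Jobs API 2.1 (POST /api/2.1/jobs/run-now, then GET /api/2.1/jobs/runs/get). As in the SQL example, the `post`/`get` HTTP callables are injected for illustration, and the timeout/interval defaults are arbitrary.

```python
import time

# Life-cycle states the Jobs API reports as terminal.
TERMINAL_LIFE_CYCLE = {"TERMINATED", "SKIPPED", "INTERNAL_ERROR"}

def run_job_and_wait(post, get, job_id, timeout_s=600, interval_s=5):
    # Trigger the job; run-now returns the new run's id.
    run_id = post("/api/2.1/jobs/run-now", {"job_id": job_id})["run_id"]
    deadline = time.monotonic() + timeout_s
    while time.monotonic() <= deadline:
        run = get(f"/api/2.1/jobs/runs/get?run_id={run_id}")
        state = run["state"]
        if state["life_cycle_state"] in TERMINAL_LIFE_CYCLE:
            # result_state (e.g. SUCCESS/FAILED) is only present once
            # the run has terminated.
            return state.get("result_state"), run
        time.sleep(interval_s)
    raise TimeoutError(f"run {run_id} did not finish in {timeout_s}s")
```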

List instance profiles.

Move DBFS.

Delete workspace object.

Gets notebook/task output.

Spark conf for cluster.

Update repo.

POST invocations to serving endpoint.
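A sketch of the request body such an invocation might carry. `dataframe_records` is one of the input formats accepted by Databricks model serving endpoints; the column names and values below are purely illustrative.

```python
import json

def build_invocation_body(records):
    # Wrap rows in the dataframe_records input format; the endpoint
    # expects column names matching the served model's signature.
    return json.dumps({"dataframe_records": records})

# Illustrative feature columns; replace with your model's real inputs.
body = build_invocation_body([{"feature_a": 1.2, "feature_b": "x"}])
```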

Lists all clusters.

Gets run output (Jobs API 2.1).

Gets permissions.

Gets the current user (SCIM /Me, preview).

Uninstall from cluster.

Gets endpoint.

List repos.

Restart cluster.

Create PAT (body_json).

List SQL alerts.

Lists cluster policy families (preview).

Lists SQL warehouses.

Pipeline events.

Gets pool.

All cluster library statuses.

Cluster events.

Alias current user.

Serving logs.

Lists DLT pipelines.

Lists Spark versions.

Starts warehouse.

Updates ACLs (body_json).

Start update.

Delete path.

Deletes job.

Lists DBFS path.

Submit one-off run (body_json).
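A minimal sketch of the `body_json` a one-off run submission might take (POST /api/2.1/jobs/runs/submit): one notebook task on a small ephemeral cluster. The notebook path, parameter, and cluster fields below are placeholders.

```python
import json

# Illustrative one-off run: a single notebook task on a new cluster.
one_off_run = {
    "run_name": "adhoc-quality-check",            # placeholder name
    "tasks": [{
        "task_key": "check",
        "notebook_task": {
            "notebook_path": "/Repos/team/checks/quality",  # placeholder
            "base_parameters": {"table": "main.sales.orders"},
        },
        "new_cluster": {
            "spark_version": "14.3.x-scala2.12",  # placeholder runtime
            "node_type_id": "i3.xlarge",          # placeholder node type
            "num_workers": 1,
        },
    }],
}

# Serialize to pass as body_json to the submit-run action.
body_json = json.dumps(one_off_run)
```

Unlike run-now, runs/submit needs no pre-created job: the full task spec travels in the request body.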

List git credentials.

Gets SQL statement status/result by ID.

Edit cluster config.

Lists global init scripts.

Gets pipeline.

MLflow experiments.

Feature store tables (if enabled).

MLflow registered models.

Creates pipeline.

Terminates cluster.

Create schema.

Permanent delete cluster.

Delete scope.

Create scope.

Lists model serving endpoints.

List tokens.

Gets SQL warehouse by id.

Read file chunk.

Revoke token.

Get schema.

Gets job definition.

Create repo.

Install on cluster.

List schemas.

Create catalog.

Edits pool.

Make directories.

Path status.

Cancels a run.

Delete table.

Put secret ACL.

List dashboards preview.
Do I need my own developer credentials to use Databricks MCP with Adopt AI?
No, you can get started immediately using Adopt AI's built-in Databricks integration. For production use, we recommend configuring your own API tokens for greater control and security.
Can I connect Databricks with other apps through Adopt AI?
Yes! Adopt AI supports multi-app workflows, so your AI agents can seamlessly move data between Databricks and data warehouses, BI tools, cloud platforms, and more.
Is Adopt AI secure?
Absolutely. Adopt AI is SOC 2 Type 2 certified and ISO/IEC 27001 compliant, and adheres to EU GDPR, CCPA, and HIPAA standards. All data is encrypted in transit and at rest, ensuring the confidentiality, integrity, and availability of your data. Learn more here.
What happens if the Databricks API changes?
Adopt AI maintains and updates all integrations automatically, so your agents always work with the latest API versions, with no manual maintenance required.
Do I need coding skills to set up the Databricks integration?
Not at all. Adopt AI's zero-shot API discovery means your agents understand Databricks's schema on first contact. Setup takes minutes with no code required.
How do I set up custom Databricks MCP in Adopt AI?
For a step-by-step guide on creating and configuring your own Databricks API tokens with Adopt AI, see here.