
Overview: Manages Snowflake warehouses, databases, schemas, tables, tasks, streams, pipes, stages, users, and roles, and runs arbitrary SQL through the Snowflake SQL API.
Common Use Cases:
1. Automated Data Pipeline Orchestration
AI agents create and monitor Snowflake tasks and streams, ensuring data pipelines process new data on schedule and alerting teams to failures or delays.
2. Self-Service Analytics & Reporting
AI agents query Snowflake tables on demand to generate business intelligence reports, dashboards, and ad-hoc analyses for stakeholders across the organization.
3. Data Sharing & Marketplace Integration
AI agents manage Snowflake data shares and marketplace listings, automating access provisioning and monitoring usage across partner and internal consumers.
4. Cost & Warehouse Management
AI agents monitor Snowflake warehouse usage and credit consumption, auto-suspend idle warehouses, and generate cost reports to optimize compute spend.
5. Data Quality & Governance
AI agents run data quality checks on Snowflake tables, track schema changes, enforce access policies, and generate compliance reports for data governance teams.

Lists databases in account.

Suspends task.

Gets task.

LIST @stage_path via SQL.

Creates stage.

Runs SHOW GRANTS TO ROLE via SQL API.

Suspends warehouse.

Gets stage.

Creates database (body per Snowflake REST schema).
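
As a sketch of what this tool sends, the request below follows the shape of Snowflake's REST "create database" endpoint (POST /api/v2/databases); the account URL, database name, and comment are placeholders, and the exact body schema should be checked against your account's API version.

```python
def build_create_database_request(account_url, name, comment=None):
    """Return (url, json_body) for creating a database via the Snowflake REST API."""
    body = {"name": name}
    if comment is not None:
        body["comment"] = comment
    return f"{account_url}/api/v2/databases", body

# Placeholder account URL and database name for illustration.
url, body = build_create_database_request(
    "https://myorg-myaccount.snowflakecomputing.com",
    "ANALYTICS_DB",
    comment="Created via MCP tool sketch",
)
```

Sending the request additionally requires an Authorization header with a valid Snowflake token, which is omitted here.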

POST /statements: runs SQL with optional warehouse, database, schema, role, timeout, and bindings JSON. Returns resultSetMetaData and data rows. Use for any Snowflake SQL.
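
A minimal sketch of assembling the statement body, assuming the SQL API's documented field names; only fields the caller supplies are included, and `bindings` maps positional placeholders ("1", "2", ...) to type/value pairs. The table, warehouse, and bind values below are hypothetical.

```python
def build_statement_request(statement, warehouse=None, database=None,
                            schema=None, role=None, timeout=None, bindings=None):
    """Assemble the JSON body for POST /api/v2/statements, omitting unset fields."""
    body = {"statement": statement}
    optional = {"warehouse": warehouse, "database": database, "schema": schema,
                "role": role, "timeout": timeout, "bindings": bindings}
    for key, value in optional.items():
        if value is not None:
            body[key] = value
    return body

# Hypothetical query with one positional bind variable.
body = build_statement_request(
    "SELECT * FROM orders WHERE o_totalprice > ?",
    warehouse="COMPUTE_WH",
    database="SALES",
    schema="PUBLIC",
    timeout=60,
    bindings={"1": {"type": "FIXED", "value": "500"}},
)
```

Keeping unset fields out of the body lets the statement fall back to the session or tool defaults for warehouse, role, and so on.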

Creates user.

Creates pipe.

Lists pipes.

Resumes pipe.

Gets stream.

Gets table.

Lists roles.

Drops stage.

Drops warehouse.

Creates warehouse.

Runs async statement then polls until terminal state or timeout.

DELETE /statements/{handle} to cancel.
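
The submit-then-poll lifecycle above can be sketched as a small polling loop. This is an illustration, not the server's implementation: the transport is injected as a callable so the logic is testable offline, and it assumes the SQL API convention that GET /statements/{handle} returns 202 while the statement is still running and 200 once results are ready. The handle value is a placeholder.

```python
import time

def poll_statement(handle, get_status, timeout_s=300.0, interval_s=2.0,
                   clock=time.monotonic, sleep=time.sleep):
    """Poll an async statement until a terminal state or timeout.

    `get_status(handle)` returns the HTTP status for GET /statements/{handle}:
    202 means still running, 200 means results are ready, anything else is
    treated as failure. On timeout, the caller may cancel the statement
    via DELETE /statements/{handle}.
    """
    deadline = clock() + timeout_s
    while clock() < deadline:
        status = get_status(handle)
        if status == 200:
            return "succeeded"
        if status != 202:
            return "failed"
        sleep(interval_s)
    return "timed_out"

# Simulated transport: still running twice, then done.
responses = iter([202, 202, 200])
result = poll_statement("01a2b3c4", lambda h: next(responses), sleep=lambda s: None)
```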

Creates stream.

Lists warehouses.

Drops user.

Creates table via JSON body.
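
As a sketch of the JSON body this tool expects, the helper below builds a table definition in the shape of Snowflake's REST table schema ("name" plus a "columns" list of name/datatype objects); the field names should be verified against your account's API version, and the table and column names are hypothetical.

```python
def build_create_table_body(name, columns):
    """Assemble a table definition body from (column_name, datatype) pairs."""
    return {
        "name": name,
        "columns": [{"name": col, "datatype": dtype} for col, dtype in columns],
    }

# Hypothetical events table for illustration.
table_body = build_create_table_body(
    "EVENTS",
    [("ID", "NUMBER"), ("PAYLOAD", "VARIANT"), ("TS", "TIMESTAMP_NTZ")],
)
```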

Drops stream.

Runs SHOW CREATE TABLE via SF_EXECUTE_SQL preset.

Lists users.

Creates task.

Async statement: returns a handle to poll with SF_GET_SQL_RESULT.

Pauses pipe.

Executes task (server-specific).

Resumes task.

Gets warehouse.

Lists tables in schema.

Lists tasks.

GET /statements/{handle} for async results.

DESCRIBE WAREHOUSE via SQL.

Documents the PUT command, which must be run via the Snowflake CLI; this tool only returns a SQL usage note.

Lists stages in schema.

Drops table.

Runs GRANT via SQL statement tool alias.

Deletes database.

Creates schema.

Gets database by name.

Gets schema.

Creates role.

Lists streams.

Gets user.

Lists schemas in database.

Drops schema.

Gets role.

Gets pipe.
Do I need my own developer credentials to use Snowflake MCP with Adopt AI?
No, you can get started immediately using Adopt AI's built-in Snowflake integration. For production use, we recommend configuring your own API credentials for greater control and security.
Can I connect Snowflake with other apps through Adopt AI?
Yes! Adopt AI supports multi-app workflows, so your AI agents can seamlessly move data between Snowflake and BI tools, data pipelines, cloud platforms, and more.
Is Adopt AI secure?
Absolutely. Adopt AI is SOC 2 Type 2 certified and ISO/IEC 27001 compliant, and adheres to EU GDPR, CCPA, and HIPAA standards. All data is encrypted in transit and at rest, ensuring the confidentiality, integrity, and availability of your data. Learn more here.
What happens if the Snowflake API changes?
Adopt AI maintains and updates all integrations automatically, so your agents always work with the latest API versions and no manual maintenance is required.
Do I need coding skills to set up the Snowflake integration?
Not at all. Adopt AI's zero-shot API discovery means your agents understand Snowflake's schema on first contact. Setup takes minutes with no code required.
How do I set up custom Snowflake MCP in Adopt AI?
For a step-by-step guide on creating and configuring your own Snowflake API credentials with Adopt AI, see here.