
Get Routine on BigQuery

Gets routine definition.


Use Case

Overview: Retrieves the definition of a BigQuery routine (a UDF or stored procedure).

Benefits:

  • Automate data pipeline orchestration and analytics.
  • Scale data processing with cloud data warehouse power.

Common Use Cases:

  • Run scheduled SQL queries for business reporting.
  • Transform raw data into analytics-ready datasets.

Use Cases for BigQuery MCP

1. Automated Reporting & Dashboards

AI agents run scheduled and ad-hoc queries against BigQuery datasets to generate business intelligence reports, KPI dashboards, and executive summaries.


2. Data Pipeline Monitoring

AI agents track BigQuery job status, monitor data freshness, and alert teams when scheduled loads fail or data quality thresholds are breached.


3. Cost & Query Optimization

AI agents analyze BigQuery query patterns, identify expensive or inefficient queries, recommend partitioning and clustering strategies, and generate cost reports.


4. Cross-Platform Data Integration

AI agents sync data between BigQuery and CRMs, marketing tools, and databases, keeping analytics datasets current and eliminating manual ETL work.


5. ML Feature Store & Analytics

AI agents query BigQuery to compute ML features, run statistical analyses, and feed results into model training pipelines and experimentation frameworks.

Explore Other Tools

Load Table From GCS on BigQuery

Starts a load job from GCS URIs into a table (configuration.load).
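
The tool's exact internals aren't documented here, but the load job it starts corresponds to a plain jobs.insert body with a configuration.load section. The helper below is an illustrative sketch (field names follow the public BigQuery REST API; project, dataset, table, and bucket names are hypothetical):

```python
def build_load_config(project, dataset, table, source_uris,
                      source_format="CSV", write_disposition="WRITE_APPEND"):
    """Build a jobs.insert body whose configuration.load pulls the
    given GCS URIs into project.dataset.table."""
    return {
        "configuration": {
            "load": {
                "sourceUris": source_uris,
                "destinationTable": {
                    "projectId": project,
                    "datasetId": dataset,
                    "tableId": table,
                },
                "sourceFormat": source_format,
                "writeDisposition": write_disposition,
            }
        }
    }

body = build_load_config(
    "my-project", "analytics", "events",
    ["gs://my-bucket/exports/events-*.csv"],
)
```

Wildcard URIs (the `*` above) let one job ingest many sharded files; writeDisposition controls whether the job appends to or truncates the destination table.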

Dry Run Query on BigQuery

Runs a query with dryRun set to true to estimate bytes processed (totalBytesProcessed) before any billing occurs.
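
A dry run is just a jobs.query request with dryRun enabled: BigQuery validates the SQL and returns totalBytesProcessed without executing it. A minimal sketch (the SQL and helper names are illustrative):

```python
def build_dry_run_body(sql: str) -> dict:
    """jobs.query request body that validates the query and returns
    totalBytesProcessed without running (or billing) it."""
    return {"query": sql, "useLegacySql": False, "dryRun": True}

def scanned_tib(total_bytes_processed: int) -> float:
    """Convert the dry-run byte estimate to TiB, the unit that
    on-demand query pricing is quoted in."""
    return total_bytes_processed / 1024 ** 4

body = build_dry_run_body("SELECT col FROM `project.dataset.table`")
```

Comparing the estimate for `SELECT *` against an explicit column list is a quick way to see how much a narrower projection saves.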

Delete Routine on BigQuery

Deletes a routine.

List Rows on BigQuery

Lists stored table data (tabledata.list) with pagination.

Get Project on BigQuery

Gets project resource.

List Jobs on BigQuery

Lists jobs in a project with optional state filter, projection, pagination.

Update Table on BigQuery

PATCH table resource.

List Models on BigQuery

Lists BQML models in dataset.

Job Wait on BigQuery

Polls BQ_GET_JOB until the job reaches DONE or a timeout elapses; works for any job id (query/load/extract).

Get Query Results on BigQuery

Fetches result pages for a completed query job id. Use pageToken for pagination. Returns rows and schema.
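
Draining every page is a simple loop over pageToken. The sketch below assumes a `get_results` callable that wraps this tool and returns a getQueryResults-style dict; the stub-friendly signature is an illustration, not the tool's actual interface:

```python
def fetch_all_rows(get_results, job_id, page_size=1000):
    """Collect rows from every result page of a completed query job.

    get_results: any callable (job_id, page_token, max_results) ->
    response dict containing optional 'rows' and 'pageToken' keys.
    """
    rows, token = [], None
    while True:
        page = get_results(job_id, token, page_size)
        rows.extend(page.get("rows", []))
        token = page.get("pageToken")
        if not token:  # no token on the final page
            return rows
```

The loop terminates because the final page carries no pageToken; passing the previous token back in fetches the next page.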

Query Parameterized on BigQuery

Runs a synchronous query with named or positional parameters (queryParameters JSON). Safer than string concatenation for user input.
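
With named parameters, the SQL references `@name` placeholders and the request body carries a queryParameters array. A sketch of that body, following the public REST API's QueryParameter shape (table and parameter names are illustrative):

```python
def build_param_query(sql, params):
    """jobs.query body using named parameters (@name in the SQL).

    params maps name -> (BigQuery type, value); scalar values are
    sent as strings, as the REST API expects in parameterValue.
    """
    return {
        "query": sql,
        "useLegacySql": False,
        "parameterMode": "NAMED",
        "queryParameters": [
            {
                "name": name,
                "parameterType": {"type": bq_type},
                "parameterValue": {"value": str(value)},
            }
            for name, (bq_type, value) in params.items()
        ],
    }

body = build_param_query(
    "SELECT name FROM `project.dataset.users` WHERE id = @user_id",
    {"user_id": ("INT64", 42)},
)
```

Because the value travels out-of-band rather than being spliced into the SQL string, user input cannot change the query's structure.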

Update Dataset on BigQuery

Updates dataset with PATCH semantics (ETag optional via body).

Get Model on BigQuery

Gets BQML model metadata.

Info Schema Columns on BigQuery

Runs an INFORMATION_SCHEMA.COLUMNS query in a region to introspect column types. dataset_region is the region qualifier, e.g. region-us.
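
The exact SQL the tool issues isn't shown here, but a region-scoped COLUMNS query commonly takes the shape below (dataset and table names are illustrative):

```python
def columns_query(dataset_region: str, dataset: str, table: str) -> str:
    """Build a standard-SQL introspection query against the
    region-qualified INFORMATION_SCHEMA.COLUMNS view."""
    return (
        f"SELECT column_name, data_type, is_nullable\n"
        f"FROM `{dataset_region}`.INFORMATION_SCHEMA.COLUMNS\n"
        f"WHERE table_schema = '{dataset}' AND table_name = '{table}'"
    )

sql = columns_query("region-us", "analytics", "events")
```

The backtick-quoted region qualifier is what scopes the view; without it, INFORMATION_SCHEMA resolves relative to a single dataset.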

Run Query on BigQuery

Runs a synchronous SQL query via jobs.query. Returns rows, schema, jobReference, and totalRows when available. Set useLegacySql=false for standard SQL. Pass queryParameters as JSON array for parameterized queries. Example query: SELECT * FROM `project.dataset.table` LIMIT 100.
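
The rows come back in the REST API's f/v encoding, paired with a schema. A small decoder that flattens that shape into plain dicts (the sample response values are illustrative):

```python
def decode_rows(response: dict) -> list[dict]:
    """Flatten jobs.query's f/v row encoding into dicts keyed by
    column name, using the schema returned alongside the rows."""
    names = [f["name"] for f in response["schema"]["fields"]]
    return [
        dict(zip(names, (cell["v"] for cell in row["f"])))
        for row in response.get("rows", [])
    ]

sample = {  # shape of a jobs.query response; values illustrative
    "schema": {"fields": [{"name": "id"}, {"name": "email"}]},
    "rows": [{"f": [{"v": "1"}, {"v": "a@example.com"}]}],
}
decoded = decode_rows(sample)
```

Note that the API serializes scalar cell values as strings (the `"1"` above), so numeric columns still need casting on the client side.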

List Routines on BigQuery

Lists routines (UDFs/procedures) in dataset.

Get Table on BigQuery

Gets table metadata including schema.

Update Routine on BigQuery

PATCH an existing routine with partial JSON body.

Insert Rows on BigQuery

Streams rows into a table via tabledata.insertAll. rows_json is a JSON array of {insertId?, json:{...}} objects.
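
The optional insertId is what lets BigQuery best-effort deduplicate retried streaming inserts. A sketch of building that rows array from plain dicts (helper name and sample row are illustrative):

```python
import uuid

def build_insert_all(rows, dedupe=True):
    """tabledata.insertAll body from plain dicts. When dedupe is
    True, each row gets a random insertId so a retried request
    does not double-insert the same rows."""
    return {
        "rows": [
            {"insertId": str(uuid.uuid4()), "json": row} if dedupe
            else {"json": row}
            for row in rows
        ]
    }

body = build_insert_all([{"id": 1, "email": "a@example.com"}])
```

Dropping the insertId (dedupe=False) trades deduplication for higher streaming throughput.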

Delete Table on BigQuery

Deletes a table.

Get Job on BigQuery

Gets job metadata and status by job id.

Create Routine on BigQuery

Creates routine from full JSON body.

Copy Table Job on BigQuery

Starts a copy job between tables (configuration.copy).

List Projects on BigQuery

Lists BigQuery projects accessible to the token.

Truncate Table Ddl on BigQuery

Runs TRUNCATE TABLE DDL via async job (BQ_RUN_QUERY_ASYNC). Pass full table id `project.dataset.table`.

Create Table on BigQuery

Creates table via JSON body (schema, timePartitioning, etc.).

List Tables on BigQuery

Lists tables in a dataset.

Run Query Legacy on BigQuery

Runs a query with useLegacySql set to true, for legacy SQL only. Prefer BQ_RUN_QUERY with standard SQL.

Delete Dataset on BigQuery

Deletes a dataset; set deleteContents to true to also remove its tables.

Get Table Schema on BigQuery

Returns schema fields from BQ_GET_TABLE in a concise shape.

Get Dataset on BigQuery

Gets dataset resource by id.

Export Table on BigQuery

Starts an extract job to GCS (configuration.extract). Pass destinationUris as a JSON array and sourceTable as JSON.

Create Dataset on BigQuery

Creates a dataset with datasetReference and optional defaultTableExpirationMs, access, labels.

Delete Model on BigQuery

Deletes a BQML model.

Insert Job on BigQuery

Generic jobs.insert with configuration JSON (load, extract, copy, query). Use when you need full control beyond BQ_RUN_QUERY_ASYNC.

Run Query And Wait on BigQuery

Starts an async query job, polls BQ_GET_JOB until it is done, then returns the first page from BQ_GET_QUERY_RESULTS. Use for long-running queries without blocking the synchronous endpoint.
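
The start-then-poll pattern behind this tool can be sketched as a small loop. Here `get_job` stands in for any callable that fetches a job resource dict (e.g. a wrapper around the Get Job tool); the signature is an assumption for illustration:

```python
import time

def wait_for_job(get_job, job_id, timeout_s=300.0, poll_s=2.0):
    """Poll until status.state == 'DONE', then surface any
    job-level errorResult as an exception."""
    deadline = time.monotonic() + timeout_s
    while True:
        job = get_job(job_id)
        if job["status"]["state"] == "DONE":
            err = job["status"].get("errorResult")
            if err:
                raise RuntimeError(
                    f"job {job_id} failed: {err.get('message')}")
            return job
        if time.monotonic() >= deadline:
            raise TimeoutError(
                f"job {job_id} still running after {timeout_s}s")
        time.sleep(poll_s)
```

A DONE state alone does not mean success: a failed job is also DONE, with the failure recorded in errorResult, which is why the loop checks both.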

List Datasets on BigQuery

Lists datasets in a project.

Cancel Job on BigQuery

Cancels an incomplete job.

Tabledata Preview on BigQuery

Alias of BQ_LIST_ROWS with maxResults default 100 for quick previews.

Run Query Async on BigQuery

Starts an async query job (jobs.insert). Returns job id and status. Poll with BQ_GET_JOB and fetch rows with BQ_GET_QUERY_RESULTS.

Frequently Asked Questions

Do I need my own developer credentials to use BigQuery MCP with Adopt AI?

No, you can get started immediately using Adopt AI's built-in BigQuery integration. For production use, we recommend configuring your own service account credentials for greater control and security.


Can I connect BigQuery with other apps through Adopt AI?

Yes! Adopt AI supports multi-app workflows, so your AI agents can seamlessly move data between BigQuery and BI tools, data pipelines, cloud platforms, and more.


Is Adopt AI secure?

Absolutely. Adopt AI is SOC 2 Type 2 certified and ISO/IEC 27001 compliant, and adheres to EU GDPR, CCPA, and HIPAA standards. All data is encrypted in transit and at rest, ensuring the confidentiality, integrity, and availability of your data. Learn more here.


What happens if the BigQuery API changes?

Adopt AI maintains and updates all integrations automatically, so your agents always work with the latest API versions, with no manual maintenance required.


Do I need coding skills to set up the BigQuery integration?

Not at all. Adopt AI's zero-shot API discovery means your agents understand BigQuery's schema on first contact. Setup takes minutes with no code required.


How do I set up custom BigQuery MCP in Adopt AI?

For a step-by-step guide on creating and configuring your own BigQuery service account credentials with Adopt AI, see here.