Overview: Runs a synchronous query against BigQuery with named or positional parameters (queryParameters JSON). Parameterized queries are safer than string concatenation for user input.
Common Use Cases:
1. Automated Reporting & Dashboards
AI agents run scheduled and ad-hoc queries against BigQuery datasets to generate business intelligence reports, KPI dashboards, and executive summaries.
2. Data Pipeline Monitoring
AI agents track BigQuery job status, monitor data freshness, and alert teams when scheduled loads fail or data quality thresholds are breached.
3. Cost & Query Optimization
AI agents analyze BigQuery query patterns, identify expensive or inefficient queries, recommend partitioning and clustering strategies, and generate cost reports.
4. Cross-Platform Data Integration
AI agents sync data between BigQuery and CRMs, marketing tools, and databases, keeping analytics datasets current and eliminating manual ETL work.
5. ML Feature Store & Analytics
AI agents query BigQuery to compute ML features, run statistical analyses, and feed results into model training pipelines and experimentation frameworks.
Starts load job from GCS URIs into a table (configuration.load).
Runs query with dryRun true to estimate bytes processed (totalBytesProcessed) before billing.
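A cost estimate like this can be sketched as a plain jobs.query request body. The field names below follow the BigQuery REST API; how the tool maps its own inputs onto them is an assumption:

```python
# Sketch of a jobs.query request body for a cost estimate (dryRun).
# With dryRun set, BigQuery validates the query and reports
# totalBytesProcessed without executing it or billing anything.
def dry_run_body(sql: str) -> dict:
    return {
        "query": sql,
        "useLegacySql": False,
        "dryRun": True,  # estimate only; no job is actually run
    }

body = dry_run_body("SELECT name FROM `project.dataset.table`")
```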
Deletes a routine.
Lists stored table data (tabledata.list) with pagination.
Gets project resource.
Lists jobs in a project with optional state filter, projection, pagination.
Updates a table resource with PATCH semantics (partial update).
Lists BQML models in dataset.
Polls BQ_GET_JOB until DONE or timeout for any job id (query/load/extract).
Fetches result pages for a completed query job id. Use pageToken for pagination. Returns rows and schema.
Updates dataset with PATCH semantics (ETag optional via body).
Gets BQML model metadata.
Runs an INFORMATION_SCHEMA.COLUMNS query in a region to introspect column types. dataset_region is the region qualifier, e.g. region-us.
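A minimal sketch of such an introspection query, assuming the tool builds standard SQL against the regional INFORMATION_SCHEMA view. Note that table and dataset identifiers cannot be query parameters in BigQuery, so the interpolated names here must come from a trusted source:

```python
# Sketch: introspect column types for one table via the regional
# INFORMATION_SCHEMA.COLUMNS view. The region qualifier (e.g. "region-us")
# must match where the dataset lives. Identifiers can't be parameterized,
# so dataset/table names must be trusted, not raw user input.
def columns_query(region: str, dataset: str, table: str) -> str:
    return (
        f"SELECT column_name, data_type "
        f"FROM `{region}`.INFORMATION_SCHEMA.COLUMNS "
        f"WHERE table_schema = '{dataset}' AND table_name = '{table}'"
    )

sql = columns_query("region-us", "my_dataset", "my_table")
```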
Runs a synchronous SQL query via jobs.query. Returns rows, schema, jobReference, and totalRows when available. Set useLegacySql=false for standard SQL. Pass queryParameters as JSON array for parameterized queries. Example query: SELECT * FROM `project.dataset.table` LIMIT 100.
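A parameterized request body for this kind of call can be sketched as follows. The queryParameters shape (name, parameterType, parameterValue) follows the BigQuery REST API; the helper function and its signature are illustrative, not part of the tool:

```python
# Sketch of a parameterized jobs.query request body. Named parameters
# are referenced as @name in the SQL text; @name syntax requires
# standard SQL (useLegacySql=False).
def param_query_body(sql: str, params: dict[str, str]) -> dict:
    return {
        "query": sql,
        "useLegacySql": False,
        "parameterMode": "NAMED",
        "queryParameters": [
            {
                "name": name,
                "parameterType": {"type": "STRING"},
                "parameterValue": {"value": value},
            }
            for name, value in params.items()
        ],
    }

body = param_query_body(
    "SELECT * FROM `project.dataset.table` WHERE country = @country LIMIT 100",
    {"country": "DE"},
)
```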
Lists routines (UDFs/procedures) in dataset.
Gets table metadata including schema.
PATCH an existing routine with partial JSON body.
Streaming insert rows via tabledata.insertAll. rows_json is JSON array of {insertId?, json:{...}}.
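The rows_json shape described above can be sketched like this; the example field names (user_id, action) are made up for illustration:

```python
import json

# Sketch of a tabledata.insertAll payload. rows_json is a JSON array
# of {insertId?, json: {...}}; insertId enables best-effort
# de-duplication when a request is retried.
rows_json = json.dumps([
    {"insertId": "evt-001", "json": {"user_id": 42, "action": "signup"}},
    {"json": {"user_id": 43, "action": "login"}},  # insertId is optional
])

payload = {
    "kind": "bigquery#tableDataInsertAllRequest",
    "rows": json.loads(rows_json),
}
```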
Deletes a table.
Gets job metadata and status by job id.
Creates routine from full JSON body.
Starts a copy job between tables (configuration.copy).
Lists BigQuery projects accessible to the token.
Runs TRUNCATE TABLE DDL via async job (BQ_RUN_QUERY_ASYNC). Pass full table id `project.dataset.table`.
Creates table via JSON body (schema, timePartitioning, etc.).
Lists tables in a dataset.
Runs query with useLegacySql true for legacy SQL only. Prefer BQ_RUN_QUERY with standard SQL.
Deletes dataset; use deleteContents true to remove tables.
Returns schema fields from BQ_GET_TABLE in a concise shape.
Gets dataset resource by id.
Starts extract job to GCS (configuration.extract). Pass destinationUris JSON array and sourceTable JSON.
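A jobs.insert body for an extract job might look like the sketch below, assuming the tool passes destinationUris and sourceTable through unchanged; the destinationFormat value is one example among several the API accepts:

```python
# Sketch of a jobs.insert body for an extract job (configuration.extract).
# A wildcard in the GCS URI lets BigQuery shard large exports into
# multiple files.
def extract_job_body(project: str, dataset: str, table: str, gcs_uri: str) -> dict:
    return {
        "configuration": {
            "extract": {
                "sourceTable": {
                    "projectId": project,
                    "datasetId": dataset,
                    "tableId": table,
                },
                "destinationUris": [gcs_uri],
                "destinationFormat": "NEWLINE_DELIMITED_JSON",
            }
        }
    }

body = extract_job_body("my-project", "my_dataset", "events", "gs://my-bucket/out-*.json")
```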
Creates a dataset with datasetReference and optional defaultTableExpirationMs, access, labels.
Deletes a BQML model.
Generic jobs.insert with configuration JSON (load, extract, copy, query). Use when you need full control beyond BQ_RUN_QUERY_ASYNC.
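As one illustration of the full-control path, here is a sketch of a configuration JSON for a table copy (configuration.copy); the same envelope carries load, extract, or query configurations instead:

```python
# Sketch of a generic jobs.insert configuration, here a table copy.
# writeDisposition controls what happens if the destination exists:
# WRITE_EMPTY fails on a non-empty table, WRITE_TRUNCATE replaces it.
def copy_job_body(src: dict, dst: dict, overwrite: bool = False) -> dict:
    return {
        "configuration": {
            "copy": {
                "sourceTable": src,
                "destinationTable": dst,
                "writeDisposition": "WRITE_TRUNCATE" if overwrite else "WRITE_EMPTY",
            }
        }
    }

src = {"projectId": "p", "datasetId": "d", "tableId": "events"}
dst = {"projectId": "p", "datasetId": "d", "tableId": "events_backup"}
body = copy_job_body(src, dst, overwrite=True)
```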
Starts async query job then polls BQ_GET_JOB until done, then returns BQ_GET_QUERY_RESULTS first page. Use for long queries without blocking sync endpoint.
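The start-poll-fetch flow described above can be sketched as a plain polling loop. get_job stands in for a BQ_GET_JOB call and is injected as a callable here for illustration; the actual tool plumbing is an assumption:

```python
import time

# Sketch of the poll loop behind the async query flow: poll a job's
# status until DONE or a deadline, then hand back the job resource
# (results would then be fetched page by page, as with
# BQ_GET_QUERY_RESULTS).
def wait_for_job(get_job, job_id, timeout_s=300, interval_s=0.01):
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        job = get_job(job_id)  # stands in for a BQ_GET_JOB call
        if job["status"]["state"] == "DONE":
            error = job["status"].get("errorResult")
            if error:
                raise RuntimeError(f"job {job_id} failed: {error}")
            return job
        time.sleep(interval_s)
    raise TimeoutError(f"job {job_id} still running after {timeout_s}s")
```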
Lists datasets in a project.
Cancels an incomplete job.
Alias of BQ_LIST_ROWS with maxResults default 100 for quick previews.
Starts an async query job (jobs.insert). Returns job id and status. Poll with BQ_GET_JOB and fetch rows with BQ_GET_QUERY_RESULTS.
Gets routine definition.
Do I need my own developer credentials to use BigQuery MCP with Adopt AI?
No, you can get started immediately using Adopt AI's built-in BigQuery integration. For production use, we recommend configuring your own service account credentials for greater control and security.
Can I connect BigQuery with other apps through Adopt AI?
Yes! Adopt AI supports multi-app workflows, so your AI agents can seamlessly move data between BigQuery and BI tools, data pipelines, cloud platforms, and more.
Is Adopt AI secure?
Absolutely. Adopt AI is SOC 2 Type 2 certified and ISO/IEC 27001 compliant, and adheres to EU GDPR, CCPA, and HIPAA standards. All data is encrypted in transit and at rest, ensuring the confidentiality, integrity, and availability of your data. Learn more here.
What happens if the BigQuery API changes?
Adopt AI maintains and updates all integrations automatically, so your agents always work with the latest API versions; no manual maintenance is required.
Do I need coding skills to set up the BigQuery integration?
Not at all. Adopt AI's zero-shot API discovery means your agents understand BigQuery's schema on first contact. Setup takes minutes with no code required.
How do I set up custom BigQuery MCP in Adopt AI?
For a step-by-step guide on creating and configuring your own BigQuery service account credentials with Adopt AI, see here.