Commit 672b57f

google-genai-bot authored and copybara-github committed
chore: add a sample BigQuery agent using BigQuery MCP tools
PiperOrigin-RevId: 856400285
1 parent 38d52b2 commit 672b57f

File tree

5 files changed: +126 −7 lines changed


contributing/samples/bigquery/README.md

Lines changed: 5 additions & 5 deletions
```diff
@@ -24,11 +24,11 @@ distributed via the `google.adk.tools.bigquery` module. These tools include:
 5. `get_job_info`

    Fetches metadata about a BigQuery job.

-5. `execute_sql`
+6. `execute_sql`

    Runs or dry-runs a SQL query in BigQuery.

-6. `ask_data_insights`
+7. `ask_data_insights`

    Natural language-in, natural language-out tool that answers questions
    about structured data in BigQuery. Provides a one-stop solution for generating
@@ -38,18 +38,18 @@ distributed via the `google.adk.tools.bigquery` module. These tools include:
    the official [Conversational Analytics API documentation](https://cloud.google.com/gemini/docs/conversational-analytics-api/overview)
    for instructions.

-7. `forecast`
+8. `forecast`

    Perform time series forecasting using BigQuery's `AI.FORECAST` function,
    leveraging the TimesFM 2.0 model.

-8. `analyze_contribution`
+9. `analyze_contribution`

    Perform contribution analysis in BigQuery by creating a temporary
    `CONTRIBUTION_ANALYSIS` model and then querying it with
    `ML.GET_INSIGHTS` to find top contributors for a given metric.

-9. `detect_anomalies`
+10. `detect_anomalies`

    Perform time series anomaly detection in BigQuery by creating a temporary
    `ARIMA_PLUS` model and then querying it with
```
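The renumbered `forecast` tool wraps BigQuery's `AI.FORECAST` table function. As a rough illustration of the kind of SQL such a tool issues, here is a small query-builder sketch; `forecast_sql` is a hypothetical helper (not the tool's actual code), and the named arguments follow the public `AI.FORECAST` documentation as an assumption about the generated query:

```python
def forecast_sql(table: str, data_col: str, timestamp_col: str, horizon: int = 30) -> str:
    """Build an AI.FORECAST query string (hypothetical helper, for illustration only)."""
    return (
        "SELECT *\n"
        "FROM AI.FORECAST(\n"
        f"  TABLE `{table}`,\n"
        f"  data_col => '{data_col}',\n"
        f"  timestamp_col => '{timestamp_col}',\n"
        f"  horizon => {horizon})"
    )

# Example: forecast daily temperature from a public NOAA table.
print(forecast_sql("bigquery-public-data.noaa_gsod.gsod2024", "temp", "date", horizon=7))
```

A string such as this could then be passed to the `execute_sql` tool, optionally as a dry run first to validate it without incurring query costs.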
Lines changed: 55 additions & 0 deletions (new file)

```markdown
# BigQuery MCP Toolset Sample

## Introduction

This sample agent demonstrates using ADK's `McpToolset` to interact with
BigQuery's official MCP endpoint, allowing an agent to access and execute
tools by leveraging the Model Context Protocol (MCP). These tools include:

1. `list_dataset_ids`

   Fetches BigQuery dataset ids present in a GCP project.

2. `get_dataset_info`

   Fetches metadata about a BigQuery dataset.

3. `list_table_ids`

   Fetches table ids present in a BigQuery dataset.

4. `get_table_info`

   Fetches metadata about a BigQuery table.

5. `execute_sql`

   Runs or dry-runs a SQL query in BigQuery.

## How to use

Set up your project and local authentication by following the guide
[Use the BigQuery remote MCP server](https://docs.cloud.google.com/bigquery/docs/use-bigquery-mcp).
This agent uses Application Default Credentials (ADC) to authenticate with the
BigQuery MCP endpoint.

Set up environment variables in your `.env` file for using
[Google AI Studio](https://google.github.io/adk-docs/get-started/quickstart/#gemini---google-ai-studio)
or
[Google Cloud Vertex AI](https://google.github.io/adk-docs/get-started/quickstart/#gemini---google-cloud-vertex-ai)
as the LLM service for your agent. For example, for Google AI Studio you
would set:

* GOOGLE_GENAI_USE_VERTEXAI=FALSE
* GOOGLE_API_KEY={your api key}

Then run the agent using `adk run .` or `adk web .` in this directory.

## Sample prompts

* which weather datasets exist in bigquery public data?
* tell me more about noaa_lightning
* which tables exist in the ml_datasets dataset?
* show more details about the penguins table
* compute penguins population per island.
```
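The `.env` entries described in the README are plain `KEY=VALUE` pairs. ADK's quickstart tooling loads them for you, but as an illustration of the format, here is a minimal, hypothetical parser sketch (not part of the sample):

```python
def parse_env(text: str) -> dict[str, str]:
    """Parse simple KEY=VALUE lines, skipping blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

sample = "GOOGLE_GENAI_USE_VERTEXAI=FALSE\nGOOGLE_API_KEY=your-api-key\n"
print(parse_env(sample))
# → {'GOOGLE_GENAI_USE_VERTEXAI': 'FALSE', 'GOOGLE_API_KEY': 'your-api-key'}
```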
Lines changed: 15 additions & 0 deletions (new file)

```python
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from . import agent
```
Lines changed: 51 additions & 0 deletions (new file)

```python
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from google.adk.agents.llm_agent import LlmAgent
from google.adk.tools.mcp_tool.mcp_session_manager import StreamableHTTPConnectionParams
from google.adk.tools.mcp_tool.mcp_toolset import McpToolset
import google.auth
import google.auth.transport.requests

BIGQUERY_AGENT_NAME = "adk_sample_bigquery_mcp_agent"
BIGQUERY_MCP_ENDPOINT = "https://bigquery.googleapis.com/mcp"
BIGQUERY_SCOPE = "https://www.googleapis.com/auth/bigquery"

# Initialize the tools to use the application default credentials.
# https://cloud.google.com/docs/authentication/provide-credentials-adc
credentials, project_id = google.auth.default(scopes=[BIGQUERY_SCOPE])
credentials.refresh(google.auth.transport.requests.Request())
oauth_token = credentials.token

bigquery_mcp_toolset = McpToolset(
    connection_params=StreamableHTTPConnectionParams(
        url=BIGQUERY_MCP_ENDPOINT,
        headers={"Authorization": f"Bearer {oauth_token}"},
    )
)

# The variable name `root_agent` determines what your root agent is for the
# debug CLI.
root_agent = LlmAgent(
    model="gemini-2.5-flash",
    name=BIGQUERY_AGENT_NAME,
    description=(
        "Agent to answer questions about BigQuery data and models and execute"
        " SQL queries using MCP."
    ),
    instruction="""\
You are a data science agent with access to several BigQuery tools provided via MCP.
Make use of those tools to answer the user's questions.
""",
    tools=[bigquery_mcp_toolset],
)
```
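Note that the sample above fetches the OAuth token once at import time, so a long-running agent could eventually hold an expired token. A hedged sketch of a refresh-aware alternative, built on the same `google-auth` surface (`credentials.valid`, `credentials.refresh`, `credentials.token`); `bearer_header` is a hypothetical helper name, not part of the sample:

```python
def bearer_header(credentials) -> dict[str, str]:
    """Return an Authorization header, refreshing the credentials if expired."""
    if not credentials.valid:
        # Deferred import so the helper is importable without google-auth installed.
        import google.auth.transport.requests
        credentials.refresh(google.auth.transport.requests.Request())
    return {"Authorization": f"Bearer {credentials.token}"}
```

Because `McpToolset` is handed a static `headers` dict, plugging this in would require rebuilding the connection params (or a toolset that accepts a header callback); the sketch only shows the refresh check itself.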

src/google/adk/cli/cli_tools_click.py

Lines changed: 0 additions & 2 deletions
```diff
@@ -1291,7 +1291,6 @@ async def _lifespan(app: FastAPI):
         host=host,
         port=port,
         reload=reload,
-        log_level=log_level.lower(),
     )

     server = uvicorn.Server(config)
@@ -1368,7 +1367,6 @@ def cli_api_server(
         host=host,
         port=port,
         reload=reload,
-        log_level=log_level.lower(),
     )
     server = uvicorn.Server(config)
     server.run()
```
