This document provides a comprehensive guide to setting up and running the project.
Before you begin, ensure you have the following tools installed:
- Python 3.12+
- Google Cloud SDK to manage your Google Cloud resources.
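If you're not sure which interpreter your shell will pick up, a quick sanity check can help. This is an illustrative snippet, not part of the project; the helper name is ours:

```python
import sys

MINIMUM = (3, 12)  # matches the Python 3.12+ prerequisite above

def meets_minimum(version_info, minimum=MINIMUM):
    """True when the interpreter's major.minor is at or above the floor."""
    return tuple(version_info[:2]) >= minimum

if __name__ == "__main__":
    if meets_minimum(sys.version_info):
        print("Python version OK")
    else:
        print("Python 3.12+ required; found", sys.version.split()[0])
```

Run it with the same `python` you intend to use for the virtual environment later on.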
Clone the Repository:

```shell
git clone https://github.com/NucleusEngineering/A2A_BigQuery_Agent.git
cd A2A_BigQuery_Agent/kitchen_agent
```

To start testing the agent, this project requires specific GCP infrastructure as well as correctly set environment variables.
Configure Environment Variables:

- Copy `dotenv` to `.env`:

  ```shell
  cp dotenv .env
  ```

- Edit the newly created `.env` file and fill in your `GOOGLE_CLOUD_PROJECT` and `GOOGLE_CLOUD_LOCATION`. Any region works for the location, for example `us-central1`. The other variables should stay unchanged.
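At runtime the agent reads these values from the environment. As a sketch of what that involves (the helper below is illustrative and not part of the project — real projects typically use a library like `python-dotenv`, which also handles quoting and interpolation):

```python
import os

def load_dotenv_file(path):
    """Minimal .env parser: read KEY=VALUE lines into os.environ.

    Hypothetical helper for illustration only.
    """
    loaded = {}
    with open(path) as fh:
        for raw in fh:
            line = raw.strip()
            # Skip blank lines and comments.
            if not line or line.startswith("#"):
                continue
            key, _, value = line.partition("=")
            loaded[key.strip()] = value.strip()
    os.environ.update(loaded)
    return loaded
```

After something like `load_dotenv_file(".env")`, `os.environ["GOOGLE_CLOUD_PROJECT"]` holds the project ID you filled in.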
Authenticate the Google Cloud CLI:

- Log in to your Google Cloud account:

  ```shell
  gcloud auth application-default login
  ```

- Set the active Google Cloud project:

  ```shell
  gcloud config set project <your-gcp-project-id>
  ```
Create and fill a BigQuery table:

- These commands create a BigQuery dataset and table in the project and location you configured:

  ```shell
  bq --location=$GOOGLE_CLOUD_LOCATION mk --dataset $GOOGLE_CLOUD_PROJECT:kitchen_inventory
  bq mk --location=$GOOGLE_CLOUD_LOCATION --table $GOOGLE_CLOUD_PROJECT:kitchen_inventory.fruits name:STRING,quantity:INTEGER
  ```

- Run this command to populate the table with the data the agent needs (make sure you are in the `kitchen_inventory` directory):

  ```shell
  bq load --source_format=CSV \
    $GOOGLE_CLOUD_PROJECT:kitchen_inventory.fruits \
    ./fruits_data.csv \
    name:STRING,quantity:INTEGER
  ```
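The load step expects `fruits_data.csv` to match the two-column `name:STRING,quantity:INTEGER` schema. As a sanity check before running `bq load`, you could validate a file against that shape with plain Python; the helper and the sample rows below are made-up illustrations, not the repository's actual data:

```python
import csv
import io

def validate_fruits_csv(text):
    """Check each row against the name:STRING,quantity:INTEGER schema.

    Returns the parsed rows; raises ValueError on a malformed row.
    Hypothetical helper -- bq performs its own validation on load.
    """
    rows = []
    for lineno, row in enumerate(csv.reader(io.StringIO(text)), start=1):
        if len(row) != 2:
            raise ValueError(f"line {lineno}: expected 2 columns, got {len(row)}")
        name, quantity = row
        rows.append((name, int(quantity)))  # int() raises if not an integer
    return rows

# Made-up sample rows in the expected shape:
sample = "apple,10\nbanana,25\n"
print(validate_fruits_csv(sample))  # → [('apple', 10), ('banana', 25)]
```

If a row has the wrong column count or a non-integer quantity, the helper raises before you spend a round-trip on a failed `bq load`.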
Create a virtual environment & install the requirements:

- Navigate back to the root directory (`A2A_BigQuery_Agent`):

  ```shell
  cd ..
  ```

- Create a virtual environment & activate it:

  ```shell
  uv venv --python 3.12
  source .venv/bin/activate
  ```

- Install the necessary libraries:

  ```shell
  pip install -r requirements.txt
  ```
Go to TUTORIAL.md to run the agent.