Commit 46d9a77

Merge pull request #13618 from bhandarivijay-png:ai-gsutil-migration-1f4c60eec73c4db7a4ef14ed8519d40f

Author: TF Object Detection Team
PiperOrigin-RevId: 884853554
Parents: dd7e694 + 4619290

10 files changed: 23 additions, 23 deletions
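Every change in this commit follows one mechanical rewrite. As a rough sketch (not an official table; `gcloud storage` runs transfers in parallel by default, so gsutil's `-m` flag is simply dropped), the substitutions applied across the repo can be summarized as:

```python
# Hypothetical migration map covering the substitutions seen in this commit.
GSUTIL_TO_GCLOUD = {
    "gsutil cp -r": "gcloud storage cp --recursive",
    "gsutil rm -r": "gcloud storage rm --recursive",
    "gsutil ls -r": "gcloud storage ls --recursive",
    "gsutil rsync -r": "gcloud storage rsync --recursive",
    "gsutil cp": "gcloud storage cp",
    "gsutil ls": "gcloud storage ls",
    "gsutil mv": "gcloud storage mv",
}

def migrate(command: str) -> str:
    """Rewrite one command line using the longest matching gsutil prefix."""
    # gcloud storage is parallel by default, so "-m" has no equivalent and is dropped.
    command = command.replace("gsutil -m ", "gsutil ", 1)
    for old in sorted(GSUTIL_TO_GCLOUD, key=len, reverse=True):
        if command.startswith(old):
            return GSUTIL_TO_GCLOUD[old] + command[len(old):]
    return command
```

Matching the longest prefix first keeps `gsutil cp -r` from being caught by the bare `gsutil cp` rule.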

official/legacy/bert/bert_cloud_tpu.md (3 additions, 3 deletions)

@@ -14,7 +14,7 @@ gcloud config set project ${PROJECT_ID}
 ```
 4. Create a Cloud Storage bucket using the following command:
 ```
-gsutil mb -p ${PROJECT_ID} -c standard -l europe-west4 -b on gs://your-bucket-name
+gcloud storage buckets create gs://your-bucket-name --project ${PROJECT_ID} --default-storage-class standard --location europe-west4 --uniform-bucket-level-access
 ```
 This Cloud Storage bucket stores the data you use to train your model and the training results.
 5. Launch a Compute Engine VM and Cloud TPU using the ctpu up command.

@@ -98,9 +98,9 @@ $ ctpu delete --zone=your-zone
 ```
 $ ctpu status --zone=your-zone
 ```
-4. Run gsutil as shown, replacing your-bucket with the name of the Cloud Storage bucket you created for this tutorial:
+4. Run gcloud storage as shown, replacing your-bucket with the name of the Cloud Storage bucket you created for this tutorial:
 ```
-$ gsutil rm -r gs://your-bucket
+$ gcloud storage rm --recursive gs://your-bucket
 ```
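The `gsutil mb` rewrite above is the least obvious substitution in the commit, since every flag is renamed rather than expanded. A minimal sketch of the flag mapping (the helper name and example values are hypothetical):

```python
def bucket_create_command(bucket: str, project: str, storage_class: str, location: str) -> str:
    """Build the gcloud equivalent of `gsutil mb -p ... -c ... -l ... -b on`.

    Flag mapping: -p -> --project, -c -> --default-storage-class,
    -l -> --location, -b on -> --uniform-bucket-level-access.
    """
    return (
        f"gcloud storage buckets create {bucket}"
        f" --project {project}"
        f" --default-storage-class {storage_class}"
        f" --location {location}"
        " --uniform-bucket-level-access"
    )
```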

official/projects/edgetpu/vision/serving/inference_visualization_tool.ipynb (2 additions, 2 deletions)

@@ -57,7 +57,7 @@
 " sandbox_path = web_path.split('/')[-1]\n",
 " !rm -f {sandbox_path}\n",
 " if web_path[:2] == \"gs\":\n",
-" !gsutil cp {web_path} {sandbox_path}\n",
+" !gcloud storage cp {web_path} {sandbox_path}\n",
 " else:\n",
 " !wget -v {web_path} --no-check-certificate\n",
 " return sandbox_path\n"

@@ -122,7 +122,7 @@
 },
 "source": [
 "MODEL_HOME='gs://tf_model_garden/models/edgetpu/checkpoint_and_tflite/vision/segmentation-edgetpu/tflite/default_argmax'\n",
-"!gsutil ls {MODEL_HOME}"
+"!gcloud storage ls {MODEL_HOME}"
 ],
 "execution_count": null,
 "outputs": []

official/projects/longformer/README.md (1 addition, 1 deletion)

@@ -21,7 +21,7 @@ pytorch longformer tokenized data to tf_records.
 Option 1. Use our saved checkpoint of `allenai/longformer-base-4096` stored in cloud storage
 
 ```bash
-gsutil cp -r gs://model-garden-ucsd-zihan/longformer-4096 .
+gcloud storage cp --recursive gs://model-garden-ucsd-zihan/longformer-4096 .
 ```
 Option 2. Create it directly

official/projects/waste_identification_ml/Deploy/detr_cloud_deployment/client/big_query_ops.py (1 addition, 1 deletion)

@@ -131,7 +131,7 @@ def upload_image_results_to_storage_bucket(
   try:
     commands = [
         f"rm -r {os.path.basename(input_directory)}",
-        f"gsutil -m cp -r {prediction_folder} {output_directory}",
+        f"gcloud storage cp --recursive {prediction_folder} {output_directory}",
         f"rm -r {prediction_folder}",
     ]
     subprocess.run(" && ".join(commands), shell=True, check=True)
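The changed line sits inside a shell pipeline joined with `" && "`, so a failure in any step aborts the rest under `check=True`. A minimal sketch of how that command string is assembled (the directory names below are hypothetical):

```python
import os

def build_upload_pipeline(input_directory: str, prediction_folder: str, output_directory: str) -> str:
    """Assemble the chained shell command that big_query_ops.py passes to subprocess.run."""
    commands = [
        f"rm -r {os.path.basename(input_directory)}",
        f"gcloud storage cp --recursive {prediction_folder} {output_directory}",
        f"rm -r {prediction_folder}",
    ]
    # " && " makes each step run only if the previous one succeeded.
    return " && ".join(commands)
```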

official/projects/waste_identification_ml/Deploy/detr_cloud_deployment/client/big_query_ops_test.py (1 addition, 1 deletion)

@@ -198,7 +198,7 @@ def test_upload_image_results_to_storage_bucket_success(self):
 
     self.mock_subprocess_run.assert_called_once()
     args, _ = self.mock_subprocess_run.call_args
-    self.assertIn(f"gsutil -m cp -r {pred_dir} {output_dir}", args[0])
+    self.assertIn(f"gcloud storage cp --recursive {pred_dir} {output_dir}", args[0])
 
   def test_upload_image_results_to_storage_bucket_failure(self):
     input_dir = "/tmp/input"

official/projects/waste_identification_ml/Deploy/detr_cloud_deployment/client/utils.py (1 addition, 1 deletion)

@@ -94,7 +94,7 @@ def setup_logger_and_directories(input_dir):
   """
 
   input_directory = (input_dir).rstrip('/\\')
-  command = f'gsutil -m cp -r {input_directory} .'
+  command = f'gcloud storage cp --recursive {input_directory} .'
   subprocess.run(command, shell=True, check=True)
   prediction_folder = os.path.basename(input_directory) + '_prediction'
   os.makedirs(prediction_folder, exist_ok=True)
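The `rstrip('/\\')` before the copy matters: a trailing separator would make `os.path.basename` return an empty string and break the derived folder name. A small sketch of that derivation (the function name is hypothetical):

```python
import os

def prediction_folder_name(input_dir: str) -> str:
    """Derive the local '<name>_prediction' folder as utils.py does."""
    # Strip trailing slashes/backslashes so basename yields the last path
    # component instead of an empty string.
    input_directory = input_dir.rstrip('/\\')
    return os.path.basename(input_directory) + '_prediction'
```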

official/projects/waste_identification_ml/llm_applications/milk_pouch_detection/README.md (1 addition, 1 deletion)

@@ -100,7 +100,7 @@ images to the source GCS bucket created by the script. The script output will
 provide the name of the bucket.
 
 ```bash
-gsutil cp your-local-image.jpg gs://<source-bucket-name>/
+gcloud storage cp your-local-image.jpg gs://<source-bucket-name>/
 ```
 ---

official/projects/waste_identification_ml/llm_applications/milk_pouch_detection/src/run_pipeline.sh (10 additions, 10 deletions)

@@ -29,14 +29,14 @@ cd milk_pouch_project
 # NOTE: Adjust the grep pattern if other image types are expected.
 echo "=== DEBUGGING START ==="
 echo "DEBUG: gcs_path variable is: '${gcs_path}'"
-echo "DEBUG: Running 'gsutil ls \"${gcs_path}\"' to check accessibility:"
-gsutil ls "${gcs_path}" || echo "❌ gsutil ls failed"
-echo "DEBUG: Running 'gsutil ls -r \"${gcs_path}\" | head -n 10' to check content:"
-gsutil ls -r "${gcs_path}" | head -n 10 || echo "❌ gsutil recursive ls failed"
+echo "DEBUG: Running 'gcloud storage ls \"${gcs_path}\"' to check accessibility:"
+gcloud storage ls "${gcs_path}" || echo "❌ gcloud storage ls failed"
+echo "DEBUG: Running 'gcloud storage ls --recursive \"${gcs_path}\" | head -n 10' to check content:"
+gcloud storage ls --recursive "${gcs_path}" | head -n 10 || echo "❌ gcloud storage recursive ls failed"
 echo "=== DEBUGGING END ==="
 
 echo "🖨️ Listing image files from GCS bucket: $gcs_path"
-mapfile -t all_gcs_files < <(gsutil ls -r "${gcs_path}" | grep -iE '\.(png|jpg|jpeg)$' | grep -v "/predictions/" | grep -v "/processed/")
+mapfile -t all_gcs_files < <(gcloud storage ls --recursive "${gcs_path}" | grep -iE '\.(png|jpg|jpeg)$' | grep -v "/predictions/" | grep -v "/processed/")
 num_files=${#all_gcs_files[@]}
 
 if (( num_files == 0 )); then

@@ -77,7 +77,7 @@ for (( i=0; i<num_files; i+=batch_size )); do
 
   # Copy current batch files from GCS
   echo "🖨️ Copying $num_in_batch files from GCS to input_images/..."
-  gsutil -m cp "${current_batch[@]}" input_images/
+  gcloud storage cp "${current_batch[@]}" input_images/
 
   # Extract objects
   echo "🔎 Extracting objects from images..."

@@ -96,7 +96,7 @@ for (( i=0; i<num_files; i+=batch_size )); do
   # Move predictions back to GCS
   if [ -d "predictions" ] && [ -n "$(find predictions -type f -print -quit)" ]; then
     echo "🖨️ Moving predictions for this batch back to GCS bucket: $gcs_path"
-    gsutil -m cp -r predictions/ "$gcs_path"
+    gcloud storage cp --recursive predictions/ "$gcs_path"
   else
     echo "⚠️ No predictions generated for this batch."
   fi

@@ -109,7 +109,7 @@ for (( i=0; i<num_files; i+=batch_size )); do
 
   target_root="${clean_gcs_path}processed/"
 
-  # Group files by their destination directory to optimize gsutil calls
+  # Group files by their destination directory to optimize gcloud storage calls
   declare -a current_move_batch
   current_move_dir=""
 

@@ -128,7 +128,7 @@ for (( i=0; i<num_files; i+=batch_size )); do
   # If the destination directory changes, flush the current batch
   if [[ "$dest_dir" != "$current_move_dir" ]]; then
     if (( ${#current_move_batch[@]} > 0 )); then
-      gsutil -m mv "${current_move_batch[@]}" "$current_move_dir"
+      gcloud storage mv "${current_move_batch[@]}" "$current_move_dir"
       current_move_batch=()
     fi
     current_move_dir="$dest_dir"

@@ -138,7 +138,7 @@ for (( i=0; i<num_files; i+=batch_size )); do
 
   # Flush any remaining files
   if (( ${#current_move_batch[@]} > 0 )); then
-    gsutil -m mv "${current_move_batch[@]}" "$current_move_dir"
+    gcloud storage mv "${current_move_batch[@]}" "$current_move_dir"
   fi
 
   unset current_move_batch
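The move loop above batches consecutive files that share a destination directory, so each directory costs one `gcloud storage mv` invocation instead of one per file. The grouping logic can be sketched in Python (names hypothetical; like the script, it assumes the listing is sorted so equal destinations are adjacent):

```python
def batch_by_destination(pairs):
    """Group consecutive (source, dest_dir) pairs, flushing whenever dest_dir changes."""
    batches = []
    current_dir, current = None, []
    for src, dest_dir in pairs:
        if dest_dir != current_dir:
            # Destination changed: flush the accumulated batch, start a new one.
            if current:
                batches.append((current_dir, current))
            current_dir, current = dest_dir, []
        current.append(src)
    if current:  # flush any remaining files
        batches.append((current_dir, current))
    return batches
```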

research/object_detection/dockerfiles/tf2_ai_platform/Dockerfile (2 additions, 2 deletions)

@@ -14,7 +14,7 @@ RUN apt-get update && apt-get install -y \
     python3-opencv \
     wget
 
-# Installs google cloud sdk, this is mostly for using gsutil to export model.
+# Installs google cloud sdk, this is mostly for using gcloud storage to export model.
 RUN wget -nv \
     https://dl.google.com/dl/cloudsdk/release/google-cloud-sdk.tar.gz && \
     mkdir /root/tools && \

@@ -29,7 +29,7 @@ RUN wget -nv \
 
 # Path configuration
 ENV PATH $PATH:/root/tools/google-cloud-sdk/bin
-# Make sure gsutil will use the default service account
+# Make sure gcloud storage will use the default service account
 RUN echo '[GoogleCompute]\nservice_account = default' > /etc/boto.cfg
 
 WORKDIR /home/tensorflow

research/object_detection/g3doc/oid_inference_and_evaluation.md (1 addition, 1 deletion)

@@ -58,7 +58,7 @@ access to the cloud bucket with the images. Then run:
 # From tensorflow/models/research/oid
 SPLIT=validation # Set SPLIT to "test" to download the images in the test set
 mkdir raw_images_${SPLIT}
-gsutil -m rsync -r gs://open-images-dataset/$SPLIT raw_images_${SPLIT}
+gcloud storage rsync --recursive gs://open-images-dataset/$SPLIT raw_images_${SPLIT}
 ```
 
 Another option for downloading the images is to follow the URLs contained in the
