mini lab : BigQuery : 5 - Solution code

Please find below working code for this lab. I hope it helps other learners. All the best.

Although the code below can be pasted and executed all at once, I recommend running the commands one by one so you can see the output of each step and learn from it.

=======================================================

#Set PROJECT_ID and REGION
export PROJECT_ID=$(gcloud config get-value project)
export REGION=$(gcloud compute project-info describe --format="value(commonInstanceMetadata.items[google-compute-default-region])")
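#Optional sanity check: the REGION lookup only returns a value if the project has
#google-compute-default-region set in its metadata, so it is worth echoing both
#variables before moving on.
echo "PROJECT_ID=${PROJECT_ID} REGION=${REGION}"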

#Set BUCKET_NAME to the name of the lab's Cloud Storage bucket
export BUCKET_NAME=${PROJECT_ID}-bucket
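#Optional: the later export step assumes this bucket already exists. In my run the
#lab had provisioned it; if yours has not, this sketch creates it only when the
#listing fails (and assumes the name ${PROJECT_ID}-bucket is still available).
gsutil ls gs://$BUCKET_NAME/ || gsutil mb -l $REGION gs://$BUCKET_NAME/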

#Load data from the local customers.csv file into the customer_details.customers table, autodetecting the schema.
bq load --autodetect --source_format=CSV customer_details.customers ./customers.csv
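#The load above assumes the customer_details dataset already exists (the lab
#normally creates it in an earlier task). If it does not, create it first, and
#you can inspect the schema that --autodetect produced afterwards.
bq mk --dataset customer_details
bq show --schema --format=prettyjson customer_details.customers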

#Create the customer_details.male_customers table containing only the male customers.
bq query --nouse_legacy_sql 'CREATE TABLE customer_details.male_customers AS SELECT CustomerID, gender FROM customer_details.customers WHERE gender = "Male"'
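#Alternative sketch (not required by the lab): the same table can be produced by
#sending the SELECT to a destination table instead of using CREATE TABLE ... AS.
#Run one or the other, not both, since the destination table must not already
#exist unless you also pass --replace.
bq query --nouse_legacy_sql --destination_table=customer_details.male_customers 'SELECT CustomerID, gender FROM customer_details.customers WHERE gender = "Male"'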

#Query the customer_details.male_customers table to verify its contents
bq query --use_legacy_sql=false 'SELECT * FROM `customer_details.male_customers` LIMIT 1000'
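#Optional quick check that the filter worked: grouping by gender should return
#only "Male" rows (column names assumed to match the CSV header).
bq query --nouse_legacy_sql 'SELECT gender, COUNT(*) AS row_count FROM customer_details.male_customers GROUP BY gender'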

#Export the newly created male_customers table to the existing Cloud Storage bucket in CSV format as exported_male_customers.csv.
bq extract --destination_format CSV customer_details.male_customers gs://$BUCKET_NAME/exported_male_customers.csv
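#Optional: once the extract job finishes, the first few lines of the exported
#file can be previewed straight from the bucket to confirm the contents.
gsutil cat gs://$BUCKET_NAME/exported_male_customers.csv | head -n 5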

#List the bucket's files to confirm exported_male_customers.csv is present
gsutil ls gs://$BUCKET_NAME/

================================================================


That is brilliant, thanks for the instructions. I was stuck on the last piece. Have you been able to complete mini lab : BigQuery : 6? I am stuck on that one too.
