
How can I get logs from the command line for a batch job?

I know how to open the logs in the Logs Explorer, but I'd like to use the CLI command `gcloud logging read`.

However, this command requires specifying both the "--view" and "--bucket" flags. How do I find the values of those arguments for a given job? It looks like the Logs Explorer scopes logs by project rather than by log bucket.

Alternatively, is there a way for me to not have to specify the "--view" and "--bucket" flags?
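For context, this is roughly the invocation I've been trying. The `_Default` bucket, `_AllLogs` view, and `global` location are just my guesses at the project defaults, and the filter is only a placeholder:

# filter is a placeholder; bucket/view/location are the assumed project defaults
gcloud logging read 'severity>=DEFAULT' \
  --bucket=_Default \
  --view=_AllLogs \
  --location=global \
  --limit=10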

Hi vedantroy-genmo, to view the logs generated by a Batch job, the "--view" and "--bucket" flags are not required. You need the job's UID [1], which is generated after the job is created. You can then filter logs by labels.job_uid (see [2]), e.g.:

gcloud logging read 'labels.job_uid=<your_job_uid>'

[1] 
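As a rough end-to-end sketch, assuming your job is named my-batch-job in us-central1 (both placeholders) and that the UID is exposed as the `uid` field on the job resource:

# Look up the job's UID (field name "uid" assumed here)
JOB_UID=$(gcloud batch jobs describe my-batch-job \
  --location=us-central1 \
  --format='value(uid)')

# Filter the job's logs by that UID
gcloud logging read "labels.job_uid=${JOB_UID}" --limit=50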


Thanks, this worked. Is there any way to select a specific field from the logs (to avoid large egress fees when analyzing them)?

Using `--format=json(textPayload)` works, but I'm not sure whether that projection happens only on the client side (and thus still incurs the large egress).
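For reference, this is roughly what I'm running now, using the label filter from the example above (the UID is a placeholder):

# --format projects the output to only textPayload, but may be client-side
gcloud logging read 'labels.job_uid=<your_job_uid>' \
  --format='json(textPayload)' \
  --limit=1000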