I know how to open the logs in the logs explorer, but I'd like to use the CLI command `gcloud logging read`.
However, this command requires specifying both the "--view" and "--bucket" flag. How do I find the value of those arguments for a given job? It looks like the logs explorer scopes logs by project instead of storage.
Alternatively, is there a way for me to not have to specify the "--view" and "--bucket" flags?
Hi vedantroy-genmo, to view the logs generated by a Batch job, the "--view" and "--bucket" flags are not required. You need the job's UID, which is generated when the job is created. You can then filter logs by labels.job_uid, e.g.:
gcloud logging read 'labels.job_uid=<your_job_uid>'
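In case it helps, you can also look the UID up from the CLI instead of the console. The sketch below assumes the `gcloud batch jobs describe` surface and a `uid` field on the job resource, and uses placeholder job/location names:

```shell
# Fetch the UID of an existing Batch job (my-job / us-central1 are placeholders)
JOB_UID=$(gcloud batch jobs describe my-job \
    --location=us-central1 \
    --format="value(uid)")

# Read the logs for that job by label
gcloud logging read "labels.job_uid=${JOB_UID}" --limit=50
```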
Thanks, this worked. Is there any way to select a specific field from the logs (to avoid large egress fees when analyzing)?
Using `--format=json(textPayload)` works, but I'm not sure whether that projection is applied on the client side (and thus still incurs the full egress).
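For what it's worth, my understanding is that `--format` is a client-side output transform in gcloud, so it shapes what is printed but not what the Logging API sends back. The only server-side levers I'm aware of in `gcloud logging read` are the filter expression and `--limit`, so narrowing those is what actually reduces transferred data. A hedged sketch:

```shell
# --format is applied by gcloud after the API response arrives, so it does not
# shrink the data transferred; tightening the filter and --limit does.
gcloud logging read \
    'labels.job_uid=<your_job_uid> AND severity>=ERROR' \
    --limit=100 \
    --format="value(textPayload)"
```

The `severity>=ERROR` clause is just an illustration of narrowing the filter; adjust it to whatever subset of entries you actually need.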