I am writing a Cloud Function in Python that writes data to BigQuery. To avoid quota limits, I am trying to use the BigQuery Storage Write API, following this documentation. The destination table has DATETIME columns, but that type does not seem to be supported by protobuf, so I send timestamps in my stream instead.
I have Cloud Logging set to DEBUG level in my Cloud Function:

import logging
import google.cloud.logging as cloud_logging

# Route standard Python logging through Cloud Logging at DEBUG level
logging_client = cloud_logging.Client()
logging_client.setup_logging(log_level=logging.DEBUG)
When launching my Cloud Function, I get the following error in the function logs:
google.api_core.exceptions.Unknown: None There was a problem opening the stream. Try turning on DEBUG level logs to see the error.
How can I get the logs from the BigQuery Storage Write API in a Cloud Function? Also, can BigQuery automatically convert timestamps to DATETIME during the insert?
The error message you're encountering, "google.api_core.exceptions.Unknown: None There was a problem opening the stream. Try turning on DEBUG level logs to see the error," indicates that something went wrong while opening the data stream for writing to BigQuery with the Storage Write API.
You already have DEBUG-level logging configured through setup_logging. Redeploy your Cloud Function, reproduce the error, and review the DEBUG logs: they should contain the underlying exception that prevented the stream from opening. One caveat: setup_logging configures Python's logging module, but the error you are seeing comes from the gRPC channel underneath the client, whose tracing is controlled by environment variables rather than Python logging; see the sketch below.
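A minimal sketch for surfacing more detail (the logger names reflect the google-cloud-bigquery-storage client as I know it; verify them against your installed version):

import logging
import os

# gRPC core tracing: read by the gRPC runtime at startup and written
# to stderr, which Cloud Functions forwards to Cloud Logging. Setting
# these in the function's environment variables is more reliable than
# setting them in code after grpc has already been imported.
os.environ["GRPC_VERBOSITY"] = "DEBUG"
os.environ["GRPC_TRACE"] = "all"

# Python-level loggers used by the write client and its bidi stream
logging.getLogger("google.cloud.bigquery_storage_v1").setLevel(logging.DEBUG)
logging.getLogger("google.api_core.bidi").setLevel(logging.DEBUG)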
As for your second question: BigQuery does not automatically convert timestamps to DATETIME during insertion; the values you write must already match the column type. With the Storage Write API, a TIMESTAMP column expects an int64 containing microseconds since the Unix epoch, while a DATETIME column can be written as a civil-time string such as "2023-11-07 10:00:00" (no timezone suffix), so DATETIME columns are in fact usable from protobuf. If your data holds timestamps in another representation, transform them before writing.
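For reference, a minimal sketch of producing both representations from a Python datetime (the variable names are illustrative, not tied to any particular proto schema):

from datetime import datetime, timezone

dt = datetime(2023, 11, 7, 10, 0, 0, tzinfo=timezone.utc)

# TIMESTAMP column: int64 microseconds since the Unix epoch
timestamp_micros = int(dt.timestamp() * 1_000_000)

# DATETIME column: civil-time string without a timezone suffix
datetime_str = dt.strftime("%Y-%m-%d %H:%M:%S")  # "2023-11-07 10:00:00"

If you would rather keep sending epoch timestamps into a TIMESTAMP column, you can still convert at query time in SQL with DATETIME(timestamp_expression).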