
Storage Write API ValueError generator already executing

I'm writing to a BigQuery table as shown here:

 

try:
    insert_rows_write_api(dataset_id, HISTORY_WRITE_API_TABLE_ID, rows_to_insert)
except Exception as e:
    logger.warning("Error inserting history with storage write api: %s" % e)
    return JsonResponse(dict(success=True))

 

`insert_rows_write_api` uses the default stream and writes to BigQuery via `AppendRowsStream`.

I'm seeing a high number of occurrences of this error:

ValueError: generator already executing
at grpc/_channel.py in consume_request_iterator at line 273

The relevant excerpt from grpc/_channel.py:

 

...
while True:
    return_from_user_request_generator_invoked = False
    try:
        # The thread may die in user-code. Do not block fork for this.
        cygrpc.enter_user_request_generator()
        request = next(request_iterator)
    except StopIteration:
        break
    except Exception:  # pylint: disable=broad-except
        cygrpc.return_from_user_request_generator()
        return_from_user_request_generator_invoked = True
...
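For context, CPython raises `ValueError: generator already executing` whenever `next()` is called on a generator whose frame is currently running in another thread. In the excerpt above, the request generator is driven by gRPC's internal consumer thread, so this error usually means the same request generator (and hence the same `AppendRowsStream`) is being used from more than one thread at once. A minimal, gRPC-free sketch that reproduces the error deterministically:

```python
import threading

started = threading.Event()
release = threading.Event()

def request_generator():
    # Stand-in for a gRPC request generator.
    while True:
        started.set()   # signal: the generator frame is now executing
        release.wait()  # stay suspended *inside* the frame, not at a yield
        yield "request"

gen = request_generator()
result = {}

def first_consumer():
    next(gen)  # enters the generator and blocks at release.wait()

t = threading.Thread(target=first_consumer)
t.start()
started.wait()  # the generator frame is now live in the other thread

try:
    next(gen)   # a second concurrent next() on the same generator
except ValueError as e:
    result["error"] = str(e)

release.set()
t.join()
print(result["error"])  # → generator already executing
```

If your production server handles requests on multiple threads (unlike a single-threaded local test), sharing one `AppendRowsStream` across those threads can trigger exactly this.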

 

I'm also seeing these warning logs, which I think may be related to the error:

 

WARNING 12-01-2024 08:46:28 -> Error inserting history with storage write api: None There was a problem opening the stream. Try turning on DEBUG level logs to see the error.

and 

Error inserting history with storage write api: 400 Errors found while processing rows. Please refer to the row_errors field for details. The list may not be complete because of the size limitations. Entity: projects/musteri-hizmetleri/datasets/tiktak_alo_tech_com/tables/HISTORY_WRITE_API/_default

 

I'm not encountering the same error in local testing. What may be the reason? The table doesn't appear to be missing any data, though.

How do I enable DEBUG logging for only this warning? I don't think I can enable debug-level logging for the whole project.
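On that last question: Python's standard `logging` module lets you raise verbosity for one logger subtree without touching the rest of the project. The logger name below (`google.cloud.bigquery_storage_v1`) is an assumption — client libraries typically log under their module path, so check the exact name for your installed version:

```python
import logging

# Keep everything else at WARNING...
logging.basicConfig(level=logging.WARNING)

# ...but turn on DEBUG for just the (assumed) Storage Write API client logger.
logging.getLogger("google.cloud.bigquery_storage_v1").setLevel(logging.DEBUG)

print(logging.getLogger().getEffectiveLevel())  # → 30 (WARNING)
print(logging.getLogger("google.cloud.bigquery_storage_v1").getEffectiveLevel())  # → 10 (DEBUG)
```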

Solved
1 ACCEPTED SOLUTION

Looking at the low-level docs for the AppendRows API ... here, we find that the API returns diagnostic information. This suggests we should look at the place in your code where the AppendRows API is invoked: if an exception or bad response is returned, log the diagnostics in your own app. I'd also suggest checking the Cloud Error Logs to see if any errors are logged there. My gut says no, since the expectation is that when you append rows, your app traps any errors and handles them appropriately in your own logic. Our overall first step is to gather as much diagnostic information as possible.
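As a concrete sketch of "log the diagnostics in your own app", here is one hedged pattern: wrap the call site and record whatever the raised exception carries, rather than only its message. The `errors` and `details` attributes are read defensively with `getattr`, since their exact shape depends on the exception class your client version raises (the `append_fn` parameter is a hypothetical stand-in for the real call):

```python
import logging

logger = logging.getLogger(__name__)

def append_rows_with_diagnostics(append_fn, *args, **kwargs):
    """Invoke an AppendRows-style callable and, on failure, log every
    piece of diagnostic information attached to the exception."""
    try:
        return append_fn(*args, **kwargs)
    except Exception as e:
        logger.warning(
            "AppendRows failed: %r | errors=%s | details=%s",
            e,
            getattr(e, "errors", None),   # google.api_core exceptions often carry this
            getattr(e, "details", None),  # may include row-level error info
        )
        raise
```

The caller's existing try/except still decides whether to swallow the error; this wrapper only makes sure the full diagnostics land in the logs first.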

