
Cloud Logging list_entries is slow every 50th iteration through the generator

Hi,

I am running queries against a custom Cloud Logging bucket I created.

There are about 800 entries in this log bucket.

The queries complete in a reasonable total time of ~1.2 s. I have followed the guidance here: https://cloud.google.com/logging/docs/view/logging-query-language#optimize-queries, but it makes no difference to how the generator iterates.
My timing data shows that every 50th iteration through the generator returned by `list_entries` is slow, taking 300 ms to 4,000 ms, while the iterations in between take effectively no time.

I assume the page size is 50 and that each slow iteration is the client fetching the next page from the API. Is this expected?
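
For what it's worth, list_entries accepts an optional page_size parameter, so one experiment I have in mind is requesting a single large page and checking whether only the first iteration is slow. A sketch of the change (same call as in my code below, just with page_size added; 1000 is an arbitrary value large enough to cover all ~800 entries):

result_generator = gc_logging.Client(project=project_id).list_entries(
    resource_names=[f"projects/{project_id}"],
    filter_=query,
    page_size=1000)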

My code:

import datetime

import google.cloud.logging_v2 as gc_logging
...

def query_bucket(query):
    total_time_start = datetime.datetime.now()
    # list_entries returns a lazy iterator; no entries are fetched
    # from the API until we start looping over it.
    result_generator = gc_logging.Client(project=project_id).list_entries(
        resource_names=[f"projects/{project_id}"],
        filter_=query)
    query_time = int((datetime.datetime.now() - total_time_start).total_seconds() * 1000)

    per_result_time = result_time_start = datetime.datetime.now()
    results = []
    ms = []  # per-iteration timings, recorded only when non-zero
    for i, result in enumerate(result_generator):
        if isinstance(result.payload, dict):
            result.payload["timestamp"] = result.timestamp
            results.append(result.payload)
        # Time spent on this single iteration, in milliseconds.
        elapsed = int((datetime.datetime.now() - per_result_time).total_seconds() * 1000)
        if elapsed != 0:
            ms.append({"n": i, "ms": elapsed})
        per_result_time = datetime.datetime.now()

    result_time = int((datetime.datetime.now() - result_time_start).total_seconds() * 1000)
    total_time = int((datetime.datetime.now() - total_time_start).total_seconds() * 1000)
    ...
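
To pin down whether the slow iterations are page fetches, I am also thinking of timing each page instead of each entry, via the pages attribute of the iterator that list_entries returns. A rough sketch of what I mean, reusing project_id from above (I have not run this variant yet):

def time_pages(query):
    iterator = gc_logging.Client(project=project_id).list_entries(
        resource_names=[f"projects/{project_id}"],
        filter_=query)
    start = datetime.datetime.now()
    # iterator.pages yields one Page per backend fetch, so each timing
    # below should cover exactly one API round trip.
    for n, page in enumerate(iterator.pages):
        entries = list(page)  # entries in a fetched page are already in memory
        elapsed = int((datetime.datetime.now() - start).total_seconds() * 1000)
        print(f"page {n}: {len(entries)} entries, {elapsed} ms")
        start = datetime.datetime.now()

If each page comes back with ~50 entries and each fetch takes the 300-4,000 ms I am seeing on every 50th iteration, that would confirm the page-size theory.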

Thank you

-Darren
