
Log entry size exceeds the maximum size of 256.0K

In one of our GCP projects, we are seeing the following error in the logs:

Issue: Failed to process request with tag XXX_stdout: rpc error: code = InvalidArgument desc = Log entry with size 272.5K exceeds maximum size of 256.0K

It is occurring on the following resource:

container_name: fluentbit-gke
namespace_name: kube-system
resource_type: k8s_container

Could you please help us understand the cause of this issue and how to resolve it?


Hi @boopatma, there is a Google Cloud Logging limit on the size of each log entry (256 KB).

The GKE logging pipeline (specifically the fluentbit-gke container) checks the size of each log line; if it considers a log oversized, it truncates the log if it's in text format, or drops it entirely if it's in JSON format.

However, the final log entry is usually larger than the raw log line emitted by the workload, because a log entry also includes metadata (pod and container names, labels, timestamps, etc.). So the size can still exceed the limit even after truncation, and Cloud Logging rejects the entry with an error like the one you encountered.
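If it helps, you can surface these rejection errors in the Logs Explorer with a filter like the one below. This is a sketch; the exact `textPayload` wording may vary between fluentbit-gke versions, so adjust the match string to your error message:

```
resource.type="k8s_container"
resource.labels.namespace_name="kube-system"
resource.labels.container_name="fluentbit-gke"
textPayload:"exceeds maximum size"
```

The tag in the error message (e.g. `XXX_stdout`) usually encodes the pod name, which points you at the offending workload.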

To bring the log size below the limit, you can:

  1. Examine the container logs in your cluster and locate which workload is emitting large logs.
  2. Evaluate whether that workload can be modified to print shorter logs.
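For step 2, one option is to truncate log messages inside the application before they are emitted, so the entry (message plus metadata) stays under the limit. Below is a minimal sketch in Python; the `METADATA_HEADROOM` value is an assumption (a rough allowance for the metadata Cloud Logging attaches), not a documented figure, so tune it for your workload:

```python
MAX_ENTRY_BYTES = 256 * 1024      # Cloud Logging per-entry limit (256 KB)
METADATA_HEADROOM = 16 * 1024     # assumed allowance for labels/timestamps, etc.
MAX_MSG_BYTES = MAX_ENTRY_BYTES - METADATA_HEADROOM

def truncate_log(message: str, limit: int = MAX_MSG_BYTES) -> str:
    """Truncate a log message so its UTF-8 size stays within the byte limit."""
    data = message.encode("utf-8")
    if len(data) <= limit:
        return message
    marker = b"...[truncated]"
    # Cut to the byte budget, then drop any partial multi-byte character
    # left at the boundary when decoding.
    cut = data[: limit - len(marker)]
    return cut.decode("utf-8", errors="ignore") + marker.decode()

# Example: a ~300 KB message is reduced to fit the budget before logging.
oversized = "x" * (300 * 1024)
print(len(truncate_log(oversized).encode("utf-8")) <= MAX_MSG_BYTES)
```

If you can't change the application, an alternative is to have it write large payloads (stack dumps, request bodies) to a file or object storage and log only a short reference to them.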