
Streaming Dataflow has exponential retry functionality; can we tailor the number of retries?

Dear Google Cloud Support,

I'm using Streaming Dataflow and appreciate the built-in exponential retry functionality. However, my current use case requires more granular control over retries. Is it possible to customize the number of retries on a per-job or even per-error basis? I've explored workarounds like custom error handling, but the approach I have tried is not ideal. Please advise whether there are options for tailoring the number of retries in Streaming Dataflow's retry behavior.
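For context, the workaround I tried is roughly the shape below: catching exceptions inside the DoFn so the runner never retries the element. This is just a minimal sketch assuming the Beam Python SDK; `process_record` is a placeholder for my real logic. It avoids the endless retry loop, but it silently drops the failed records, which is why it's not ideal:

```python
import logging

import apache_beam as beam


def process_record(element):
    # Placeholder for the real per-element logic.
    if element is None:
        raise ValueError('bad record')
    return element


class SwallowErrorsDoFn(beam.DoFn):
    """Catches per-element failures so the streaming runner never
    retries the element; the downside is that failed records are
    dropped with nothing but a log line."""

    def process(self, element):
        try:
            yield process_record(element)
        except Exception:
            # No retry budget, no dead-letter output: the record is lost.
            logging.exception('Dropping element %r', element)
```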

Thank you for your time and consideration.


Note: The Dataflow service retries failed tasks up to four times in batch mode and an unlimited number of times in streaming mode. In batch mode, a work item that fails four times causes the job to fail; in streaming mode, a repeatedly failing work item can stall the pipeline indefinitely.
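In the meantime, a common mitigation is to enforce your own retry budget inside the DoFn and route elements that exhaust it to a dead-letter output via tagged outputs. This is only a sketch assuming the Beam Python SDK, not an official Dataflow setting; `BoundedRetryDoFn`, `risky`, and the `dead_letter` tag are illustrative names:

```python
import apache_beam as beam
from apache_beam import pvalue

DEAD_LETTER = 'dead_letter'  # tag for the failure output (name is arbitrary)


class BoundedRetryDoFn(beam.DoFn):
    """Retries an element a fixed number of times inside the DoFn and,
    once the budget is spent, emits it to a dead-letter output instead
    of re-raising, so the streaming runner never retries it forever."""

    def __init__(self, fn, max_attempts=3):
        self._fn = fn                    # per-element business logic
        self._max_attempts = max_attempts

    def process(self, element):
        last_error = None
        for _ in range(self._max_attempts):
            try:
                yield self._fn(element)  # success: emit to the main output
                return
            except Exception as err:     # re-raising would trigger runner retries
                last_error = err         # real code would back off between attempts
        yield pvalue.TaggedOutput(DEAD_LETTER, (element, repr(last_error)))


def risky(x):
    # Stand-in for real processing; fails on odd inputs.
    if x % 2:
        raise ValueError('odd input')
    return x * 10


with beam.Pipeline() as pipeline:
    results = (
        pipeline
        | beam.Create([1, 2, 3, 4])
        | beam.ParDo(BoundedRetryDoFn(risky)).with_outputs(DEAD_LETTER, main='ok')
    )
    results.ok | 'PrintOk' >> beam.Map(print)
    results[DEAD_LETTER] | 'PrintDeadLetter' >> beam.Map(print)
```

The dead-letter PCollection can then be written wherever you inspect failures (BigQuery, Pub/Sub, etc.), which effectively gives you a per-job, or even per-error, retry count without waiting on a product change.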

I see this related note for your use case. Since the retry count isn't configurable today, it might be a good idea to file this as a feature request for the product:

https://cloud.google.com/support/docs/issue-trackers