Hi Team,
We want to insert data into GCP BigQuery using the Storage Write API, and we have batch inserts working.
However, we don't want to send duplicate records into BigQuery, and there is no unique key in our records that we could use for deduplication. Please let us know how we can achieve this.
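
For context, here is a minimal sketch of the kind of batch append we have working today, assuming the Java client and the table's default stream; the project, dataset, table, and column names are placeholders:

```java
import com.google.api.core.ApiFuture;
import com.google.cloud.bigquery.storage.v1.AppendRowsResponse;
import com.google.cloud.bigquery.storage.v1.BigQueryWriteClient;
import com.google.cloud.bigquery.storage.v1.JsonStreamWriter;
import com.google.cloud.bigquery.storage.v1.TableName;
import org.json.JSONArray;
import org.json.JSONObject;

public class BatchAppendToDefaultStream {
  public static void main(String[] args) throws Exception {
    // Placeholder identifiers -- replace with a real project/dataset/table.
    TableName table = TableName.of("my-project", "my_dataset", "my_table");

    try (BigQueryWriteClient client = BigQueryWriteClient.create();
        // Passing the table name (rather than a stream name) targets the
        // table's default write stream.
        JsonStreamWriter writer =
            JsonStreamWriter.newBuilder(table.toString(), client).build()) {
      // One append() call per batch of rows.
      JSONArray batch = new JSONArray();
      batch.put(new JSONObject().put("col1", "value1"));
      batch.put(new JSONObject().put("col1", "value2"));

      // The default stream is at-least-once: if this call is retried after
      // a timeout, the same rows can be persisted twice -- which is where
      // our duplicates come from.
      ApiFuture<AppendRowsResponse> future = writer.append(batch);
      future.get(); // wait for the server to acknowledge the batch
    }
  }
}
```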
Thank you.
This article looks like it may be a good starting point.
Hi Kolban, thanks for your reply.
The linked article describes using a Kafka stream and its offsets, but we want to send the data in batches. Please help us with using the Storage Write API in batch mode.
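
For anyone following along, my understanding from the Storage Write API docs is that offsets are a feature of the API itself, not of Kafka; the article simply sources them from Kafka. With an application-created PENDING-type stream (the batch mode of the API), rows are buffered server-side until the stream is finalized and committed, and passing an explicit offset on each append lets the server reject a retried batch instead of persisting it twice. A sketch under those assumptions, with all identifiers as placeholders:

```java
import com.google.api.core.ApiFuture;
import com.google.cloud.bigquery.storage.v1.AppendRowsResponse;
import com.google.cloud.bigquery.storage.v1.BatchCommitWriteStreamsRequest;
import com.google.cloud.bigquery.storage.v1.BigQueryWriteClient;
import com.google.cloud.bigquery.storage.v1.CreateWriteStreamRequest;
import com.google.cloud.bigquery.storage.v1.JsonStreamWriter;
import com.google.cloud.bigquery.storage.v1.TableName;
import com.google.cloud.bigquery.storage.v1.WriteStream;
import java.util.List;
import org.json.JSONArray;
import org.json.JSONObject;

public class PendingStreamWithOffsets {
  public static void main(String[] args) throws Exception {
    // Placeholder identifiers -- replace with a real project/dataset/table.
    TableName table = TableName.of("my-project", "my_dataset", "my_table");

    try (BigQueryWriteClient client = BigQueryWriteClient.create()) {
      // A PENDING stream buffers rows server-side; nothing becomes visible
      // in the table until the stream is finalized and committed below.
      WriteStream stream =
          client.createWriteStream(
              CreateWriteStreamRequest.newBuilder()
                  .setParent(table.toString())
                  .setWriteStream(
                      WriteStream.newBuilder().setType(WriteStream.Type.PENDING).build())
                  .build());

      try (JsonStreamWriter writer =
          JsonStreamWriter.newBuilder(stream.getName(), client).build()) {
        long offset = 0; // next row position within this stream
        for (JSONArray batch : batches()) { // batches() is a hypothetical source of row batches
          // The explicit offset makes the append idempotent: retrying the
          // same batch at the same offset fails with ALREADY_EXISTS rather
          // than writing the rows a second time.
          ApiFuture<AppendRowsResponse> future = writer.append(batch, offset);
          future.get();
          offset += batch.length();
        }
      }

      // Finalize the stream, then atomically commit everything it buffered.
      client.finalizeWriteStream(stream.getName());
      client.batchCommitWriteStreams(
          BatchCommitWriteStreamsRequest.newBuilder()
              .setParent(table.toString())
              .addWriteStreams(stream.getName())
              .build());
    }
  }

  // Hypothetical stand-in for however the batches are produced upstream.
  private static List<JSONArray> batches() {
    JSONArray batch = new JSONArray();
    batch.put(new JSONObject().put("col1", "value1"));
    return List.of(batch);
  }
}
```

Note that offsets only deduplicate retries of the same append; since our records have no unique key, duplicates in the record content itself would still need to be handled upstream or after loading.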