
Use Pub/Sub as a Datastream destination

I was reading over some of the Datastream docs on creating streams, and they mention the following:

However, you'll soon be able to create connection profiles for other destinations, including BigQuery and Pub/Sub.

Is there a timeline on when Pub/Sub will be available as a destination? Secondly, do you know if there are any plans on supporting Pub/Sub message attributes when Pub/Sub does become available as a destination for Datastream? Thanks!

1 ACCEPTED SOLUTION

We're planning to make Pub/Sub destination available, but I can't confirm specific features or timelines at this time. 


12 REPLIES


What is the recommended workaround for now? Would it be to write data to Cloud Storage and use Pub/Sub notifications on the bucket (https://cloud.google.com/storage/docs/pubsub-notifications)? That seems quite wasteful of resources.
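For what it's worth, the workaround described here can be configured programmatically. Below is a minimal sketch that builds the notification configuration for a bucket, using the field names from the Cloud Storage JSON API's notification resource; the project, topic, and prefix names are placeholders, not anything from this thread.

```python
# Sketch of the Cloud Storage -> Pub/Sub notification workaround:
# Datastream writes change-event files to a GCS bucket, and a
# notification config on that bucket publishes an OBJECT_FINALIZE
# message to a Pub/Sub topic whenever a new file lands.
# Project/topic/prefix values below are placeholders.

def build_notification_config(project: str, topic: str,
                              prefix: str = "datastream/") -> dict:
    """Build the JSON body for the GCS notification configs API."""
    return {
        # Fully qualified Pub/Sub topic the bucket publishes to.
        "topic": f"//pubsub.googleapis.com/projects/{project}/topics/{topic}",
        # Fire only when Datastream finishes writing a new object.
        "event_types": ["OBJECT_FINALIZE"],
        # Restrict notifications to the Datastream output prefix.
        "object_name_prefix": prefix,
        # Include object metadata as JSON in the message payload.
        "payload_format": "JSON_API_V1",
    }

config = build_notification_config("my-project", "datastream-files")
print(config["topic"])
```

The same configuration can be created with the `google-cloud-storage` client (`bucket.notification(...).create()`) or `gsutil notification create`; the subscriber then reads each message, fetches the referenced object, and republishes its contents however it needs.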

Thanks!

 

Hi, we are looking forward to support for Pub/Sub as a Datastream destination. Until then, the following article describes a good workaround.

https://cloud.google.com/datastream/docs/implementing-datastream-dataflow-analytics

@etaim hi! Any updates on the timeline?

No updates at this time. Please feel free to reach out to your account team, or create an issue in our [public issue tracker](https://issuetracker.google.com/issues/new?component=1150595&template=0) to receive ongoing updates as they become available.

For anyone who is likewise interested: https://issuetracker.google.com/issues/278842503

Hi, is Pub/Sub available as a destination for Datastream in preview? If it is, could you please point me to the official documentation for it?

It's not available yet, and there's no ETA we can share at this time.

Okay, thanks for the info. We are looking forward to support for Pub/Sub as a Datastream destination.

I have a question regarding Datastream configuration. I've set up Datastream to write data to a GCS destination, which triggers a Cloud Function.

According to the documentation on Events and Streams, it states:

  • Update without Primary Key Changes: The change_type will be UPDATE.
  • Update with Primary Key Changes: The change_type will be UPDATE-INSERT.

However, in my case I'm receiving source-specific metadata where change_type is 'UPDATE-INSERT', even though the primary keys ('account_id' and 'id') are not changing.

For instance, the response I'm seeing includes 'primary_keys': ['account_id', 'id'] and 'change_type': 'UPDATE-INSERT'. Upon verifying the data flow in my table, these keys ('account_id', 'id') remain unchanged. Shouldn't the change_type in this scenario be 'UPDATE' instead?

Could anyone advise on how to address this issue?
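One way to sidestep this question is to make the consumer indifferent to the distinction. Below is a hedged sketch of how a Cloud Function reading Datastream's JSON output might treat INSERT, UPDATE, and UPDATE-INSERT uniformly as an upsert keyed on the event's primary-key columns, so the result is correct whether or not the keys actually changed. The `source_metadata`/`payload` field names follow the event format discussed in the question above; the in-memory `table` dict stands in for whatever sink you actually write to.

```python
# Defensive handling of Datastream change events: rather than relying
# on UPDATE vs. UPDATE-INSERT to signal a primary-key change, treat
# both (plus INSERT) as an idempotent upsert keyed on the event's
# primary-key columns. The event shape mirrors the JSON events
# described in the question; the dict-based "table" is illustrative.

def key_of(event: dict) -> tuple:
    """Extract the primary-key value tuple from a Datastream event."""
    keys = event["source_metadata"]["primary_keys"]  # e.g. ['account_id', 'id']
    return tuple(event["payload"][k] for k in keys)

def handle_event(event: dict, table: dict) -> None:
    """Apply one change event to an in-memory view of the table."""
    change_type = event["source_metadata"]["change_type"]
    if change_type in ("INSERT", "UPDATE", "UPDATE-INSERT"):
        # Upsert: correct whether or not the key actually changed.
        table[key_of(event)] = event["payload"]
    elif change_type == "DELETE":
        table.pop(key_of(event), None)

table = {}
handle_event(
    {"source_metadata": {"change_type": "UPDATE-INSERT",
                         "primary_keys": ["account_id", "id"]},
     "payload": {"account_id": 1, "id": 7, "balance": 42}},
    table,
)
```

Note this only addresses applying the change; it does not recover the pre-update values, which Datastream's events don't carry.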

Is there any alternative or workaround to get the old value of an updated field when the primary key doesn't change?

@etaim any news on this? Pub/Sub as a Datastream destination would be a great feature to have.