
Datastream to MySQL Cloud SQL private IP not working over private connectivity!

Hi! I'm having some issues with the same topic.

I'm using Terraform to create a Datastream setup with a private Cloud SQL MySQL instance, and there is no way to get the source connection profile to work.

The error is:

google_datastream_connection_profile.source_connection_profile_private: Creating...
google_datastream_connection_profile.source_connection_profile_private: Still creating... [10s elapsed]
google_datastream_connection_profile.source_connection_profile_private: Still creating... [20s elapsed]
╷
│ Error: Error waiting to create ConnectionProfile: Error waiting for Creating ConnectionProfile: {"@type":"type.googleapis.com/google.rpc.ErrorInfo","domain":"datastream.googleapis.com","metadata":{"message":"We timed out trying to connect to the data source. Make sure that the hostname and port configuration is correct, and that the data source is available.","originalMessage":"(2003, \"Can't connect to MySQL server on '10.4.240.3' (timed out)\")","time":"2023-07-26T13:21:39.083515Z","uuid":"cea16064-64bc-47ed-aed9-3bf7a816202b"},"reason":"CONNECTION_TIMEOUT"}
│ {"code":"VALIDATE_CONNECTIVITY","description":"Validates that Datastream can connect to the source database.","message":[{"code":"CONNECTION_TIMEOUT","level":"ERROR","message":"We timed out trying to connect to the data source. Make sure that the hostname and port configuration is correct, and that the data source is available.","metadata":{"original_error":"(2003, \"Can't connect to MySQL server on '10.4.240.3' (timed out)\")"}}],"state":"FAILED"}
│
│

The code I'm using is below, but there is no way to avoid the error on the google_datastream_connection_profile resource:

resource "google_compute_firewall" "default" {
project = local.project_id
name = "datastream-proxy-access"
network = google_compute_network.vpc-datastream.id

allow {
protocol = "tcp"
ports = ["3306"]
}

source_ranges = [google_datastream_private_connection.default.vpc_peering_config.0.subnet]
}

resource "google_datastream_private_connection" "default" {
project = local.project_id
display_name = "Connection profile Private"
location = local.region
private_connection_id = "my-private-connection"

vpc_peering_config {
vpc = google_compute_network.vpc-datastream.id
subnet = "10.0.0.0/29" # A free subnet for peering. (CIDR of /29)
}
}

resource "google_datastream_connection_profile" "source_connection_profile_private" {
project = local.project_id
display_name = "Source connection profile private"
location = local.region
connection_profile_id = "source-profile-private"

mysql_profile {
hostname = "10.4.240.3"
username = "datastream"
password = "xxxxxx"
}
}

I don't understand why Datastream can't reach the DB. I created a private VM instance in the same VPC and it reaches the DB perfectly, but Datastream doesn't.

On some sites I've seen examples saying that a Cloud SQL Proxy is needed. Is that so? I don't understand what role a VM with the Cloud SQL Proxy would play. I also tried creating one, but it didn't change anything.

# VM WITH CLOUD SQL PROXY
resource "google_compute_instance" "vm_cloud_sql_proxy" {
  project      = local.project_id
  name         = "datastream-proxy"
  machine_type = "e2-micro"
  zone         = local.zone

  tags = ["ssh"]

  boot_disk {
    initialize_params {
      image = "debian-cloud/debian-11"
    }
  }

  network_interface {
    network    = google_compute_network.vpc-datastream.id
    subnetwork = google_compute_subnetwork.subnet-datastream.id
  }

  service_account {
    # Google recommends custom service accounts with cloud-platform scope and permissions granted via IAM roles.
    email  = google_service_account.default.email
    scopes = ["cloud-platform"]
  }

  metadata_startup_script = <<EOT
#!/bin/sh
apt-get update
apt-get install -y wget
wget https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64 -O cloud_sql_proxy
chmod +x cloud_sql_proxy
./cloud_sql_proxy -instances=xxxxxxx:europe-west1:instance-private-stage-master=tcp:0.0.0.0:3306
EOT
}

Any idea what is missing? I'm beginning to think it can't be done.

Thank you very much


The error message suggests that the hostname and port configuration might be incorrect, or the data source might be unavailable.

Here are a few things you could check:

  1. MySQL Server Accessibility: Ensure that your MySQL server is running and accessible from the network where your Datastream service is running. You mentioned that a VM instance can reach the DB, but it's important to confirm that this VM is in the same network as your Datastream service.

  2. Firewall Rules: Check your firewall rules to ensure that the Datastream service is allowed to connect to your MySQL server. The firewall rule you've defined allows TCP traffic on port 3306, which is the default port for MySQL. However, make sure that this rule is correctly associated with the network and the MySQL server.

  3. Private Connection Configuration: In your google_datastream_private_connection resource, you've specified a subnet of "10.0.0.0/29" for VPC peering. Ensure that this subnet is correctly configured and does not overlap with other subnets in your VPC network.

  4. MySQL Server Configuration: The MySQL server should be configured to accept connections from the IP range of the Datastream service. This might require modifying the bind-address configuration in your MySQL server's configuration file.
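
One more thing worth checking, sketched below with the resource names from your snippet: if the connection profile is meant to go through the private connection rather than Datastream's default connectivity, it has to reference it explicitly with a private_connectivity block.

# Sketch only: the same profile as in your snippet, with the private connection
# referenced explicitly.
resource "google_datastream_connection_profile" "source_connection_profile_private" {
  project               = local.project_id
  display_name          = "Source connection profile private"
  location              = local.region
  connection_profile_id = "source-profile-private"

  mysql_profile {
    hostname = "10.4.240.3"
    username = "datastream"
    password = "xxxxxx"
  }

  private_connectivity {
    private_connection = google_datastream_private_connection.default.id
  }
}

Without that block, the profile isn't tied to the private connection at all.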

Regarding the use of Cloud SQL Proxy, it's typically used to connect to Google Cloud SQL instances from applications running outside of Google Cloud. If your MySQL server is a Cloud SQL instance, you might need to use the Cloud SQL Proxy to establish a connection.

If you are using the Cloud SQL Proxy, make sure the proxy is actually running on the VM and that it listens on an address Datastream can reach. The (legacy v1) proxy is configured with command-line flags rather than a configuration file, along the lines of the startup script in your snippet:


./cloud_sql_proxy -instances=xxxxxxx:europe-west1:instance-private-stage-master=tcp:0.0.0.0:3306

Here -instances takes the Cloud SQL connection name (PROJECT:REGION:INSTANCE), and tcp:0.0.0.0:3306 makes the proxy listen on port 3306 on all interfaces of the VM so that other hosts can reach it.

I have doubts about the 4 points you mention.

1) In my VPC I have only one subnet, 192.168.26.0/28. The other subnet that Datastream uses, 10.0.0.0/29, is not created in the VPC. I think Datastream creates it automatically, or is that not the case and do I need to create the subnet at the VPC level as well?

2) I'm convinced the firewall rule is correct: I'm allowing the Datastream network to access port 3306.

3) This is what I mentioned in point 1: do I need to create that network beforehand, or doesn't the Datastream Terraform resource create it?

4) My DB is a private Cloud SQL (MySQL) instance and the IP in the recipe is correct. Since it doesn't have public access, I shouldn't need to add the Datastream service range, right?

What I'm not clear about is how Datastream connects to my private DB.
In my recipe I tried to bring up a VM with the Cloud SQL Proxy, but I don't really understand why it would be needed if I'm creating the peering. Should I bring up this VM in the same network as Datastream?

Thank you very much

[screenshots of the peering setup attached]

The peering is created correctly, but the source connection to MySQL is not able to connect.

Hey @Dani3 , I can confirm you definitely need a reverse proxy in the VPC. That's because the peering routes are not transitive: both Cloud SQL and Datastream have routes into your VPC, but they do not have routes to each other.

I did struggle to find the documentation while implementing, but eventually found Google's proper guidance here: https://cloud.google.com/datastream/docs/private-connectivity
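
For reference, the pattern in that doc boils down to a small VM in the same VPC that forwards TCP traffic to the Cloud SQL private IP; the Datastream connection profile then points at that VM's internal IP instead of the database's. A rough Terraform sketch, reusing the resource names from the snippets above (the 10.4.240.3 address and the iptables-based forwarding are my assumptions to illustrate the idea, not the exact script from the doc):

resource "google_compute_instance" "datastream_reverse_proxy" {
  project      = local.project_id
  name         = "datastream-reverse-proxy"
  machine_type = "e2-micro"
  zone         = local.zone

  boot_disk {
    initialize_params {
      image = "debian-cloud/debian-11"
    }
  }

  network_interface {
    network    = google_compute_network.vpc-datastream.id
    subnetwork = google_compute_subnetwork.subnet-datastream.id
  }

  # Forward incoming TCP 3306 to the Cloud SQL private IP and masquerade the
  # return traffic, so Datastream only ever needs a route to this VM.
  metadata_startup_script = <<EOT
#!/bin/sh
apt-get update -qq && apt-get install -y iptables
echo 1 > /proc/sys/net/ipv4/ip_forward
iptables -t nat -A PREROUTING -p tcp --dport 3306 -j DNAT --to-destination 10.4.240.3:3306
iptables -t nat -A POSTROUTING -j MASQUERADE
EOT
}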

Thanks for the additional information. Let's clarify your doubts:

  1. Subnet: The subnet "10.0.0.0/29" you specified in the google_datastream_private_connection resource is used for VPC peering with Datastream. You don't need to create this subnet manually in your VPC. Datastream will create a peering connection using this subnet. Make sure this subnet does not overlap with any existing subnets in your VPC.

  2. Firewall Rule: Your firewall rule seems correct. It allows the Datastream service to connect to your MySQL server on port 3306.

  3. Private Connection Configuration: As mentioned in point 1, you don't need to create the "10.0.0.0/29" subnet manually. Datastream will handle this as part of the VPC peering process.

  4. MySQL Server Configuration: Since your MySQL server is a private Cloud SQL instance, you don't need to add the Datastream service range to the MySQL server's allowed IP ranges. The connection should be allowed through the VPC peering connection.

Regarding the use of Cloud SQL Proxy, it's typically used to provide secure access to your Cloud SQL instance from applications running outside of Google Cloud. In your case, since you're trying to connect from Datastream (which is a Google Cloud service) to a Cloud SQL instance, you shouldn't need to use the Cloud SQL Proxy.

The connection from Datastream to your private Cloud SQL instance should be established through the VPC peering connection. The Datastream service connects to the Cloud SQL instance using the private IP address of the instance, which should be accessible within the same VPC network.

If you're still having issues, it might be helpful to check the logs for the Datastream service and the Cloud SQL instance. These logs might provide additional details about the connection attempts and any errors that are occurring.

Hi, I got it to work. Indeed, I needed the VM with the Auth Proxy and, in the Datastream connection profile, to use the IP of that VM as the hostname.

What I haven't found much documentation on is how to do the import from Dataflow into another DB that is also private.

If the target DB is private, will I have the same problems? How will I be able to connect the Dataflow job to the target DB instance?

The documentation I found is this: https://cloud.google.com/dataflow/docs/guides/templates/provided/datastream-to-sql

And I have doubts about whether I can specify the networks with that template, since what I tried tells me that it can't find the network.

Failed to start the VM, launcher-2023072801385114654941323427813756, used for launching because of status code: INVALID_ARGUMENT, reason: Invalid Error: Message: Invalid value for field 'resource.networkInterfaces[0].network': 'global/networks/default'. The referenced network resource cannot be found. HTTP Code: 400.

Thank you very much

Hi!

I was able to solve it by creating the Auth Proxy VM and putting the IP of this new machine as the host of the Datastream DB connection 🙂

But now I would like to implement the other part, where Dataflow reads from the bucket and updates another target DB. The problem again is that this other Cloud SQL instance is private, and I haven't found a lot of documentation.

Do you know if Dataflow also needs an Auth Proxy to be able to connect to the target DB? How can I manage that?

I've only found this information: https://cloud.google.com/dataflow/docs/guides/templates/provided/datastream-to-sql

gcloud dataflow flex-template run test4 \
    --project=fb-ops-datastream-xxxxxx \
    --region=europe-west1 \
    --enable-streaming-engine \
    --network=vpc-datastream-stage \
    --subnetwork=subnet-datastream-stage \
    --template-file-gcs-location=gs://dataflow-templates-europe-west1/latest/flex/Cloud_Datastream_to_SQL \
    --parameters \
inputFilePattern=gs://my-bucket-xxxxx-v3/datastream/live-data,\
databaseHost=xxxxx,\
databaseUser=datastream,\
databasePassword=datastream

Failed to start the VM, launcher-2023072801582114519066258001821280, used for launching because of status code: INVALID_ARGUMENT, reason: Invalid Error: Message: Invalid value for field 'resource.networkInterfaces[0].subnetwork': 'subnet-datastream-stage'. The URL is malformed. HTTP Code: 400.

Thank you very much

Dataflow can connect to a private Cloud SQL instance using Serverless VPC Access. You would need to set up a Serverless VPC Access connector in the same project and region as your Dataflow job, and specify this connector when you run your Dataflow job.

The error message you're seeing indicates that the 'subnetwork' parameter in your command should be a fully qualified URL of the subnetwork, not just the subnetwork name. The URL should be in the following format:

https://www.googleapis.com/compute/v1/projects/PROJECT_ID/regions/REGION/subnetworks/SUBNETWORK

Hi,

I've created the Serverless VPC Access connector, but I still have many doubts that I don't know how to resolve, and some problems with the templates.

To keep it simple, I'm using a public DB as the target (later I will swap it for the private DB). Two things:

1) If I run the gcloud command, it gives me no problems in the sense that the job starts to be created, but then it fails with this error:

 

gcloud dataflow flex-template run test8 \
    --project=xx-ops-datastream-xxxxx \
    --region=europe-west1 \
    --enable-streaming-engine \
    --network=projects/xx-ops-datastream-xxxxx/global/networks/vpc-datastream-stage \
    --subnetwork=projects/xx-ops-datastream-xxxxx/regions/europe-west1/subnetworks/subnet-datastream-stage \
    --template-file-gcs-location=gs://dataflow-templates-europe-west1/latest/flex/Cloud_Datastream_to_SQL \
    --parameters \
inputFilePattern=gs://my-bucket-test-datastream-xxx/datastream/live-data,\
databaseHost=35.189.xxx.xxx,\
databaseUser=datastream,\
databasePassword=xxxx

 

Error with gcloud:

[Screenshot of the job error: java.sql.SQLException: Cannot create PoolableConnectionFactory (The connection attempt failed.)]

2) But if I run Terraform with the same template path, it is not able to find it. Could it be a bug, or am I doing something wrong?

resource "google_dataflow_job" "dataflow_job" {
  project   = local.project_id
  region    = local.region
  zone      = local.zone
  name      = "dataflow-test"
  on_delete = "cancel"
  template_gcs_path     = "gs://dataflow-templates-europe-west1/latest/flex/Cloud_Datastream_to_SQL"
  #template_gcs_path     = "gs://dataflow-templates/latest/flex/Cloud_Datastream_to_SQL"
  #template_gcs_path     = "gs://my-bucket-test-datastream-xxx/latest_flex_Cloud_Datastream_to_SQL"
  temp_gcs_location     = google_storage_bucket.tmp_dir_bucket.name
  service_account_email = google_service_account.default.email
  network               = google_compute_network.vpc-datastream.self_link
  subnetwork            = google_compute_subnetwork.subnet-datastream.self_link
  ip_configuration      = "WORKER_IP_PRIVATE"
  machine_type          = "n1-standard-1"

    parameters = {
      #gcsPubSubSubscription = "${google_pubsub_subscription.cdc_messages_subscription_new.id}"
      inputFilePattern = "gs://my-bucket-test-datastream-xxx/datastream/live-data"
      databaseType     = "mysql"
      databaseName     = "datastream"
      databaseHost     = "35.189.xx.xxx"
      databasePort     = "3306"
      databaseUser     = "datastream"
      databasePassword = "xxxx"
    }

}

 

Output:

google_dataflow_job.dataflow_job: Creating...
╷
│ Error: googleapi: Error 400: Invalid template file gs://dataflow-templates-europe-west1/latest/flex/Cloud_Datastream_to_SQL. Please specify a correct Google-provided template path from https://cloud.google.com/dataflow/docs/guides/templates/provided-templates or create your own template following https://cloud.google.com/dataflow/docs/guides/templates/creating-templates, failedPrecondition
│ 
│   with google_dataflow_job.dataflow_job,
│   on 6_dataflow.tf line 28, in resource "google_dataflow_job" "dataflow_job":
│   28: resource "google_dataflow_job" "dataflow_job" {
│ 
╵
Operation failed: failed running terraform apply (exit 1)

 

I've even tried uploading the template to a bucket of my own, and the error is the same: it doesn't find the template.

Any suggestions?

If the DB were private, where am I supposed to use the Serverless VPC Access connector? In the network field? And do I leave the subnet as it is?

Thank you very much

It seems like you're facing two main issues: one related to the connection to the database and the other related to the template path in Terraform. Let's address both:

1) Connection to the Database

The error message "java.sql.SQLException: Cannot create PoolableConnectionFactory (The connection attempt failed.)" indicates that Dataflow is unable to establish a connection to the MySQL database. Here are some things to check:

  • Database Host and Port: Ensure that the IP address and port number are correct.
  • Firewall Rules: Check if there are any firewall rules that might be blocking the connection from Dataflow to the MySQL database.
  • Database Credentials: Verify that the username and password are correct and that the user has the necessary permissions to connect to the database.
  • MySQL Configuration: Ensure that the MySQL server is configured to accept connections from the IP addresses used by Dataflow.

2) Template Path in Terraform

The error message "Invalid template file gs://dataflow-templates-europe-west1/latest/flex/Cloud_Datastream_to_SQL" suggests that the template path is incorrect. The Dataflow templates are region-specific, and the path you are using seems to be correct for the europe-west1 region.

However, the Terraform google_dataflow_job resource might not support the flex templates directly. You may need to create a custom template and host it in your own GCS bucket.

Here's how you can do that:

  1. Create a Custom Template: Follow the instructions to create a custom template based on the Datastream to SQL code.
  2. Upload the Template to GCS: Upload the custom template to a GCS bucket that you control.
  3. Update the Terraform Code: Update the template_gcs_path in your Terraform code to point to the custom template in your GCS bucket.

Regarding Serverless VPC Access

If the target DB were private, you would use Serverless VPC Access to allow Dataflow to connect to the private IP of the database. The network and subnetwork fields in your Terraform code would be used to specify the VPC and subnetwork that the Dataflow job should use. The Serverless VPC Access connector should be configured in the same VPC and region as the Dataflow job.
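
For completeness, the connector itself is just another resource with its own unused /28 range in the same region as the job. A minimal sketch, reusing the resource names from your earlier snippets (the connector name and range here are illustrative, not taken from your setup):

resource "google_vpc_access_connector" "dataflow_connector" {
  project       = local.project_id
  name          = "dataflow-connector"
  region        = local.region
  # Name of the VPC plus a /28 range that is not used by any subnet or peering.
  network       = google_compute_network.vpc-datastream.name
  ip_cidr_range = "10.8.0.0/28"
}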

Thank you for your answer. Here are my comments on the problems:

1) Database

Database Host and Port: correct, public IP and port 3306.
Firewall Rules: no rule is blocking; in fact, from my PC I can telnet to 3306 and the connection is open.
Database Credentials: user and password are correct; from outside I can connect.
MySQL Configuration: this database is public, and I've also tried with the authorized network 0.0.0.0/0, so that shouldn't be a problem.

2) Templates
The point here is that with the gcloud command there is no problem with the template path, but with Terraform there is.


I'm using the same path that exists:
gsutil ls gs://dataflow-templates-europe-west1/latest/flex/Cloud_Datastream_to_SQL
gs://dataflow-templates-europe-west1/latest/flex/Cloud_Datastream_to_SQL


As for creating my own custom template: I downloaded the template and uploaded it to my bucket. Isn't that how it's done?
Proof that I did the copy correctly (I guess) is that, again, with the gcloud command it worked for me, but with Terraform I have the same problem 😞

That's why I don't know what to do with the templates to make it work in Terraform. Can you test whether the template path works in Terraform for you?


3) Serverless VPC Access
About this, let me share the IP address ranges, because I'm not sure how Dataflow can connect with them.


I have a single subnet in my VPC:
subnet-datastream-stage = 192.168.26.0/28

I also have a private connection, which is the one I used to create my private DB:
10.4.240.0/20

Under private connections I also have the Datastream network:
10.0.0.0/29

And in the serverless connector I have this range:
dataflow-serverless-vpc = 10.8.0.0/28

The question is whether I did the right thing putting that range for the serverless network, or whether it should have been one of the three ranges above.

Does that look right to you, or should I put a specific range in the Serverless VPC connector?

Thank you very much

Hi,

More information about the problem with the templates. Here are some tests; I'm pretty sure the problem is with Flex Templates and Terraform.


1) The one that should work, but doesn't with Terraform:
template_gcs_path = "gs://dataflow-templates-europe-west1/latest/flex/Cloud_Datastream_to_SQL"

google_dataflow_job.dataflow_job: Creating...
╷
│ Error: googleapi: Error 400: Invalid template file gs://dataflow-templates-europe-west1/latest/flex/Cloud_Datastream_to_SQL. Please specify a correct Google-provided template path from https://cloud.google.com/dataflow/docs/guides/templates/provided-templates or create your own template following https://cloud.google.com/dataflow/docs/guides/templates/creating-templates, failedPrecondition


2) Without region
template_gcs_path = "gs://dataflow-templates/latest/flex/Cloud_Datastream_to_SQL"

│ Error: googleapi: Error 400: Invalid template file gs://dataflow-templates/latest/flex/Cloud_Datastream_to_SQL. Please specify a correct Google-provided template path from https://cloud.google.com/dataflow/docs/guides/templates/provided-templates or create your own template following https://cloud.google.com/dataflow/docs/guides/templates/creating-templates, failedPrecondition


3) No flex in the url
template_gcs_path = "gs://dataflow-templates/latest/Cloud_Datastream_to_SQL"

google_dataflow_job.dataflow_job: Creating...
╷
│ Error: googleapi: Error 404: (2a2981b91f466db6): Unable to open template file: gs://dataflow-templates/latest/Cloud_Datastream_to_SQL., notFound


4) This happens with any template inside /flex whether or not I set the region.
template_gcs_path = "gs://dataflow-templates-europe-west1/latest/flex/BigQuery_to_Bigtable"

template_gcs_path = "gs://dataflow-templates/latest/flex/BigQuery_to_Bigtable"

google_dataflow_job.dataflow_job: Creating...
╷
│ Error: googleapi: Error 400: Invalid template file gs://dataflow-templates/latest/flex/BigQuery_to_Bigtable. Please specify a correct Google-provided template path from https://cloud.google.com/dataflow/docs/guides/templates/provided-templates or create your own template following https://cloud.google.com/dataflow/docs/guides/templates/creating-templates, failedPrecondition


5) With other templates outside /flex
It finds them; logically the parameters are not what those templates expect, but it picks the template up fine.
template_gcs_path = "gs://dataflow-templates/latest/Word_Count"

google_dataflow_job.dataflow_job: Creating...
╷
│ Error: googleapi: Error 400: The template parameters are invalid.
│ Details:
│ [
│ {
│ "@type": "type.googleapis.com/google.dataflow.v1beta3.InvalidTemplateParameters",
│ "parameterViolations": [


template_gcs_path = "gs://dataflow-templates/latest/Cloud_Bigtable_to_GCS_Avro"

google_dataflow_job.dataflow_job: Creating...
╷
│ Error: googleapi: Error 400: The template parameters are invalid.
│ Details:
│ [
│ {
│ "@type": "type.googleapis.com/google.dataflow.v1beta3.InvalidTemplateParameters",
│ "parameterViolations": [
│ {
│ "description": "Missing required parameter",
│ "parameter": "bigtableProjectId"

 

List of templates in the bucket:
https://console.cloud.google.com/storage/browser/dataflow-templates/latest

So, it seems to be something specific to Flex Templates, but I don't know how to fix it.

Any idea??

Thank you very much

The issue you're facing with Terraform and Dataflow Flex Templates seems to be specific to the way Terraform interacts with Flex Templates. Based on the information provided, here are some potential solutions and considerations:

  1. Check Terraform Version and Provider Version: Ensure that you are using a version of Terraform and the Google Provider that supports Dataflow Flex Templates. You may need to update to the latest version.

  2. Use a Different Resource Type for Flex Templates: Terraform has a specific resource type for Dataflow Flex Template jobs (google_dataflow_flex_template_job). You may want to try using this resource type instead of the classic google_dataflow_job resource; a fuller sketch follows after this list. The skeleton is:

     resource "google_dataflow_flex_template_job" "dataflow_job" { /* configuration here */ }
  3. Verify Template Path: Ensure that the path to the Flex Template is correct and accessible. You may want to try downloading the template and uploading it to your own GCS bucket, then referencing that path in your Terraform configuration.

  4. Consider Creating a Custom Template: If the provided Flex Templates are not working with Terraform, you may want to consider creating a custom Flex Template that meets your specific requirements. You can follow the Google Cloud guide on creating custom templates.

  5. Consult Terraform Documentation and Community: Check the Terraform documentation for the Google Provider and consult the Terraform community (such as GitHub issues, forums, or Stack Overflow) for any known issues or workarounds related to Dataflow Flex Templates.

  6. Consider Using gcloud Command as a Temporary Workaround: If you are unable to resolve the issue with Terraform, you may want to consider using the gcloud command as a temporary workaround to deploy your Dataflow Flex Template Jobs. You can use a local-exec provisioner in Terraform to run the gcloud command.
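
As a starting point for option 2, here is a minimal sketch of the flex-template resource. It assumes the google-beta provider (where the google_dataflow_flex_template_job resource lives) and simply reuses the parameter values from your gcloud command; adjust them to your setup:

resource "google_dataflow_flex_template_job" "dataflow_job" {
  provider = google-beta

  project                 = local.project_id
  region                  = local.region
  name                    = "dataflow-test"
  on_delete               = "cancel"
  container_spec_gcs_path = "gs://dataflow-templates-europe-west1/latest/flex/Cloud_Datastream_to_SQL"

  parameters = {
    inputFilePattern = "gs://my-bucket-test-datastream-xxx/datastream/live-data"
    databaseType     = "mysql"
    databaseName     = "datastream"
    databaseHost     = "35.189.xx.xxx"
    databasePort     = "3306"
    databaseUser     = "datastream"
    databasePassword = "xxxx"
  }
}

The key difference from google_dataflow_job is container_spec_gcs_path, which points at the Flex Template spec file in GCS (the same path you pass to --template-file-gcs-location).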

Hi all!

Resurfacing this because I am having similar connectivity issues:


│ Error: Error waiting to create ConnectionProfile: Error waiting for Creating ConnectionProfile: {"@type":"type.googleapis.com/google.rpc.ErrorInfo","domain":"datastream.googleapis.com","metadata":{"message":"We timed out trying to connect to the data source. Make sure that the hostname and port configuration is correct, and that the data source is available.","originalMessage":"timeout expired\n","time":"2025-02-02T05:40:43.032118Z","uuid":"7ef2c48d-fe42-4fb3-be3b-c96566e7936d"},"reason":"CONNECTION_TIMEOUT"}
│ {"code":"VALIDATE_CONNECTIVITY","description":"Validates that Datastream can connect to the source database.","message":[{"code":"CONNECTION_TIMEOUT","level":"ERROR","message":"We timed out trying to connect to the data source. Make sure that the hostname and port configuration is correct, and that the data source is available.","metadata":{"original_error":"timeout expired\n"}}],"state":"FAILED"}

Here is my config:

resource "google_datastream_private_connection" "private_connection" {
display_name = "Datastream Private Connection"
location = var.region
project = var.project_id
private_connection_id = "datastream-private-conn"

vpc_peering_config {
vpc = local.datastream_vpc
subnet = local.subnets_for_datastream
}
}

# Create VPC Peering between Datastream and Shared VPC
resource "google_compute_network_peering" "datastream_peering" {
name = "datastream-peering-${var.environment}"
network = local.private_networks_for_env
peer_network = "projects/${local.vpc_host_project_ids[var.environment]}/global/networks/servicenetworking-googleapis-com"
# Google's Managed Datastream VPC

import_custom_routes = true
export_custom_routes = true

depends_on = [google_datastream_private_connection.private_connection]
}
resource "google_datastream_connection_profile" "source" {
display_name = "CloudSQL Postgres Source"
location = var.region
project = var.project_id
connection_profile_id = "postgres-source-profile-${var.environment}"

postgresql_profile {
hostname = google_sql_database_instance.postgres_instance.private_ip_address
port = 5432
username = google_sql_user.postgres_user.name
password = random_password.db_password.result
database = google_sql_database.default_db.name
}

private_connectivity {
private_connection = google_datastream_private_connection.private_connection.id
}

depends_on = [
google_sql_database_instance.postgres_instance,
google_sql_database.default_db,
google_sql_user.postgres_user,
google_datastream_private_connection.private_connection
]
}

@alexberry-mpb mentioned that a reverse proxy is the only way. Is that still accurate?
@Dani3 Would you please be able to share your private_connection / peering config for reference?

I am able to access the private Cloud SQL instance from Cloud Run using similar peering, but somehow the datastream_connection_profile source fails to connect.

Thank you! Any suggestions on what I might be doing wrong would help!