
PostgreSQL to BigQuery Connection

I can't seem to connect a PostgreSQL source to BigQuery using the Data Transfer Service and/or Datastream.

I already have the connection details, as I have linked the source directly to Looker Studio. However, it would be great to have it in BigQuery as well, as the possibilities there are limitless. As mentioned, I already have the credentials (username, password, host, database name, port) and the certificates and key (as .pem files). These credentials and files are all I have, since the PostgreSQL source is managed by our affiliate.

Attempt 1. via Data Transfer Service

  • I tried filling out the information and the credentials, but there is no way to upload the certificates, which is (I think) why there's an error when trying to proceed or connect.

Attempt 2. via Datastream

  • I also tried creating a stream via Datastream. Again, I filled out the necessary information. We also created a connection profile, where the credentials are needed, but there's no option to upload the certificates.

I'm quite new to GCP, and I can't find a helpful step-by-step guide or how-to on this topic. Please help.


Hi @MacLopez,

Welcome to Google Cloud Community!

Google's Data Transfer Service (DTS) and Datastream for PostgreSQL don't have a straightforward UI option to upload client-side SSL certificates (.pem files) required for authentication or mutual TLS (mTLS).

These services generally handle server-side SSL verification (ensuring the GCP service connects to the correct PostgreSQL server using the server's certificate). For the actual connection and client authentication, they typically rely on other methods, such as username/password, IP allowlisting, or network-level security (VPC peering, VPN, SSH tunnels), rather than client-certificate authentication configured directly within the service UI.
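For contrast, this is roughly what client-certificate authentication looks like from a generic PostgreSQL client. This is only a minimal sketch using psycopg2; the hostname, credentials, and file paths are placeholders. The sslcert/sslkey pair is exactly the piece the DTS and Datastream UIs don't expose:

    import psycopg2

    # Placeholder connection details; the .pem paths stand in for the files
    # the affiliate provided.
    conn = psycopg2.connect(
        host="postgres.affiliate.example.com",
        port=5432,
        dbname="analytics",
        user="report_user",
        password="********",
        sslmode="verify-full",                 # verify the *server's* certificate
        sslrootcert="/secrets/server-ca.pem",  # CA certificate from the affiliate
        sslcert="/secrets/client-cert.pem",    # client certificate (no UI field for this)
        sslkey="/secrets/client-key.pem",      # client private key (no UI field for this)
    )
    with conn.cursor() as cur:
        cur.execute("SELECT version()")
        print(cur.fetchone())
    conn.close()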

Since your affiliate requires these client certificates for the connection, the standard DTS/Datastream UI flows won't work out-of-the-box for your specific setup.

For workarounds, here’s a related list of cases and documentation that you may find useful.

Was this helpful? If so, please accept this answer as “Solution”. If you need additional assistance, reply here within 2 business days and I’ll be happy to help.

Apologies for the late response on this. Based on your reply, I gather that for me to proceed, our affiliate may need to step into the process so that the connection does 'not require' the client certificates?

Thanks so much!

Hi @MacLopez, thanks for sharing your situation. This is actually a fairly common question when working with externally managed PostgreSQL databases that require secure connections using certificates (.pem files).

What’s happening:

The BigQuery Data Transfer Service (DTS) currently does not support certificate-based authentication for PostgreSQL. It only allows basic user/password authentication, with no option to configure custom TLS or upload .pem files.

While Datastream does offer more secure connection options, support for PostgreSQL is somewhat limited, especially if your server enforces certificate-based authentication. Uploading .pem files directly through the UI isn't currently supported, which can be a challenge for environments with strict security requirements (which seems to be your case).

Possible solutions:

Option 1: Use an intermediate layer like a bastion host or Cloud SQL Auth Proxy

One workaround is to deploy a small GCE VM or bastion host that can securely connect to your PostgreSQL instance using the .pem certificates. From there, you can set up an ETL job to move the data into BigQuery; a rough sketch follows this paragraph.
This approach works, but it does require managing infrastructure and custom scripts.
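As an illustration only (assuming psycopg2 and google-cloud-bigquery are installed on the VM; the table, dataset, and connection details are all placeholders), a cron-able script for that VM could look like:

    import psycopg2
    from google.cloud import bigquery

    # Placeholder connection parameters; swap in the affiliate's real details.
    PG_PARAMS = dict(
        host="postgres.affiliate.example.com",
        port=5432,
        dbname="analytics",
        user="report_user",
        password="********",
        sslmode="verify-full",
        sslrootcert="/secrets/server-ca.pem",
        sslcert="/secrets/client-cert.pem",
        sslkey="/secrets/client-key.pem",
    )

    def sync_orders():
        # Pull rows over the certificate-authenticated connection. Casting the
        # timestamp to text keeps the rows JSON-serializable; BigQuery parses
        # timestamp strings during the load.
        conn = psycopg2.connect(**PG_PARAMS)
        try:
            with conn.cursor() as cur:
                cur.execute(
                    "SELECT id, created_at::text AS created_at, status FROM orders"
                )
                cols = [d[0] for d in cur.description]
                rows = [dict(zip(cols, r)) for r in cur.fetchall()]
        finally:
            conn.close()

        # Load into BigQuery; assumes the dataset and table already exist.
        client = bigquery.Client(project="my-project")
        job = client.load_table_from_json(
            rows,
            "my-project.my_dataset.orders",
            job_config=bigquery.LoadJobConfig(
                write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
            ),
        )
        job.result()  # block until the load job finishes

    if __name__ == "__main__":
        sync_orders()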

Option 2: Use an ETL platform that supports secure PostgreSQL connections

A simpler approach may be to use a platform like Windsor.ai, which supports connecting to externally managed PostgreSQL instances, even those requiring .pem-based authentication, and automatically syncing the data to BigQuery.
It also supports incremental updates and reduces the need for manual configuration or infrastructure.

Final recommendation:

If you need to keep the secure .pem connection and want to avoid building additional infrastructure, I’d recommend looking into an ETL platform that natively supports secure PostgreSQL connections.

If you’d rather stick with Datastream, you could also consider setting up a VM with secure access to the database and using Dataflow to push the data into BigQuery; a rough sketch of that pipeline is below. Hope this helps!
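A minimal, untested sketch of that Dataflow idea, assuming Beam's cross-language JDBC connector (apache-beam[gcp]); every project, bucket, host, table, and file name here is a placeholder:

    import apache_beam as beam
    from apache_beam.io.jdbc import ReadFromJdbc
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        runner="DataflowRunner",
        project="my-project",
        region="us-central1",
        temp_location="gs://my-bucket/tmp",
    )

    # NOTE (assumptions): pgJDBC expects the client key converted to PKCS#8
    # DER format (not the raw .pem), and the certificate files must be
    # readable on the Dataflow workers, e.g. baked into a custom container.
    jdbc_url = (
        "jdbc:postgresql://postgres.affiliate.example.com:5432/analytics"
        "?sslmode=verify-full"
        "&sslrootcert=/certs/server-ca.pem"
        "&sslcert=/certs/client-cert.pem"
        "&sslkey=/certs/client-key.pk8"
    )

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPostgres" >> ReadFromJdbc(
                table_name="orders",
                driver_class_name="org.postgresql.Driver",
                jdbc_url=jdbc_url,
                username="report_user",
                password="********",
            )
            # Rows arrive as schema'd NamedTuples; convert to dicts for BigQuery.
            | "ToDict" >> beam.Map(lambda row: row._asdict())
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:my_dataset.orders",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )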