No route to host on manually created PSC endpoint connected to Cloud Workstations

Hi,

I'm in the process of setting up Cloud Workstations so that they are only accessible via VPN (not publicly exposed). While I've had some success getting workstations running as intended, I'm getting frustrated with an issue that seems to happen intermittently: the IP attached to the manually set up Private Service Connect endpoint becomes unreachable, and I get a "No route to host" error when I manually test connectivity.
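By "manually test connectivity" I mean a plain TCP connection attempt to the endpoint IP on port 443, roughly like the sketch below (the IP is a placeholder for the internal IP assigned to the manual PSC endpoint, not the real value):

```python
import socket

# Placeholder for the internal IP assigned to the manual PSC endpoint.
ENDPOINT_IP = "10.0.0.10"
PORT = 443  # same port the "Launch" button ultimately talks to

try:
    # Plain TCP connect; nothing Workstations-specific here.
    with socket.create_connection((ENDPOINT_IP, PORT), timeout=5) as sock:
        print(f"Connected to {ENDPOINT_IP}:{PORT}")
except OSError as exc:
    # On the broken cluster this fails with errno 113, "No route to host".
    print(f"Connection failed: {exc}")
```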

This already happened once before, and I thought I had fixed it by undoing some IAM-related changes I had made. But now, with no changes whatsoever between a cluster that used to work just fine and a new cluster re-created with exactly the same configuration, I'm getting "No route to host" again.

Some explanations:

  • Under Private Service Connect I have two endpoints:
    • One manually configured endpoint that takes an internal IP on subnet A. This is the one I can't reach (on port 443, which is essentially what "launching" a workstation does). If I test connectivity to this endpoint from a different machine on the same subnet (set up outside of Cloud Workstations), I get no connectivity either, which hints that whatever sits behind that endpoint is not working as expected.
    • One automatically configured endpoint that also takes an internal IP, on subnet B.
    • Both show up as accepted endpoints (see the status-check sketch after this list).
  • It does not matter whether I connect over SSH or use HTTPS via the Launch button; neither works, because there is simply no way to reach the IP address.
  • Terraform is used to set all of this up, which is why I'm certain that nothing changes between cluster deletions and recreations.
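For what it's worth, this is roughly how I double-check the manual endpoint's PSC status outside the console; the project, region, and forwarding-rule names below are placeholders, and I'm assuming the google-cloud-compute client library here (a quick diagnostic sketch, not part of the Terraform config):

```python
from google.cloud import compute_v1

# Placeholders; substitute the real project, region, and PSC forwarding rule name.
PROJECT = "my-project"
REGION = "europe-west1"
FORWARDING_RULE = "workstations-psc-endpoint"

client = compute_v1.ForwardingRulesClient()
rule = client.get(
    project=PROJECT,
    region=REGION,
    forwarding_rule=FORWARDING_RULE,
)

# On a healthy endpoint the PSC connection status should read ACCEPTED;
# the target is the service attachment the endpoint points at.
print("PSC connection status:", rule.psc_connection_status)
print("Target service attachment:", rule.target)
```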

I really want to emphasize that our setup can work: just a couple of days ago I was able to connect to the workstations I created without any problems.

Any ideas?
