
Best practice for ssh'ing between GCE instances

I have two GCE instances, and I would like to be able to seamlessly ssh between them. /home is NFS mounted across the two, so it would be trivial to set up an authorized_keys solution, but I'm wondering if there are better ways.

It should be mentioned that both instances use OS Login.

I know that this can be achieved with gcloud compute ssh ... but various legacy applications we use expect a normal ssh to work.


To SSH from one machine to another, the originating machine needs the SSH private key, and the destination must know the corresponding public key. You can define the set of public keys across a set of machines, so what remains is getting a copy of your private key onto each originating machine. An NFS mount will work, but I hear you're looking for a better way. What would you like to happen? One possibility is a startup script that runs when the VM boots and copies private keys from a GCS bucket into the local (non-NFS-shared) file system.
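That startup-script idea could look something like the sketch below. Everything specific here is an assumption: the bucket name (`gs://my-ssh-keys`), the per-user key layout, the key filename, and the user `alice` are hypothetical placeholders, and the copy is guarded so it only actually runs on a VM where `gsutil` exists and the user account is present.

```shell
#!/bin/bash
# Hypothetical VM startup script: at boot, copy a per-user SSH private key
# from a GCS bucket into that user's local (non-NFS-shared) ~/.ssh directory.
set -euo pipefail

# Hypothetical bucket; layout assumed to be $BUCKET/<username>/id_ed25519.
BUCKET="gs://my-ssh-keys"

copy_key() {
  local user="$1"
  local home
  # Resolve the user's home directory from the passwd database.
  home="$(getent passwd "$user" | cut -d: -f6)"
  # Ensure ~/.ssh exists with the permissions sshd expects.
  install -d -m 700 -o "$user" -g "$user" "$home/.ssh"
  # Fetch the private key from the bucket and lock down its permissions.
  gsutil cp "$BUCKET/$user/id_ed25519" "$home/.ssh/id_ed25519"
  chown "$user:$user" "$home/.ssh/id_ed25519"
  chmod 600 "$home/.ssh/id_ed25519"
}

# Only attempt the copy on a machine that has gsutil and the target user
# (i.e., on the VM itself, not wherever this script is merely inspected).
if command -v gsutil >/dev/null 2>&1 && id alice >/dev/null 2>&1; then
  copy_key alice   # hypothetical user
fi
```

The VM's service account would also need read access to the bucket, and you'd want the bucket itself tightly restricted, since it holds private keys.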