I created some VMs in a project and added SSH keys via the console (Compute Engine >> Metadata >> SSH keys).
I then tried to SSH into one of those VMs and it worked. These VMs are used by a few users, who sometimes modify the VM configuration when needed.
Now that SSH isn't working anymore, and I'm not sure why. But when I add the SSH public key separately for that VM (at the instance level), it works fine. Why is that?
Thanks in advance 🙂
Hi @SumanthBurla,
Welcome to Google Cloud Community!
Metadata can be configured at both the project and instance levels. Project-level metadata applies to all virtual machine instances in the project, while instance-level metadata affects only that specific instance. If the same key is set at both levels, Compute Engine will give precedence to the instance-level metadata.
This means:
If someone modifies the SSH keys at the instance level, those instance-specific keys will override any project-level keys. This could explain why SSH access works when you add the key directly to the instance but fails when relying on the project-level metadata. The issue might be due to the key being removed or changed at the project level, or because the instance-level keys are taking precedence.
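If you want to confirm which keys each level currently exposes to a given VM, one quick way is to query the metadata server from inside that VM. Below is a rough Python sketch using the standard metadata-server endpoints; nothing project-specific is assumed, and a 404 simply means no keys are set at that level.

```python
# Rough sketch: run inside the VM to see which SSH keys are visible at each
# level. These are the standard Compute Engine metadata-server endpoints;
# a 404 response just means no ssh-keys attribute exists at that level.
import urllib.request
import urllib.error

BASE = "http://metadata.google.internal/computeMetadata/v1"
HEADERS = {"Metadata-Flavor": "Google"}

def read_attribute(path):
    """Return the metadata value at `path`, or None if it is not set (404)."""
    req = urllib.request.Request(f"{BASE}/{path}", headers=HEADERS)
    try:
        with urllib.request.urlopen(req, timeout=5) as resp:
            return resp.read().decode()
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return None
        raise

# Instance-level keys override project-level keys for the same metadata key.
instance_keys = read_attribute("instance/attributes/ssh-keys")
project_keys = read_attribute("project/attributes/ssh-keys")

print("Instance-level ssh-keys:\n", instance_keys or "(none)")
print("Project-level ssh-keys:\n", project_keys or "(none)")
```

Comparing the two outputs makes it easy to see whether an instance-level entry is shadowing the keys you set at the project level.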
The problem of inconsistent SSH access is significantly amplified when multiple users access the same VM. If users add or remove SSH keys directly on the instance, it can lead to conflicting configurations, making access control difficult to manage. One user's action might inadvertently break SSH for others.
Ensure that the correct SSH key is added to both the project metadata and the instance metadata (if applicable), and verify that file permissions and network configurations are correct.
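On the guest side, the usual OpenSSH expectations are 0700 for ~/.ssh and 0600 for authorized_keys, and a wrong mode can silently block key-based login. Here is a small sketch along those lines to help rule out a permissions problem; run it as the user you log in with, and note that the paths and expected modes are just the common defaults, not something specific to your setup.

```python
# Rough sketch: check that the login user's SSH files have the usual
# permissions expected by OpenSSH (0700 for ~/.ssh, 0600 for authorized_keys).
import os
import stat
from pathlib import Path

ssh_dir = Path.home() / ".ssh"
auth_keys = ssh_dir / "authorized_keys"

def mode_of(path):
    """Return only the permission bits of a file's mode."""
    return stat.S_IMODE(os.stat(path).st_mode)

if not auth_keys.exists():
    print(f"{auth_keys} does not exist - the guest agent may not have written keys yet")
else:
    print(f"{ssh_dir}: {oct(mode_of(ssh_dir))} (expected 0o700)")
    print(f"{auth_keys}: {oct(mode_of(auth_keys))} (expected 0o600)")
```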
For more details, you may refer to the Compute Engine documentation on project and instance metadata and on adding SSH keys to VMs.
I hope the above information is helpful.
Hi @JuatonCJ, Thanks for the detailed information. It is indeed helpful.
Now I'm in a situation where a few VMs have instance-level SSH keys set, and the project-level SSH keys don't work on them, which matches what you said. I want them back to normal, so I removed all the instance-level keys on those VMs (roughly the equivalent of the sketch below), but the project-level keys still aren't working. Again, it works if I add the keys at the instance level -_- Can I make those VMs work with project-level keys again? I tried this with a test VM, and it seems that once instance-level keys are in place, project-level keys can't be used anymore. Is that how it works?
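For context, what I did to clear the instance-level entry was roughly equivalent to this sketch (the instance name and zone are placeholders, and this is just the idea, not the exact steps I took):

```python
# Rough sketch: remove the instance-level ssh-keys metadata entry so only
# project-level keys should remain, then re-read the instance metadata to
# confirm. "my-vm" and "us-central1-a" are placeholders.
import json
import subprocess

INSTANCE = "my-vm"        # placeholder instance name
ZONE = "us-central1-a"    # placeholder zone

# Drop the instance-level ssh-keys entry.
subprocess.run(
    ["gcloud", "compute", "instances", "remove-metadata", INSTANCE,
     f"--zone={ZONE}", "--keys=ssh-keys"],
    check=True,
)

# Re-read the instance metadata and list the remaining keys.
out = subprocess.run(
    ["gcloud", "compute", "instances", "describe", INSTANCE,
     f"--zone={ZONE}", "--format=json"],
    check=True, capture_output=True, text=True,
).stdout
items = json.loads(out).get("metadata", {}).get("items", [])
print([item["key"] for item in items])  # ssh-keys should no longer be listed
```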
PS: really appreciate your help 🙂