Pods deployed in GKE keep restarting without any error. The current GKE version is v1.24.0.
The application inside the Docker container is running fine, there is no issue with the health probes, and there are no events either.
Even though the pod status shows CrashLoopBackOff, I can still log in and check that the app is running.
I tried restarting the worker node and upgraded the cluster to v1.25, but we still have the issue.
Hi @sridharakb05,
Welcome to the Google Cloud Community!
Please provide more information by executing the following commands:
kubectl describe pod <pod-with-error>
kubectl logs <pod-with-error>
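If the current logs are empty, the previous container instance's logs and its last termination state usually reveal the exit code behind the restarts. As a sketch (my-app-xyz is a placeholder pod name; substitute your own pod and namespace):

kubectl logs my-app-xyz --previous
kubectl get pod my-app-xyz -o jsonpath='{.status.containerStatuses[0].lastState.terminated}'

An exit code of 137 in the terminated state typically means the container received SIGKILL (often OOMKilled), while a non-zero application exit code points back at the process inside the container.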
Addressing the "CrashLoopBackOff" error might also rectify the pod restarts [2]. To further troubleshoot the pod restarts, consider following the steps mentioned in this Stack Overflow thread.
Furthermore, ensure that appropriate resource requests and limits are set in your Deployment, as missing or overly tight limits are a frequent cause of container restarts; an illustrative snippet follows.
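As a minimal sketch only (the container name, image, and values below are placeholders, not your actual workload), requests and limits are set per container in the Deployment's pod spec:

spec:
  containers:
  - name: my-app            # placeholder container name
    image: my-app:1.0       # placeholder image
    resources:
      requests:
        cpu: "250m"
        memory: "256Mi"
      limits:
        cpu: "500m"
        memory: "1Gi"

If the memory limit is too low for the workload, the container gets OOMKilled and the pod ends up in CrashLoopBackOff even though the application itself looks healthy, so kubectl describe pod is worth checking for an OOMKilled reason in the last state.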
For direct inspection of your project, you can reach out to Google Cloud Support. Thank you!
[1]. https://stackoverflow.com/questions/68834246/kubernetes-pods-status-is-crashloopbackoff-but-no-logs-...
[2]. https://cloud.google.com/kubernetes-engine/docs/troubleshooting#CrashLoopBackOff
[3]. https://stackoverflow.com/questions/75049951/in-a-google-cloud-kubernetes-cluster-my-pods-sometimes-...