I've been struggling with this problem for a while, so I hope I can find a solution here. I have a GCP project that contains the following:
an external VPC network containing an external load balancer with a static public IP address.
an internal VPC network containing a private GKE cluster.
I deployed a FortiGate Firewall VM with two network interface cards, one for the external and the other for the internal VPC networks.
To verify the setup, I deployed a small Linux VM (without a public IP address) and installed apache2 on it.
I created the routes, policies, and configuration on both the Google Cloud Console and the FortiGate firewall, and I could successfully reach the apache2 server via the static IP address of the external load balancer.
However, when I deployed an nginx server on the cluster and tried to reach it via the public IP address, it always fails (I did configure the FW policies, of course).
Note: NAT is enabled on the FortiGate Firewall.
If you want more info feel free to ask me!
Hi @m7md001 ,
Based on the information you've given, the issue seems to be that the FortiGate firewall is not aware of the private IP address that the GKE service is using. You can try configuring the FortiGate firewall to allow traffic arriving on the external interface through to the internal IP address of the GKE service.
1. First, find the internal IP address the GKE service is exposed on. Note that a Service's clusterIP is only reachable from inside the cluster; what the firewall can actually route to is the load balancer IP assigned to the Service:
kubectl get services --all-namespaces -o jsonpath='{.items[?(@.spec.type=="LoadBalancer")].status.loadBalancer.ingress[0].ip}'
2. Then, create a policy on the FortiGate Firewall to allow traffic from the external IP address to the internal IP address of the GKE cluster.
# Replace <internal_ip> with the internal IP address of the GKE service.
# Note: there is no "fortigate-cli" wrapper command; these commands are entered
# directly on the FortiOS CLI (e.g. over SSH to the FortiGate).
# Policy to allow inbound traffic from the external interface to the GKE service.
# "gke-nginx-svc" here is a firewall address object for <internal_ip>/32
# (created under "config firewall address"); FortiOS policies reference named
# objects, not raw CIDR strings.
config firewall policy
    edit 0
        set name "wan-to-gke-nginx"
        set srcintf "wan1"
        set dstintf "lan1"
        set srcaddr "all"
        set dstaddr "gke-nginx-svc"
        set action accept
        set schedule "always"
        set service "HTTP"
        set nat enable
        set comments "Allow traffic from external clients to the GKE service"
    next
end
# If nginx listens on a non-standard port, define a custom service object and
# reference it instead of "HTTP".
# FortiOS is stateful, so reply traffic for the inbound session is allowed
# automatically; a separate outbound policy is only needed for connections
# initiated from the cluster:
config firewall policy
    edit 0
        set name "gke-nginx-to-wan"
        set srcintf "lan1"
        set dstintf "wan1"
        set srcaddr "gke-nginx-svc"
        set dstaddr "all"
        set action accept
        set schedule "always"
        set service "ALL"
        set nat enable
        set comments "Allow traffic initiated from the GKE cluster outbound"
    next
end
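The policies above reference a named firewall address object rather than a raw CIDR, since that is what FortiOS expects in srcaddr/dstaddr; a sketch of defining it (the object name "gke-nginx-svc" is illustrative, use whatever matches your policy):

```
config firewall address
    edit "gke-nginx-svc"
        set subnet <internal_ip> 255.255.255.255
    next
end
```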
3. Lastly, create a static route on the FortiGate so that traffic destined for the GKE cluster's internal ranges is sent out the internal interface.
# Replace <gke_network> <gke_netmask> with the cluster's node/pod/service range
# and <gateway_ip> with the internal subnet's gateway address (in GCP, typically
# the first usable address of the subnet).
config router static
    edit 0
        set dst <gke_network> <gke_netmask>
        set gateway <gateway_ip>
        set device "lan1"
        set comment "Route traffic for the GKE cluster via the internal interface"
    next
end
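Once the route is in place, you can sanity-check it from the FortiGate itself using standard FortiOS diagnostic commands:

```
# Show the active routing table and confirm the GKE range appears
get router info routing-table all
# Test reachability of the internal service IP from the firewall
execute ping <internal_ip>
```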
Let me know if this works.
Hi Marvin, thank you for your reply. I have solved the problem: the port forwarding on my FG FW was configured incorrectly.
Since my external passthrough load balancer runs on port 80, the external service port configured on the FG FW was wrong (it was set to the internal service's port, which is not 80).
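For anyone else hitting this: port forwarding on FortiOS is configured as a virtual IP (VIP) object, which is then used as the dstaddr in the inbound firewall policy. A sketch of the corrected mapping (the names, interface, and <placeholders> below are illustrative, not my exact config):

```
config firewall vip
    edit "gke-nginx-vip"
        set extip <fortigate_external_ip>
        set extintf "wan1"
        set portforward enable
        set mappedip "<internal_service_ip>"
        set extport 80
        set mappedport <internal_service_port>
    next
end
```

The key point is that extport must match what clients (and the passthrough load balancer) use, i.e. 80, while mappedport is the internal service's port.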
Thanks again for your great answer!