OK, if someone is trying to access the Airflow DB from the "managed" version of Apache Airflow by Google (Cloud Composer), here is how I did it.
First of all, you need to find out the host, user, password, port, and database name.
To do that, you can export the list of connections using the Airflow CLI.
gcloud composer environments run --location=YOUR_REGION YOUR_COMPOSER_ENV_NAME connections export -- - --format=env | grep airflow_db
The connection URI contains all the information you need.
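For reference, the exported line should look roughly like this (the exact scheme and values depend on your Composer/Airflow version; everything below is a placeholder):
airflow_db=postgresql://USER:PASSWORD@HOST:PORT/DATABASE_NAME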
Then, you need to set up kubectl on your machine and connect it to your environment's GKE cluster.
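If kubectl is not connected to the cluster yet, here is a sketch of how to do it, assuming a regional cluster (use --zone instead of --region for a zonal one):
gcloud composer environments describe YOUR_COMPOSER_ENV_NAME --location=YOUR_REGION --format="value(config.gkeCluster)"
gcloud container clusters get-credentials YOUR_GKE_CLUSTER_NAME --region=YOUR_REGION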
Find out the namespace in which your Composer resources are deployed using
kubectl describe ns
(one of the namespaces is named after your Composer/Airflow version, which makes it easy to spot)
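If the list is long, filtering it works too (this assumes the namespace name contains "composer", which it did in my case):
kubectl get ns | grep composer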
Then you need to identify a pod that has access to the Airflow DB, for instance a scheduler pod:
kubectl get pods --namespace=YOUR_COMPOSER_NAMESPACE
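To spot the scheduler quickly, you can filter on its name (assuming the usual naming where the pod name contains "scheduler"):
kubectl get pods --namespace=YOUR_COMPOSER_NAMESPACE | grep scheduler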
Run a terminal in this pod using
kubectl exec --namespace=YOUR_COMPOSER_NAMESPACE -it YOUR_SCHEDULER_POD -- /bin/bash
Great, you are now inside the pod. Using psql, you can access the Airflow DB with the values retrieved in the first step:
psql -h HOST -U USER -p PORT -d DATABASE_NAME
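psql will prompt for the password; to skip the prompt and sanity-check the connection in one go, something like this works (the dag table is part of the standard Airflow metadata schema):
PGPASSWORD=YOUR_PASSWORD psql -h HOST -U USER -p PORT -d DATABASE_NAME -c "SELECT dag_id FROM dag LIMIT 5;"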
Enjoy!