
403 error on kubeflow custom training pipeline when adding Experiment support

My setup:

I have a Kubeflow pipeline that I have used to run custom training jobs in Vertex AI before. The jobs run as a custom service account with the roles/aiplatform.user role.

I am trying to add metric logging via Vertex AI Experiments, but the pipeline fails on the call to aiplatform.init(experiment="name") inside my training script.

The error:

google.api_core.exceptions.Forbidden: 403 GET https://us-central1-aiplatform.googleapis.com/v1/projects/alist-staging/locations/us-central1/metada...: Request had insufficient authentication scopes. [{'@type': 'type.googleapis.com/google.rpc.ErrorInfo', 'reason': 'ACCESS_TOKEN_SCOPE_INSUFFICIENT', 'domain': 'googleapis.com', 'metadata': {'service': 'aiplatform.googleapis.com', 'method': 'google.cloud.aiplatform.v1.MetadataService.GetMetadataStore'}}]
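For reference, here is the structured ErrorInfo from that traceback pulled apart (the dict literal below is copied from the error message above; no API call is made):

```python
# Parse the ErrorInfo detail attached to the 403 above.
# The dict is copied verbatim from the error message.
error_details = [{
    "@type": "type.googleapis.com/google.rpc.ErrorInfo",
    "reason": "ACCESS_TOKEN_SCOPE_INSUFFICIENT",
    "domain": "googleapis.com",
    "metadata": {
        "service": "aiplatform.googleapis.com",
        "method": "google.cloud.aiplatform.v1.MetadataService.GetMetadataStore",
    },
}]

# Pick out the ErrorInfo entry and the fields that matter.
info = next(d for d in error_details
            if d["@type"].endswith("google.rpc.ErrorInfo"))
print(info["reason"])              # ACCESS_TOKEN_SCOPE_INSUFFICIENT
print(info["metadata"]["method"])  # google.cloud.aiplatform.v1.MetadataService.GetMetadataStore
```

So the failing call is the experiment-tracking lookup against the Metadata service, and the stated reason is about the access token's scopes, not a missing IAM role.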

What I have tried:

  • I have confirmed that my service account has the roles/aiplatform.user role, which, as far as I can tell from the docs, covers the metadata store permissions. I have also granted storage.objects.* permissions, and confirmed that the project's default compute service account has the same permissions.
  • I ran the same job with the same service account from Google Colab, and it succeeded. But as soon as I replicate the setup in my Kubeflow pipeline, I get the permission error again.
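If I am reading ACCESS_TOKEN_SCOPE_INSUFFICIENT correctly, it concerns the OAuth scopes minted onto the access token rather than IAM roles, which could explain the Colab/pipeline difference: same service account, different token scopes. A minimal illustration of that idea (the per-environment scope lists below are my assumptions, not something I have dumped from the actual tokens):

```python
# The broad OAuth scope Google Cloud APIs such as Vertex AI generally expect:
CLOUD_PLATFORM = "https://www.googleapis.com/auth/cloud-platform"

def covers_vertex(token_scopes):
    """True if a token minted with these OAuth scopes can call Vertex AI."""
    return CLOUD_PLATFORM in token_scopes

# Assumed: a Colab user/service-account credential carries the broad scope.
colab_scopes = [CLOUD_PLATFORM]

# Assumed: a node pool created with narrow legacy default scopes,
# even though the node runs as the very same service account.
narrow_node_scopes = [
    "https://www.googleapis.com/auth/devstorage.read_only",
    "https://www.googleapis.com/auth/logging.write",
    "https://www.googleapis.com/auth/monitoring",
]

print(covers_vertex(colab_scopes))        # True
print(covers_vertex(narrow_node_scopes))  # False
```

If that is the mechanism, granting more IAM roles would never fix it, which matches what I am seeing.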

My leading theory is that something is wrong with how my training script authenticates to aiplatform, but I can't find any documentation on how to set this up correctly in Kubeflow. Please help me, I am so stumped!
