
Async REST credentials for Vertex AI

I have no clue what's going on here, and I can't find any documentation about it anywhere. Can someone help?

I'm making asynchronous API calls to Gemini 1.5 Flash 002 on Vertex AI. On each call, I get the following log:

```
initializer - REST async clients requires async credentials set using aiplatform.initializer._set_async_rest_credentials().
Falling back to grpc since no async rest credentials were detected.
```

This is how I instantiate my Credentials:

```
from google.oauth2 import service_account
import vertexai

# Service account credentials built from an in-memory dict
credentials = service_account.Credentials.from_service_account_info({...})
vertexai.init(credentials=credentials)
```

 

I also use the `achat` method of the LlamaIndex library to call Vertex AI.

 

Why does the log suggest calling a private method of Google's library? That seems confusing and wrong. Can someone help me with this?

Solved
1 ACCEPTED SOLUTION

Hi @arthurbrcuni,

Welcome to Google Cloud Community!

The log message incorrectly references an internal implementation detail of the Google Cloud Vertex AI client library. Calling aiplatform.initializer._set_async_rest_credentials() directly is not recommended: it is a private method, so using it is unsupported and may break in future updates. The core problem is that the Vertex AI client library prefers the asynchronous REST transport for better performance, but because the required asynchronous credentials aren't configured, it falls back to the less-optimized gRPC path.
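If you want the transport to be a deliberate choice rather than a silent fallback, one option is to pin it at init time. This is a minimal sketch, assuming your installed version of google-cloud-aiplatform exposes the api_transport argument on vertexai.init (recent releases do, but check your version); the project, location, and key-file path are placeholders:

```
# Sketch: make the transport choice explicit instead of relying on the fallback.
# Assumes vertexai.init() accepts api_transport in your installed SDK version.
from google.oauth2 import service_account
import vertexai

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json"  # hypothetical path to your key file
)

vertexai.init(
    project="your-project",   # placeholder
    location="us-central1",   # placeholder
    credentials=credentials,
    api_transport="grpc",     # explicitly choose gRPC; "rest" selects REST
)
```

Whether this silences the log depends on the SDK version; the point is that gRPC then becomes an explicit choice rather than a fallback.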

Why it's happening:

The method aiplatform.initializer._set_async_rest_credentials() likely requires a specific credential type for asynchronous operations inside the Vertex AI library. The service_account.Credentials.from_service_account_info(...) object, while perfectly valid for authentication, is the standard synchronous credential class and may lack the metadata the async REST client needs, so the library detects this and falls back to gRPC.
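For context, you can confirm what kind of credentials object the SDK is being handed; the key-file path below is hypothetical:

```
# Sketch: the service-account object is the standard *synchronous* credential
# type from google-auth, which is a plausible reason the async REST path
# does not pick it up.
from google.auth import credentials as ga_credentials
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json"  # hypothetical path
)

print(type(credentials))                                    # google.oauth2.service_account.Credentials
print(isinstance(credentials, ga_credentials.Credentials))  # True: the synchronous base class
```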

Here are some potential ways to address your issue:

  • Use the latest versions: Make sure to update your google-cloud-aiplatform and llama-index packages. The latest versions may handle credentials better and reduce reliance on this internal fallback mechanism.
  • Specify the transport method: The Vertex AI client library usually lets you specify the transport (gRPC or REST) directly, as in the sketch above. Also review the documentation for the achat method in LlamaIndex and look for parameters related to communication or transport; there may be a way to enforce gRPC explicitly and sidestep the issue entirely.
  • Check LlamaIndex's setup: Review the LlamaIndex documentation carefully. There may be settings that control how it interfaces with the underlying Vertex AI API (e.g., the transport method or how credentials are passed).
  • Simplify the testing process: Build a minimal example that uses only google-cloud-aiplatform to call your Gemini model directly and check whether the same log message appears; this isolates whether the issue lies with LlamaIndex or with Vertex AI (see the sketch after this list).
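As a starting point for that minimal reproduction, here is a hedged sketch that talks to the model through the vertexai SDK alone, with no LlamaIndex in the path. The project, location, and key-file path are placeholders; the model name is taken from your question:

```
# Sketch: minimal async call via the vertexai SDK only, to see whether the
# same initializer log appears without LlamaIndex involved.
import asyncio

from google.oauth2 import service_account
import vertexai
from vertexai.generative_models import GenerativeModel

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json"  # hypothetical path
)
vertexai.init(project="your-project", location="us-central1", credentials=credentials)

async def main():
    model = GenerativeModel("gemini-1.5-flash-002")
    response = await model.generate_content_async("Say hello in one word.")
    print(response.text)

asyncio.run(main())
```

If the log still shows up here, the behavior comes from the Vertex AI SDK itself; if it does not, the LlamaIndex integration is where the credentials or transport are being set up differently.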

If you've tried all the other options, the best next step is to open a support ticket with Google Cloud Support. They have expertise in the library's internals and can provide definitive guidance. Be sure to include the complete log, your code snippet, and the versions of all the libraries you're using.

In short, disregard the log's suggestion to call the private method. The issue is rooted in how credentials are passed to the asynchronous REST client inside the Vertex AI library. Your best approach is to set the transport method explicitly, use the latest library versions, and build a minimal reproduction of the error.

I hope the above information is helpful.
