Hi Community,
I'm setting up a new personal project to use the Vertex AI Gemini API with Python, and I'm running into persistent permission issues despite having the Owner role. I'd appreciate any insights!
My Goal: Call the Gemini models on Vertex AI (gemini-1.0-pro-002 and gemini-1.5-flash-001) from a simple Python script and print the generated text.
My Setup: Project woven-art-456200-i9, location us-central1, the Vertex AI SDK for Python (vertexai), and the Owner role on the project. The relevant APIs appear enabled in the Cloud Console.
Problem & Errors Encountered:
Python Script Error: When running a simple Python script (code below) to call model.generate_content(), it fails with:
An error occurred: 400 Project '804321510322' is not allowed to use Prediction API on '.../publishers/google/models/[MODEL_ID]'
(This happened for both gemini-1.5-flash-001 and gemini-1.0-pro-002 models).
gcloud Command Error: When trying to troubleshoot by ensuring APIs are enabled via gcloud, running gcloud services enable compute.googleapis.com fails with:
ERROR: (gcloud.services.enable) PERMISSION_DENIED: Not found or permission denied for service(s): [compute.googleapis.com] reason: SERVICE_CONFIG_NOT_FOUND_OR_PERMISSION_DENIED
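For reference, here is a small check I can run from Python (just a sketch that shells out to the gcloud CLI, assuming it is installed and authenticated for this account) to confirm which project the CLI is actually pointed at and whether the Vertex AI API shows up as enabled:

# Sketch: verify the active gcloud project and whether the Vertex AI API is enabled.
# Assumes the gcloud CLI is installed and authenticated for this account.
import subprocess

PROJECT_ID = "woven-art-456200-i9"  # same project as in the script below

def gcloud(*args):
    """Run a gcloud command and return its stdout (or stderr on failure)."""
    result = subprocess.run(["gcloud", *args], capture_output=True, text=True)
    return result.stdout.strip() or result.stderr.strip()

# Which project is the CLI actually configured to use?
print("Active gcloud project:", gcloud("config", "get-value", "project"))

# Does the Vertex AI API appear in the list of enabled services?
enabled = gcloud("services", "list", "--enabled", "--project", PROJECT_ID)
print("aiplatform.googleapis.com enabled?",
      "aiplatform.googleapis.com" in enabled)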
Contradiction / Confusion: I have the Owner role on the project and the relevant APIs appear enabled in the Cloud Console, yet both the API call and gcloud report permission problems.
Troubleshooting Steps Tried: Confirmed the APIs show as enabled in the console, re-ran the script with both model IDs, and attempted to (re-)enable services via gcloud as shown above.
Question: Why would I be getting these permission errors (PERMISSION_DENIED for enabling services, 400 Project not allowed for prediction) when I have the Owner role and the relevant APIs appear enabled in the console? Is there potentially an issue with the project state, free trial limitations, or an Organization Policy I might not be aware of (though this should be a personal project)?
Any help or suggestions would be greatly appreciated!
Python Code:
# Import the Vertex AI library
import vertexai
from vertexai.generative_models import GenerativeModel

# --- Configuration ---
PROJECT_ID = "woven-art-456200-i9"
LOCATION = "us-central1"
MODEL_ID = "gemini-1.0-pro-002"  # Also tried gemini-1.5-flash-001

# --- Initialize Vertex AI ---
vertexai.init(project=PROJECT_ID, location=LOCATION)

# --- Load the Gemini Model ---
model = GenerativeModel(MODEL_ID)

# --- Send a Prompt ---
prompt = "In simple terms, what is a large language model?"
print(f"Sending prompt: '{prompt}'")

try:
    response = model.generate_content(prompt)

    # --- Print the Response ---
    print("\n--- Model Response ---")
    if response.candidates and response.candidates[0].content.parts:
        print(response.candidates[0].content.parts[0].text)
    else:
        print("No text content found in the response.")
    # print(response)  # Uncomment to see full response structure if needed
except Exception as e:
    print(f"\nAn error occurred: {e}")

print("\n--- Script Finished ---")
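One extra check I can add (not part of the script above, just a sketch using the google-auth library that the Vertex AI SDK depends on) to confirm which project and credentials my Application Default Credentials actually resolve to:

# Sketch: inspect the Application Default Credentials the Vertex AI SDK will pick up.
# Uses google-auth, which is installed as a dependency of the Vertex AI SDK.
import google.auth

credentials, adc_project = google.auth.default()
print("ADC resolved project:", adc_project)
print("Credential type:", type(credentials).__name__)

# If adc_project differs from the PROJECT_ID passed to vertexai.init(), requests may
# be authorized or quota-charged against a different project than intended.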
Hi @helloworld2,
Welcome to Google Cloud Community!
The issue is caused by free trial limitations. Even with the Owner role and all of the necessary APIs enabled, Vertex AI access to the Gemini models is restricted by default for trial accounts. The error "400 Project '...' is not allowed to use Prediction API..." means your project hasn't been whitelisted to use these models, which is a common limitation for free-tier setups. Models like gemini-1.5-flash-001 and gemini-1.0-pro-002 require elevated permissions that are not automatically granted to trial accounts, even if billing appears active through credits.
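If it helps while you investigate, here is a minimal sketch of how your script could surface this particular failure separately from other errors. It uses the exception classes from google-api-core (installed alongside the Vertex AI SDK); exactly which class is raised for this 400 can vary, so catching several of them is an assumption on my part:

# Sketch: report the "not allowed to use Prediction API" failure separately.
# Exception classes come from google-api-core, a dependency of the Vertex AI SDK.
from google.api_core import exceptions as gexc
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="woven-art-456200-i9", location="us-central1")
model = GenerativeModel("gemini-1.0-pro-002")

try:
    response = model.generate_content("In simple terms, what is a large language model?")
    print(response.text)
except (gexc.InvalidArgument, gexc.FailedPrecondition, gexc.PermissionDenied) as e:
    # Typical shape of the project-level restriction described above.
    print("Project is not permitted to call the Prediction API:", e)
except Exception as e:
    print("Unexpected error:", e)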
For more in-depth analysis, you can contact Google Cloud Support. When reaching out, include detailed information and relevant screenshots of the errors you’ve encountered. This will assist them in diagnosing and resolving your issue more efficiently.
Was this helpful? If so, please accept this answer as “Solution”. If you need additional assistance, reply here within 2 business days and I’ll be happy to help.