
Creating a Workspace in Databricks

We recently subscribed to Databricks through the marketplace. However, even though I'm the admin of the Databricks account and an owner of the GCP project associated with it, I'm still getting a 400 error that says admin_policy_enforced (see screenshot attached).

What's going on here?


The error you're encountering, admin_policy_enforced, is typically associated with Google Cloud Identity and Access Management (IAM): it appears when the IAM policy set by an administrator does not allow the action you're trying to perform.

Here are a few possible reasons for this error:

  • IAM Role Limitation: You may have the "Owner" role, but that doesn't necessarily mean you have all the permissions required to perform certain actions. The Owner role is a predefined role with a set of permissions. It's possible that Databricks requires some specific permissions that are not included in the Owner role.

  • Organization Policy Constraints: Your organization might have set up policy constraints at the Organization level in GCP that override the project-level permissions.

  • Service Account Permissions: If you're using a service account to authenticate with Databricks, ensure that the service account has the necessary permissions. In some cases, actions are performed by the underlying service account and not the actual user.

  • API Enablement: Ensure that the necessary APIs for Databricks integration are enabled in your GCP project.
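The first and last checks above can be sketched with the gcloud CLI. The project ID and user email below are placeholders, and the `databricks` filter assumes the marketplace listing registers an API service whose name contains that string — verify the exact service name in your own project.

```shell
# Placeholders — substitute your own project ID and user email.
PROJECT_ID="my-gcp-project"
USER_EMAIL="you@example.com"

# List the IAM roles bound to your user on the project.
gcloud projects get-iam-policy "$PROJECT_ID" \
  --flatten="bindings[].members" \
  --filter="bindings.members:user:$USER_EMAIL" \
  --format="table(bindings.role)"

# List enabled services whose name mentions Databricks (if any).
gcloud services list --enabled --project="$PROJECT_ID" \
  --filter="name:databricks"
```

If the second command returns nothing, the relevant API is likely not enabled yet.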

To resolve this issue, you can try the following steps:

  1. Check IAM Roles: Review the IAM roles assigned to your account and ensure that they include all the necessary permissions. If they don't, create a custom IAM role that grants the missing permissions.

  2. Review Organization Policies: Check with your GCP Organization admin to see if there are any organization-level policies that might be causing the issue.

  3. Service Account Permissions: Review the permissions of the service account that's being used for the Databricks integration.

  4. Enable APIs: Ensure that the necessary APIs for Databricks are enabled in your GCP project.
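Steps 2 and 4 can also be sketched from the command line, again with a placeholder project ID. The `databricks.googleapis.com` service name is an assumption based on the marketplace integration — confirm it against your marketplace listing before enabling it.

```shell
PROJECT_ID="my-gcp-project"  # placeholder — use your own project ID

# Step 2: list organization policy constraints set on the project.
gcloud resource-manager org-policies list --project="$PROJECT_ID"

# Step 4: enable the Databricks API (service name assumed; verify first).
gcloud services enable databricks.googleapis.com --project="$PROJECT_ID"
```

Note that enabling a service requires the serviceusage.services.enable permission, which the project Owner role does include.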