
Dataproc Not able to submit job

Hi GCP community, my org has a strict policy against granting project-level permissions to any user group or service account. The basic principle is that we grant access only at the resource level.
We use Terraform to create our infrastructure.
I am currently able to create a Dataproc cluster, and I was also able to bind the Dataproc Editor IAM role to the cluster I just created.
This is the code I used to add the role at the resource level:

resource "google_dataproc_cluster_iam_binding" "dataproc_editor" {
  depends_on = [google_dataproc_cluster.bob_cluster]
  cluster    = var.cluster_name
  role       = "roles/dataproc.editor"
  members = [
    "group:gcp-usrgroup-eng@myorg.com",
  ]
}

I am part of this user group, so technically my user ID should have resource-level permission.
I am able to view the cluster I created, which I was not able to do before, but the job submission option is still not visible in the UI.

I tried submitting through the gcloud command and it gives this error:

ERROR: (gcloud.dataproc.jobs.submit.spark) PERMISSION_DENIED: Not authorized to requested resource
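For reference, a typical submit invocation looks like the following (the cluster name, region, and jar path below are placeholders, not my actual values):

```shell
# Hypothetical example: submit the stock SparkPi job to a cluster.
# bob-cluster and us-east4 are placeholders.
gcloud dataproc jobs submit spark \
  --cluster=bob-cluster \
  --region=us-east4 \
  --class=org.apache.spark.examples.SparkPi \
  --jars=file:///usr/lib/spark/examples/jars/spark-examples.jar \
  -- 1000
```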

Although I can submit jobs automatically through Terraform using the google_dataproc_job resource, I want to understand why I was not able to submit a job on a cluster where I have the resource-level editor role.
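For completeness, the Terraform-side submission that does work looks roughly like this (a minimal sketch; the resource name, region variable, and jar path are placeholders):

```hcl
# Minimal sketch of a Spark job submitted via Terraform; values are placeholders.
resource "google_dataproc_job" "spark_pi" {
  region       = var.region
  force_delete = true

  placement {
    cluster_name = google_dataproc_cluster.bob_cluster.name
  }

  spark_config {
    main_class    = "org.apache.spark.examples.SparkPi"
    jar_file_uris = ["file:///usr/lib/spark/examples/jars/spark-examples.jar"]
    args          = ["1000"]
  }
}
```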


Hi @karki,

Welcome to Google Cloud Community!

Below are the potential causes and their corresponding solutions:

  • When bound at the cluster level, the roles/dataproc.editor role only covers permissions scoped to that cluster, such as viewing and managing the cluster itself; it does not let you submit jobs. Submitting a Dataproc job requires the dataproc.jobs.create and dataproc.clusters.use permissions.
  • If you are using a service account to manage Terraform, ensure it has the necessary permissions for Dataproc job submission.
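One way to verify what a role actually contains, and what is currently bound on the cluster, is with the following standard gcloud commands (the cluster name and region below are placeholders):

```shell
# List the permissions included in the Dataproc Editor role.
gcloud iam roles describe roles/dataproc.editor

# Show the IAM policy bound directly on the cluster.
# bob-cluster and us-east4 are placeholders.
gcloud dataproc clusters get-iam-policy bob-cluster --region=us-east4
```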

I hope the above information is helpful.

Error: Error applying IAM policy for Dataproc Cluster my-env-project/us-east4/bob-cluster-init3: Error setting IAM policy for Dataproc Cluster my-env-project/us-east4/bob-cluster-init3: googleapi: Error 400: Role roles/dataproc.clusters.use is not supported for this resource., badRequest
Error: Error applying IAM policy for Dataproc Cluster my-env-project/us-east4/bob-cluster-init3: Error setting IAM policy for Dataproc Cluster my-env-project/us-east4/bob-cluster-init3: googleapi: Error 400: Role roles/dataproc.jobs.create is not supported for this resource., badRequest
I tried adding those permissions as roles, but it looks like binding them at the resource level is not supported.
Do you have any other suggestions? Adding project-level permissions is out of scope for me.

 

Hi @karki,

You’re absolutely right that those permissions can only be granted through a role assigned at the project level. At this time, there isn’t formal documentation detailing resource-level roles for job submission.

Regarding this, you can submit a feature request. Please note that I cannot say when, or whether, such an enhancement would be implemented. For future updates, I recommend monitoring the issue tracker.

I hope the above information is helpful.