“Perform Foundational Data, ML, and AI Tasks in Google Cloud” challenge lab error on Dataprep job

Hello there guys! How are you doing?

This is Mike, a data scientist working at Telecom Argentina.

Since we are a Google Cloud partner, I completed the requested labs and quests on the partner.cloudskillsboost.google page to receive the voucher for the free Data Engineering certification exam.

I am only missing the challenge lab “Perform Foundational Data, ML, and AI Tasks in Google Cloud”, which I have already attempted three times, always getting the same technical error.

In task 3, I build the Dataprep job exactly as the lab asks, and the job is triggered from Dataprep to Dataflow.

This is where I encounter the error: the job fails every time with the following error message:

Root cause: Output filename cannot contain wildcards: /bigstore/dataprep-staging-f19835d4-7e89-4200-b4c8-d382223e74d1/student-02-8d23c41cde62@qwiklabs.net/temp/dax-tmp-2023-04-14_20_16_03-462373969259352813-S23-0-9740bbb032767ce6/tmp-9740bbb032767919-shard--try-01b6435418a2ffa6-endshard.sdfmeta Worker ID: cloud-dataprep-runs-flow--04142016-oeg7-harness-p96d

It seems like something is missing in the environment configured for the Dataflow job. The thing is, in the lab I don’t have the Dataflow Developer or Dataflow Admin role, so I’m not able to touch the temporary files the job run creates.
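
In case it helps with troubleshooting, here is a minimal sketch (assuming the Python google-cloud-storage client library and the lab’s Application Default Credentials; the bucket and prefix are taken from the error message above) of how one could at least check whether the student account can read the Dataprep staging path that the Dataflow job writes its temp files to:

# Minimal sketch, assuming google-cloud-storage is installed and the lab's
# Application Default Credentials are active. The bucket and prefix are
# copied from the error message above.
from google.cloud import storage

BUCKET = "dataprep-staging-f19835d4-7e89-4200-b4c8-d382223e74d1"
PREFIX = "student-02-8d23c41cde62@qwiklabs.net/temp/"

client = storage.Client()  # picks up the lab project's default credentials
for blob in client.list_blobs(BUCKET, prefix=PREFIX, max_results=10):
    print(blob.name)  # lists a few of the job's temp objects, if visible

If that listing fails with a 403, the problem really is on the permissions side rather than in the job definition itself.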

I searched Google and asked ChatGPT about this error, and I found that some people are encountering the same issue, while others are not.

Maybe it’s a technical problem with the lab’s configuration.

Could you help me with this error?

Or could you help me get this lab marked as completed so I can receive the voucher and schedule a date for the exam?

Thanks in advance!

Best,

Mike
