Who's up for a challenge? Time to show off your #GoogleClout!
Starting today, check in every Wednesday to unlock a new cloud puzzle that will put your cloud skills to the test against participants from around the world. Stephanie Wong's previous record is 5 minutes. Can you complete the new challenge in 4?
The #GoogleClout Challenge is a free 20-minute weekly hands-on challenge. Every Wednesday for the next 10 weeks, a new challenge will be posted on our website. Participants will compete against the clock to see how fast they can complete the challenge. Try the 20-minute challenge as many times as you like. The faster you go, the higher your score!
For all of us who are taking a break, we have the weekly challenges that started this Wednesday.
How does it work
To participate, follow these four easy steps:
Sign Up – Go to our website, click on the link to the weekly challenge and sign up for the mission with your Google Cloud Skills Boost account.
Play – Try the challenge up to 5 times. Remember that the faster you are, the higher your score will be!
Share – Share your scorecard on Twitter/LinkedIn using #GoogleClout
Earn – Complete all 10 weekly challenges to earn exclusive #GoogleClout badges
Ready to get started?
Take the #GoogleClout challenge today!
Continuing with the remaining challenges this week.
Here is the link to register for the challenge.
#GoogleClout Set 2 (1/10) registration here
#GoogleClout Set 3 (2/10) registration here
#GoogleClout Set 4 (3/10) registration here
#GoogleClout Set 5 (4/10) registration here
#GoogleClout Set 6 (5/10) registration here
#GoogleClout Set 7 (6/10) registration here
#GoogleClout Set 8 (7/10) registration here
#GoogleClout Set 9 (8/10) registration here
#GoogleClout Set 10 (9/10) registration here
#GoogleClout Set 11 (10/10) registration here
Stay tuned for new updates to this post on Wednesdays 😉
Had some fun with the #GoogleClout Set 4 acknowledgement part
Can someone please guide me on how to complete this week's GoogleClout challenge? I created log-based metrics for the crashing and logging pods and then created the alert, but the alert is not triggering. Not sure what I am missing.
You should use the sum operation with a threshold of 1 to ensure the alert fires as quickly as possible.
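As a sketch of that advice, here is what such a condition might look like as a Cloud Monitoring alert-policy JSON applied from Cloud Shell. The metric name `crashing` and the display names are assumptions, not taken from the lab, and the exact threshold/comparison should be checked against the challenge instructions; the `gcloud` call is commented out because it needs an authenticated session:

```shell
# Write a hypothetical alert-policy JSON that sums the log-based metric
# over 60s alignment windows (ALIGN_SUM) and compares the result against
# a threshold of 1, so the alert fires as soon as possible.
cat > crash-alert-policy.json <<'EOF'
{
  "displayName": "crashing-pod-alert",
  "combiner": "OR",
  "conditions": [
    {
      "displayName": "crash log entries above threshold",
      "conditionThreshold": {
        "filter": "metric.type=\"logging.googleapis.com/user/crashing\" resource.type=\"k8s_container\"",
        "aggregations": [
          { "alignmentPeriod": "60s", "perSeriesAligner": "ALIGN_SUM" }
        ],
        "comparison": "COMPARISON_GT",
        "thresholdValue": 1,
        "duration": "0s"
      }
    }
  ]
}
EOF

# In an authenticated Cloud Shell session you could then apply it with:
# gcloud alpha monitoring policies create --policy-from-file=crash-alert-policy.json
```

This mirrors what the alerting UI builds for you; creating it in the console with "Rolling window function: sum" and a threshold of 1 should be equivalent.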
I'm still not sure what I am missing. Can you list the steps you followed? I am only getting a score of 25, so I am definitely missing something.
You might follow these steps 🙂
Thank you. That helped 🙂
@fuchengyen very nice and so well explained 🥇
I tried a more general logging approach, so this is my proposed way to find the logs:
Crashing
resource.type="k8s_container"
resource.labels.namespace_name="default"
resource.labels.container_name="ubuntu"
resource.labels.pod_name=~"crashing-*"
resource.labels.cluster_name="log-metric"
Logging
resource.type="k8s_container"
resource.labels.namespace_name="default"
resource.labels.container_name="ubuntu"
resource.labels.pod_name=~"logging-*"
resource.labels.cluster_name="log-metric"
Best,
Greg
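The filters above can also be turned into counter log-based metrics from the command line. A minimal sketch, assuming the metric names `crashing` and `logging` (the `gcloud` calls are commented out because they need an authenticated Cloud Shell session):

```shell
# Build the two Cloud Logging filters shown above, then (in Cloud Shell)
# create a counter log-based metric from each. Metric names and
# descriptions are assumptions, not taken from the lab.
make_filter() {
  # $1 is the pod-name prefix: "crashing" or "logging"
  printf '%s' \
    'resource.type="k8s_container" ' \
    'resource.labels.namespace_name="default" ' \
    'resource.labels.container_name="ubuntu" ' \
    "resource.labels.pod_name=~\"$1-*\" " \
    'resource.labels.cluster_name="log-metric"'
}

CRASH_FILTER="$(make_filter crashing)"
LOG_FILTER="$(make_filter logging)"
echo "$CRASH_FILTER"

# Requires an authenticated project; uncomment in Cloud Shell:
# gcloud logging metrics create crashing --description="crashing pod logs" --log-filter="$CRASH_FILTER"
# gcloud logging metrics create logging  --description="logging pod logs"  --log-filter="$LOG_FILTER"
```

Creating the metrics from the CLI rather than clicking through the Logs Explorer UI is one way to shave time off the clock.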
Thanks @GrzWas77 for showing how to use general logging;
it helps cut out a lot of steps 🙂
Nice explanation, that helped me a lot.
Hello everyone,
These are my results:
#GoogleClout
Google Clout - Alerting on Log Based Metrics Challenge.
0️⃣5️⃣ 🟦🟦🟥🟥
1️⃣0️⃣ 🟦🟦🟦🟦
1️⃣5️⃣ ⬜⬜⬜⬜
2️⃣0️⃣ ⬜⬜⬜⬜
Challenge completed in 10 minutes and 34 seconds !
https://www.cloudskillsboost.google/quests/217
I still haven't figured out how to increase my score in this challenge. Any tips?
Here's my score in the very first attempt.
#GoogleClout
Hey @Richard_Rivero,
I have completed this: https://www.cloudskillsboost.google/quests/217
Thank you for sharing this timed challenge.
Dataflow always takes a lot of time... 🙁
and it's hard to debug from the logs.
Hi everyone 😎
A new challenge was posted this week. 🏆
#GoogleClout
Google Clout - Change Streams for Cloud Spanner Challenge.
0️⃣5️⃣🟦🟥🟥🟥
1️⃣0️⃣🟦🟦🟦🟥
1️⃣5️⃣🟦🟦🟦🟦
2️⃣0️⃣⬜⬜⬜⬜
Challenge completed in 12 minutes and 33 seconds !
https://www.cloudskillsboost.google/quests/203
But streaming to BigQuery was taking time. It was unable to transform the JSON into DML and write it to the table in BigQuery.
I've changed the solving order to 3 -> 1 -> 2 -> 4; it may help save more time.
PS: Dataflow may take around two minutes to start, so you have time to finish the other steps.
Try No.1
#GoogleClout
Google Clout - Change Streams for Cloud Spanner Challenge.
0️⃣5️⃣ 🟦🟦🟥🟥
1️⃣0️⃣ 🟦🟦🟦🟦
1️⃣5️⃣ 🟦🟦🟦🟦
2️⃣0️⃣ ⬜⬜⬜⬜
Challenge completed in 10 minutes and 27 seconds !
Thank you @fuchengyen that was useful here
#GoogleClout
Google Clout - Change Streams for Cloud Spanner Challenge.
0️⃣5️⃣ 🟦🟦🟥🟥
1️⃣0️⃣ 🟦🟦🟦🟦
1️⃣5️⃣ ⬜⬜⬜⬜
2️⃣0️⃣ ⬜⬜⬜⬜
Challenge completed in 8 minutes and 2 seconds !
Try No.3
#GoogleClout
Google Clout - Change Streams for Cloud Spanner Challenge.
0️⃣5️⃣ 🟦🟦🟦🟥
1️⃣0️⃣ 🟦🟦🟦🟦
1️⃣5️⃣ ⬜⬜⬜⬜
2️⃣0️⃣ ⬜⬜⬜⬜
Challenge completed in 7 minutes and 13 seconds !
Hi,
for anyone who struggles with #GoogleClout Set 5 (4/10), the Change Streams for Cloud Spanner Challenge,
you can look at the following references, which helped me a lot:
[1] https://cloud.google.com/blog/products/spanner/change-streams-for-cloud-spanner-now-generally-availa...
[2] https://www.youtube.com/watch?v=fwyLNP3RtCs&t=318s&ab_channel=GoogleCloudTech
@Bholoubi Thanks for the feedback 😎
A small hint for anyone who wants to finish the challenge within 5 minutes:
(change the machine type of the Dataflow worker 😉)
#GoogleClout
Google Clout - Change Streams for Cloud Spanner Challenge.
0️⃣5️⃣ 🟦🟦🟦🟦
1️⃣0️⃣ ⬜⬜⬜⬜
1️⃣5️⃣ ⬜⬜⬜⬜
2️⃣0️⃣ ⬜⬜⬜⬜
Challenge completed in 4 minutes and 56 seconds !
https://www.cloudskillsboost.google/quests/203
Thank you, it worked for me.
Hello everyone.
The Challenge (#GoogleClout Set 5 (4/10))
---
I solved this challenge after reviewing these links from @You-Jun:
[1]
[2]
I also followed the steps from @fuchengyen, which helped me save more time.
First, step 3:
# I ran this line in the Cloud Shell Editor
gcloud config set compute/region [dynamically selected lab startup]
Then I opened the Dataflow Jobs page and created the job streamjob. [1]
Second, step 1:
# run the following in Cloud Shell
# create the Spanner metadata instance
gcloud spanner instances create [Cloud Spanner Meta Instance] \
--config=regional-[dynamically selected lab startup] \
--description="Spanner Meta" \
--nodes=1
# create the Spanner metadata database
gcloud spanner databases create [Cloud Spanner Meta Database] --instance=[Cloud Spanner Meta Instance]
# create the dataset in BigQuery
bq --location=[dynamically selected lab startup] mk [BigQuery Dataset]
Third, step 2:
Create the Cloud Spanner change stream named ordersstream on the orders table:
CREATE CHANGE STREAM ordersstream FOR orders;
Return to the Jobs page and check whether the Dataflow job has started running.
Fourth, step 4:
If the Dataflow job is running, open the Cloud Spanner query console and insert data into the orders table:
INSERT INTO orders(
OrderID,
CustomerID,
OrderDate,
Price,
ProductID)
VALUES(123, -- type: INT64
456, -- type: INT64
'2022-04-26', -- type: DATE
99, -- type: INT64
789 -- type: INT64
);
When you open BigQuery, you will find the rows in the table.
Good luck.
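A note on step 2 of the walkthrough above: if you want to stay entirely in Cloud Shell, the same DDL can likely be applied with gcloud as well. This is a sketch; the instance name `orders` and database name `orders-db` are taken from the Dataflow job parameters used in this lab, so verify them against your own environment:

```shell
# Apply the CREATE CHANGE STREAM DDL from the command line instead of the
# Spanner query console (instance/database names assumed from this lab).
gcloud spanner databases ddl update orders-db \
  --instance=orders \
  --ddl='CREATE CHANGE STREAM ordersstream FOR orders'
```

This saves a trip into the Spanner UI, which can matter when you are racing the clock.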
=========
This gets me to 50%; creating the job is a bit confusing. What should I do about that?
Thank you @fuchengyen for explaining step 3.
Awesome, it worked!
I am getting an error like this. Any idea what is going on?
Hello @Maragatham, I've met this error once, but it's hard to debug even after walking through the logs in depth. That's why I mentioned it in an earlier post.
The solution I found is to redo the Dataflow job. The name of the job can be something else, not "streamjob".
PS: Sorry, I could not get a screenshot; I've reached the play limit (only 5 retries).
I created a new account, redid GoogleClout game #4, got the error message, and figured out how to debug it. You need to scroll up further to find the Java exception message. In this example (refer to the photo), I was too late creating the change stream, as the Java exception message indicated.
Hello @fuchengyen @Maragatham
I have a question: in case of this error, does it mean that the Spanner change stream did not read the new row that I inserted through the query in step 4?
<a href="https://www.googlecloudcommunity.com/gc/Learning-Forums/New-Challenge-Show-off-your-cloud-skills-by-...">step 4</a>
Hi @Bholoubi ,
The error message I provided occurred as I was creating the Dataflow job:
because my step order is 3 -> 1 -> 2 -> 4, I was too late creating the change stream.
As @Bholoubi mentioned, step 4 inserts data into the table on which we created the change stream. If the Dataflow job is running, it detects that a change stream record occurred, processes it, and then writes to BigQuery.
You can see the stages in the job graph.
**Refer to the photo below.**
Thank you. @fuchengyen
Step 3 using the command line instead of the web UI:
gcloud config set compute/region [dynamically selected lab startup]
gcloud dataflow flex-template run streamjob --template-file-gcs-location \
gs://dataflow-templates-[dynamically selected lab startup]/latest/flex/Spanner_Change_Streams_to_BigQuery \
--region [dynamically selected lab startup] \
--parameters spannerInstanceId=orders,spannerDatabase=orders-db,spannerMetadataInstanceId=[Cloud Spanner Meta Instance],spannerMetadataDatabase=[Cloud Spanner Meta Database],spannerChangeStreamName=ordersstream,bigQueryDataset=[BigQuery Dataset]
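Building on the hint about changing the Dataflow worker's machine type, here is a hedged variant of the same command with an explicit (larger) worker, which may cut job startup and processing time. The machine type value and worker count are assumptions; the flags are standard `gcloud dataflow flex-template run` options:

```shell
# Same flex template as above, but pinning a larger worker machine type.
# e2-standard-4 is an illustrative choice, not the lab's required value.
gcloud dataflow flex-template run streamjob \
  --template-file-gcs-location \
    gs://dataflow-templates-[dynamically selected lab startup]/latest/flex/Spanner_Change_Streams_to_BigQuery \
  --region [dynamically selected lab startup] \
  --worker-machine-type e2-standard-4 \
  --num-workers 1 \
  --parameters spannerInstanceId=orders,spannerDatabase=orders-db,spannerMetadataInstanceId=[Cloud Spanner Meta Instance],spannerMetadataDatabase=[Cloud Spanner Meta Database],spannerChangeStreamName=ordersstream,bigQueryDataset=[BigQuery Dataset]
```

A bigger worker mostly helps the processing phase; the couple of minutes the job spends starting up is largely fixed, which is why doing this step first pays off.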
Hurray! I can go back and attempt the other challenges!
I just finished challenge 4
Seems to be a clone 🤖 of
Can we get a new challenge instead of repeated ones? 😅
What challenge do you mean, GoogleClout Set 6 (5/10)? I don't see the link.