I have some Dataproc Serverless (batch) jobs that read data from GCS and write to BigQuery. In each job I capture some counts using df.count() and then write the count as a structured JSON log with the Cloud Logging client library (from google.cloud import logging, then calling log_struct({"count": df.count()}) on a logger). I can see the count in JSON format in Cloud Logging, but when I try to create a log-based metric from this log, I only see counter or distribution metric types.
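For context, this is roughly how the count is being logged (a minimal sketch; the GCS path and log name are placeholders, and log_struct is called on a Logger obtained from the Cloud Logging client):

```python
from pyspark.sql import SparkSession
from google.cloud import logging as cloud_logging

spark = SparkSession.builder.getOrCreate()

# Placeholder input path; the real job reads from GCS and writes to BigQuery
df = spark.read.parquet("gs://my-bucket/input/")

row_count = df.count()

# Emit the count as a structured (JSON) log entry to Cloud Logging
client = cloud_logging.Client()
logger = client.logger("dataproc-batch-counts")  # hypothetical log name
logger.log_struct({"count": row_count}, severity="INFO")
```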
I want to create a dashboard that simply plots the count value for each run. Is there any other way to achieve this, or do I need to create a distribution-type metric and then build a dashboard from that metric in Monitoring?