I'm using Cloud Scheduler to trigger a workflow that calls a Cloud Run function, and I'm trying to pass a JSON array in the body of the scheduler job so that it is automatically picked up by the workflow.
When I give the input below to Cloud Scheduler, the following output is generated in the workflow and an error is thrown.
Input to Cloud Scheduler:
{
"argument": "[
{
\"cloud_function_url\": \"url\",
\"dbt_service_url\": \"service_url\",
\"export_query\": \"select * from p_1.d_1.t_1_daily_view LIMIT 1000000000\",
\"upload_bucket\": \"bucket_name\",
},
{
\"cloud_function_url\": \"url\",
\"dbt_service_url\": \"service_url\",
\"export_query\": \"select * from dde-res-people-d001.ast_forsta.candidate_survey_view LIMIT 1000000000\",
\"upload_bucket\": \"bucket_name\",
}
]",
"callLogLevel": "LOG_ALL_CALLS"
}
Output received in the workflow:
[
{
"cloud_function_url": "cloud_function",
"dbt_service_url": "url",
"export_query": "select * from p_1.d_1.t_2 LIMIT 1000000000",
"upload_bucket": "bucket_name"
},
{
"cloud_function_url": "cloud_function",
"dbt_service_url": "url",
"export_query": "select * from p_2.d_2.t_2 LIMIT 1000000000",
"upload_bucket": "bucket_name"
}
]
And it throws the following error:
TypeError: in lookup, expecting dictionary, got: array
in step "initialize_workflow", routine "main", line: 6
{
"message": "TypeError: in lookup, expecting dictionary, got: array",
"tags": [
"TypeError"
]
}
My question is: can a JSON array in the body of the Cloud Scheduler job be used as the input to a workflow? I tried using json.encode and json.decode in the workflow YAML, and the same error comes out in a different form: "TypeError: in lookup, expecting array, got: dictionary".
Any idea how to achieve this?
Hi @k_pavan98,
The problem arises because the argument is delivered to your workflow as a top-level JSON array, while the workflow code expects a dictionary, so the key lookup at line 6 of initialize_workflow fails.
You may try these recommendations:
1. Pass a Proper JSON Object:
Modify the Cloud Scheduler payload so that the argument is a JSON object wrapping the array, rather than a bare array. Note that argument remains a JSON-formatted string, and callLogLevel belongs at the top level of the body, next to argument:
{
"argument": "{
\"data\": [
{
\"cloud_function_url\": \"url\",
\"dbt_service_url\": \"service_url\",
\"export_query\": \"select * from p_1.d_1.t_1_daily_view LIMIT 1000000000\",
\"upload_bucket\": \"bucket_name\"
},
{
\"cloud_function_url\": \"url\",
\"dbt_service_url\": \"service_url\",
\"export_query\": \"select * from dde-res-people-d001.ast_forsta.candidate_survey_view LIMIT 1000000000\",
\"upload_bucket\": \"bucket_name\"
}
]
}",
"callLogLevel": "LOG_ALL_CALLS"
}
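If your scheduler body must be strictly valid JSON (an embedded string value cannot contain literal line breaks), the same argument can be written as a single escaped line, for example:
{
"argument": "{\"data\": [{\"cloud_function_url\": \"url\", \"dbt_service_url\": \"service_url\", \"export_query\": \"select * from p_1.d_1.t_1_daily_view LIMIT 1000000000\", \"upload_bucket\": \"bucket_name\"}, {\"cloud_function_url\": \"url\", \"dbt_service_url\": \"service_url\", \"export_query\": \"select * from dde-res-people-d001.ast_forsta.candidate_survey_view LIMIT 1000000000\", \"upload_bucket\": \"bucket_name\"}]}",
"callLogLevel": "LOG_ALL_CALLS"
}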
2. Update the Workflow YAML Code:
In the workflow, the argument string is already parsed into a dictionary before main runs, so the array can be read directly; no json.decode is needed on an already-parsed value:
main:
  params: [argument]
  steps:
    - initialize_workflow:
        assign:
          # argument.data is already a list, not a string
          - input_data: ${argument.data}
    - process_step:
        call: some_other_function
        args:
          data: ${input_data}
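As a follow-up, here is a minimal sketch of iterating over the array and calling each Cloud Run function with Workflows' built-in http.post. The body fields export_query and upload_bucket are assumptions about your function's contract, so adapt them as needed:
    - process_each:
        for:
          value: item
          in: ${input_data}
          steps:
            - call_cloud_function:
                call: http.post
                args:
                  url: ${item.cloud_function_url}
                  auth:
                    type: OIDC
                  body:
                    # hypothetical request fields; match your Cloud Run function's API
                    export_query: ${item.export_query}
                    upload_bucket: ${item.upload_bucket}
                result: call_result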
3. Debugging and Validation:
* Make sure the quotes in the payload sent from Cloud Scheduler are properly escaped, and remove the trailing commas after the \"upload_bucket\" entries in your original payload; they make the embedded JSON invalid.
* Run the adjusted payload against the updated workflow YAML to confirm the problem is fixed.
4. Option with Base64 Encoding: If problems with JSON serialization persist, Base64-encode the payload in Cloud Scheduler and decode it in the workflow, as sketched below.
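For instance, a minimal decoding sketch, assuming the scheduler sends the Base64 string of the array as the data field of the argument (the step and variable names are illustrative):
    - decode_payload:
        assign:
          # base64.decode returns bytes, text.decode turns them back into a JSON string,
          # and json.decode parses that string into a list the workflow can iterate over
          - input_data: ${json.decode(text.decode(base64.decode(argument.data), "UTF-8"))}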
Trying the suggestions above may resolve your concern. Hoping this would be helpful!