Hello Community,
In our project, we need a Dataform workflow to trigger a Cloud Run service via an endpoint.
For this, we decided to use a BigQuery remote function and call it from our Dataform workflow.
The BQ remote function looks like this:
CREATE FUNCTION `PROJECT_ID.DATASET_ID`.remote_add(x INT64, y INT64) RETURNS INT64
REMOTE WITH CONNECTION `PROJECT_ID.LOCATION.CONNECTION_NAME`
OPTIONS (
  endpoint = 'ENDPOINT_URL'
);
Where:
PROJECT_ID: the ID of your Google Cloud project.
DATASET_ID: the ID of your BigQuery dataset.
LOCATION and CONNECTION_NAME: the location and name of the BigQuery connection used to reach the endpoint.
ENDPOINT_URL: the URL of your Cloud Function or Cloud Run remote function endpoint.
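We would then call it from a Dataform model roughly like this (just a sketch using the placeholder names above):

config { type: "table" }

SELECT
  -- each call to the remote function makes BigQuery invoke the endpoint behind it
  `PROJECT_ID.DATASET_ID`.remote_add(1, 2) AS result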
The question is: can we create the BQ remote function directly from Dataform instead of creating it manually from the BigQuery user interface?
Thank you and Best regards,
Valentin
I think you can implement this using "Operations" in Dataform (see the sketch below), but I do not understand why you would want to create a function from Dataform.
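For example, an operations file along these lines should do it (untested sketch, reusing the placeholders from your post):

config { type: "operations" }

-- Executed as a plain SQL statement when the Dataform workflow runs,
-- so the remote function is (re)created as part of the run.
CREATE OR REPLACE FUNCTION `PROJECT_ID.DATASET_ID`.remote_add(x INT64, y INT64) RETURNS INT64
REMOTE WITH CONNECTION `PROJECT_ID.LOCATION.CONNECTION_NAME`
OPTIONS (
  endpoint = 'ENDPOINT_URL'
);

Keep in mind that the connection itself, and the permission for its service account to invoke the Cloud Run endpoint, still have to exist before this runs; Dataform only executes the SQL.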
We also work with remote functions, but we maintain them via Terraform and interact with them from Dataform.
Hello @mongaa,
Thank you for the response. Can you please provide an example in Terraform where you create the remote function? As I mentioned above, we need a remote function that triggers a Cloud Run endpoint.
Thank you!
Our Terraform repository maintains multiple projects. Inside our Dataform project we have a subfolder for functions, and within it each function gets its own folder where we maintain the Python files and the requirements file.
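I can't share our exact code, but on the BigQuery side it boils down to a connection plus a routine with remote_function_options; roughly like this (untested sketch using your placeholders; deploying the Python/Cloud Run code itself is a separate step):

# Connection whose service account BigQuery uses to call the endpoint.
resource "google_bigquery_connection" "remote_fn" {
  project       = "PROJECT_ID"
  connection_id = "CONNECTION_NAME"
  location      = "LOCATION"
  cloud_resource {}
}

# The remote function itself, pointing at the Cloud Run endpoint.
resource "google_bigquery_routine" "remote_add" {
  project         = "PROJECT_ID"
  dataset_id      = "DATASET_ID"
  routine_id      = "remote_add"
  routine_type    = "SCALAR_FUNCTION"
  definition_body = "" # left empty for remote functions

  arguments {
    name      = "x"
    data_type = "{\"typeKind\": \"INT64\"}"
  }
  arguments {
    name      = "y"
    data_type = "{\"typeKind\": \"INT64\"}"
  }
  return_type = "{\"typeKind\": \"INT64\"}"

  remote_function_options {
    endpoint   = "ENDPOINT_URL"
    connection = google_bigquery_connection.remote_fn.name
  }
}

On top of that you still grant the connection's service account roles/run.invoker on the Cloud Run service.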
Note: please check how many parallel calls you can make to a remote function from Dataform.
For us this was a problem: we run our entire data warehouse on Dataform and had to make many parallel calls to the remote function. It was a limitation at first, but after several sessions with Google they raised this quota for us.
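Not a fix for the quota itself, but if the endpoint struggles with the volume per call, the OPTIONS clause of the remote function also accepts max_batching_rows, which caps how many rows BigQuery sends in a single HTTP request; for example (same placeholders as above):

CREATE OR REPLACE FUNCTION `PROJECT_ID.DATASET_ID`.remote_add(x INT64, y INT64) RETURNS INT64
REMOTE WITH CONNECTION `PROJECT_ID.LOCATION.CONNECTION_NAME`
OPTIONS (
  endpoint = 'ENDPOINT_URL',
  max_batching_rows = 50  -- at most 50 rows per HTTP request to the endpoint
);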