I'm new to data engineering and to MongoDB, so I'd appreciate some help, please.
I have a MongoDB database that needs to be migrated to BigQuery. The goal is to keep the BigQuery tables in sync with the MongoDB collections without creating duplicate records.
I initially used the batch MongoDB to BigQuery Dataflow template, but it writes with WRITE_APPEND, and repeated runs left the target tables with millions of duplicate records.
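As a workaround, I am considering deduplicating the target table after each batch run. Below is only a sketch: the project, dataset, and table names are placeholders, and the `id` and `timestamp` columns are my assumption about the template's default output schema (the names would need adjusting if the schema differs).

```python
# Sketch: rewrite the table, keeping only the newest row per MongoDB _id.
# "my-project", "my_dataset", and "my_table" are placeholders; the `id`
# and `timestamp` column names are assumptions about the template output.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project id

dedup_sql = """
CREATE OR REPLACE TABLE `my-project.my_dataset.my_table` AS
SELECT * EXCEPT (row_num)
FROM (
  SELECT
    *,
    ROW_NUMBER() OVER (
      PARTITION BY id              -- one row per MongoDB document _id
      ORDER BY `timestamp` DESC    -- keep the most recently loaded copy
    ) AS row_num
  FROM `my-project.my_dataset.my_table`
)
WHERE row_num = 1
"""

client.query(dedup_sql).result()  # wait for the table rewrite to finish
print("Deduplication finished")
```

I realize CREATE OR REPLACE TABLE drops partitioning and clustering settings, so a MERGE statement might be safer, but this is the simplest version I could come up with.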
Later I tried the MongoDB to BigQuery (Stream) template, but the launch failed with this error:
Failed to read the result file : gs://dataflow-staging-southamerica-east1-287781428749/staging/template_launches/2025-04-22_06_30_53-16695916874292364341/operation_result with error message: Calling GetObjectMetadata with file "/bigstore/dataflow-staging-southamerica-east1-287781428749/staging/template_launches/2025-04-22_06_30_53-16695916874292364341/operation_result/": cloud.bigstore.ResponseCode.ErrorCode::OBJECT_NOT_FOUND: No such object: dataflow-staging-southamerica-east1-287781428749/staging/template_launches/2025-04-22_06_30_53-16695916874292364341/operation_result/ [google.rpc.error_details_ext] { message: "No such object: dataflow-staging-southamerica-east1-287781428749/staging/template_launches/2025-04-22_06_30_53-16695916874292364341/operation_result/" details { [type.googleapis.com/google.rpc.DebugInfo] { stack_entries: "com.google.net.rpc3.client.RpcClientException: APPLICATION_ERROR;cloud.bigstore (...)
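From what I could find, this error can appear when the template launcher fails before it writes its operation_result file to the staging bucket, and missing write permissions on that bucket is one possible cause. Here is a quick check I can run to rule that out (the bucket name is copied from the error above; the project id is a placeholder, and the check uses my own Application Default Credentials rather than the job's service account):

```python
# Sketch: verify write access to the Dataflow staging bucket named in the
# error. "my-project" is a placeholder; this runs under my own credentials.
from google.cloud import storage

client = storage.Client(project="my-project")
bucket = client.bucket("dataflow-staging-southamerica-east1-287781428749")

blob = bucket.blob("staging/permission_check.txt")
blob.upload_from_string("write test")  # raises Forbidden if access is missing
blob.delete()
print("write access to the staging bucket OK")
```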
Questions:
1. How can I avoid the duplicate records when using the batch template, given that it writes with WRITE_APPEND?
2. What does the streaming template launch error above mean, and how can I fix it?
Any help is welcome, as I have been looking for solutions for a few weeks now.
Thank you.