I'm using the Google-provided Dataflow template 'GCS Parquet to Cloud Bigtable' (gcs_parquet_to_cloud_bigtable) to migrate a Parquet file from GCS to Bigtable, but I'm getting the error below. The job fails in the 'Read From Parquet' stage.
Error from Dataflow,
Workflow failed. Causes: Error: Message: Invalid value for field 'resource.properties.networkInterfaces[0].subnetwork': ''. Network interface must specify a subnet if the network resource is in custom subnet mode. HTTP Code: 400
It seems you either missed providing the subnet name or gave it in the wrong format. Since you are using a custom VPC rather than the default network, the subnetwork must be specified explicitly.
Can you paste a screenshot of the subnetwork field in the Dataflow job?
I had to specify the subnetwork explicitly in the Dataflow job to resolve this error, using the full resource URL:
"https://www.googleapis.com/compute/v1/projects/my-project/regions/asia-south1/subnetworks/default"