I have been trying to build a pipeline from several generic examples. The Python KFP SDK produced the following YAML file, but whenever I upload it on the Vertex AI Pipelines page I get the error "Invalid File Content".
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: tfrecord-processing-pipeline-
  annotations: {pipelines.kubeflow.org/kfp_sdk_version: 1.8.22, pipelines.kubeflow.org/pipeline_compilation_time: '2023-05-23T21:47:19.096627',
    pipelines.kubeflow.org/pipeline_spec: '{"inputs": [{"default": "gs://aa.bb.cc.dd/data",
      "name": "tfrecord_dir", "optional": true, "type": "String"}], "name": "TFRecord
      Processing Pipeline"}'}
  labels: {pipelines.kubeflow.org/kfp_sdk_version: 1.8.22}
spec:
  entrypoint: tfrecord-processing-pipeline
  templates:
  - name: load-tfrecord-dataset
    container:
      args:
      - "\n import tensorflow as tf\n file_pattern = \"{{inputs.parameters.tfrecord_dir}}\"\
        \ + '/*.block'\n files = tf.io.gfile.glob(file_pattern)\n \
        \ for file in files:\n dataset = tf.data.TFRecordDataset(file)\n\
        \ "
      command: [python, -c]
      image: tensorflow/tensorflow:2.6.0
    inputs:
      parameters:
      - {name: tfrecord_dir}
    metadata:
      labels:
        pipelines.kubeflow.org/kfp_sdk_version: 1.8.22
        pipelines.kubeflow.org/pipeline-sdk-type: kfp
        pipelines.kubeflow.org/enable_caching: "true"
  - name: tfrecord-processing-pipeline
    inputs:
      parameters:
      - {name: tfrecord_dir}
    dag:
      tasks:
      - name: load-tfrecord-dataset
        template: load-tfrecord-dataset
        arguments:
          parameters:
          - {name: tfrecord_dir, value: '{{inputs.parameters.tfrecord_dir}}'}
  arguments:
    parameters:
    - {name: tfrecord_dir, value: 'gs://aa.bb.cc.dd/data'}
  serviceAccountName: pipeline-runner
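One detail that may be relevant: the file above is an Argo Workflow manifest, which is what the classic kfp 1.8.x compiler emits, whereas Vertex AI Pipelines (as far as I understand) accepts only KFP v2 PipelineSpec output in JSON. A minimal sketch of a check for this, where `sniff_pipeline_format` is a hypothetical helper written just for illustration:

```python
def sniff_pipeline_format(text: str) -> str:
    """Rough guess at which format a compiled pipeline file is in."""
    stripped = text.lstrip()
    if stripped.startswith("{"):
        # KFP v2 compilers emit a JSON PipelineSpec document
        return "kfp-v2-json"
    if "apiVersion: argoproj.io" in text:
        # Classic KFP v1 compilers emit an Argo Workflow in YAML
        return "argo-workflow-yaml"
    return "unknown"

# First few lines of the compiled file from this question
compiled = """\
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: tfrecord-processing-pipeline-
"""

print(sniff_pipeline_format(compiled))  # prints "argo-workflow-yaml"
```

If that diagnosis is right, the fix would be recompiling with the v2-compatible compiler (e.g. `kfp.v2.compiler` in SDK 1.8.x) so the output is a JSON PipelineSpec rather than this YAML.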
Any thoughts?