Hello,
I'm trying to use the GCS connector in a POC. What I'm trying to achieve is to detect the creation of a file in a bucket and, after some processing, move this file to another bucket. I created a GCS connector and tried to use the Move Object action. In the interface the input parameter is connectorInputPayload (defined as JSON), but I have no clue about the format of this JSON.
I looked in the documentation (https://cloud.google.com/integration-connectors/docs/connectors/cloudstorage/configure), but the Move action is not even described there.
Where could I find some details about the input parameters?
Regards
My bad, the information is in the parameter details. The schema is given.
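Since the Move Object schema is only visible in the connector's parameter details, here is a minimal sketch of assembling such a connectorInputPayload. The field names below (Bucket, ObjectFilePath, DestinationBucket) are assumptions for illustration; the authoritative names come from the parameter schema shown in the UI.

```python
import json

# Hypothetical Move Object input payload. The real field names come from the
# parameter schema shown in the connector task UI; the ones below are guesses.
move_payload = {
    "Bucket": "source-bucket-01",           # assumed: bucket holding the object
    "ObjectFilePath": "incoming/file.csv",  # assumed: path of the object to move
    "DestinationBucket": "archive-bucket",  # assumed: target bucket
}

# connectorInputPayload is plain JSON, so the dict serializes directly.
print(json.dumps(move_payload, indent=2))
```

The same pattern applies to any connector action: read the field names off the schema in the parameter details, then build the JSON accordingly.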
Great that you found it! We'll make sure this also gets added to the docs. Thank you for your feedback.
By the way: for your use case, GCS has a feature to notify Pub/Sub when a new object is written to a bucket. You can then use a Pub/Sub trigger in Integration to take whatever actions you need, such as moving the object to a different bucket.
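To make the Pub/Sub route concrete, here is a small sketch of pulling the bucket and object name out of a GCS notification message. The attribute names (eventType, bucketId, objectId) follow the documented GCS notification format, but verify them against what your trigger actually delivers; the message below is simulated.

```python
import base64
import json

def extract_object_ref(pubsub_message: dict) -> tuple:
    """Pull bucket and object name out of a GCS notification message.

    GCS notifications carry the event type and object identifiers in the
    message attributes; the data field is the base64-encoded JSON object
    resource. Only new-object (OBJECT_FINALIZE) events are accepted here.
    """
    attrs = pubsub_message["attributes"]
    if attrs.get("eventType") != "OBJECT_FINALIZE":
        raise ValueError("not a new-object event")
    return attrs["bucketId"], attrs["objectId"]

# Simulated message, shaped like what the Pub/Sub trigger hands to the integration.
message = {
    "attributes": {
        "eventType": "OBJECT_FINALIZE",
        "bucketId": "bucket-test-01",
        "objectId": "image01.png",
    },
    "data": base64.b64encode(
        json.dumps({"bucket": "bucket-test-01", "name": "image01.png"}).encode()
    ).decode(),
}

bucket, name = extract_object_ref(message)
print(bucket, name)  # these values can feed the Move Object input payload
```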
Hello,
Thanks for the feedback. I will use the Pub/Sub notification to run my integration.
In my final use case, I would like to send the file from the bucket to an FTP server. I'm now struggling to find out how to get the content of the file from the bucket in order to pass it as input to the FTP connector. Would you have an example?
To achieve this use case, the Download Object action of the GCS connector can be used to download the file content, and the FTP connector's Upload action can then upload it.
The SFTP connector documentation has examples that could help in this regard:
https://cloud.google.com/integration-connectors/docs/connectors/sftp/configure
Thanks for your feedback.
I tried to use the GCS Download Object action but was puzzled by the fact that the input JSON requires a "FolderPath" parameter ("The path, or URI, to the file that will receive the data of the object."). What should I put there, as I don't have a specific path to write to?
The second point is that if I have to transfer a binary file (a zip, for example), the SFTP connector requires me to put the encoded content in the input JSON. Here again I don't see how to get the binary content from the bucket object.
I hope I didn't again miss an obvious hint.
Regards
FolderPath is not mandatory and can be skipped. Below is a sample payload; HasBytes is set to true to download the content in Base64-encoded format:
{
  "Bucket": "bucket-test-01",
  "ObjectFilePath": "image01.png",
  "HasBytes": true
}
connectorOutputPayload -> ContentBytes will contain the content in Base64-encoded format, which can be passed as input to the FTP connector's input payload.
We are working on documentation enhancements with examples for GCS; they will be published soon.
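The HasBytes round trip described above can be sketched with the standard library alone. The simulated ContentBytes value here stands in for what the connector returns; the point is that Base64 carries binary content (a PNG, a zip) losslessly between connectors.

```python
import base64

# Input payload for Download Object, as in the sample above; FolderPath omitted.
download_input = {
    "Bucket": "bucket-test-01",
    "ObjectFilePath": "image01.png",
    "HasBytes": True,
}

# Simulate the round trip: with HasBytes true, the connector returns the
# object's bytes Base64-encoded in connectorOutputPayload.ContentBytes.
original = b"\x89PNG\r\n\x1a\n"  # PNG magic bytes, stand-in for real content
content_bytes = base64.b64encode(original).decode("ascii")

# Downstream (e.g. the FTP connector input payload) can carry the string
# as-is; decoding restores the exact binary content.
assert base64.b64decode(content_bytes) == original
```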
Thanks for your hints and guidance. My little POC now works as expected. I'm looking forward to seeing your documentation updates.
I saw in the FTP connector that there is an option called "SSL Mode" that currently only offers the "None" value. Is support for the FTPS protocol a planned enhancement?
Thanks again for your help.
Regards
Philippe
Thank you for your valuable feedback. It helped us with the documentation enhancements, which are in progress.
SSL support for FTP is on the roadmap.
Sharing the documentation link for the GCS connector with the enhancements:
https://cloud.google.com/integration-connectors/docs/connectors/cloudstorage/configure
Hello,
Thanks for sharing.
Just a note: the documentation says to select the "Cloud Storage" connector, but the displayed value is "GCS".
It would be nice to rename the option in the interface or to add this information to the help.
Second point, concerning authentication: once you have selected a service account, there are no authentication options to select.
The section that gives details about the actions is nice and very detailed. Do you plan to also give details about the entity options?
The current documentation is clearly a good improvement from my point of view.
Regards
Philippe
Thanks for the feedback on naming; we will work on aligning it.
As it is service-account-based authentication, it does not require any additional selection in the Authentication section.
For entity operations, the link below, which is also referenced in the entities section of the Cloud Storage connector document, should be helpful. Listing the entity options is not in the current plan.
https://cloud.google.com/application-integration/docs/connectors-task#config-prop
@Madhuvandhini wrote: "As it is service-account-based authentication, it does not require any additional selection in the Authentication section."
Yes, I just wanted to point out that the documentation is somewhat misleading in the Authentication section, as the parameters shown there don't exist for the GCS connector (or I missed an option).
Regards
Sure. We will take that feedback into account for the documentation.