
Uploading files from SAP HANA to GCP Cloud Storage

Hello community,

I have another question about SAP <> GCP.

I want to generate an endpoint that will be used for uploading files into a Google Cloud Storage bucket. That endpoint should stay active for about 6 months. The team uploading the files doesn't, and won't, have access to GCP.

Files can be larger than 100 MB. I've tried using a Cloud Run service for the upload, but as we all know there's a 32 MB request limit on Cloud Run. I also don't understand why I must specify an object name when generating a signed URL for the upload. And while I know about signed URLs, I don't think they're a good fit here: their validity is capped at 7 days, so I would need to regenerate and resend the URL at least once a month, which (if you agree) isn't really a good option.

So please share some good options I can build on my side, so that the client side needs as few changes as possible.

Thanks, Zeljana.

Solved
1 ACCEPTED SOLUTION

Hi @zeljana,

Welcome to Google Cloud Community!

To answer your questions, when you create a signed URL to upload a file to Google Cloud Storage, you have to include the name of the object you're uploading. That’s just how GCS works—it’s an object storage system, so every file needs a unique name in the bucket. It might feel like you're just picking a filename, but you're actually setting the key that identifies the file in storage. This helps GCS keep things organized and makes it easy to find and manage your files later.
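To make that concrete, here's a rough Python sketch of generating a V4 signed upload URL with the `google-cloud-storage` client library (the bucket and object names are placeholders). It also shows why the 7-day cap you mentioned is a hard limit: any longer lifetime has to be clamped.

```python
from datetime import timedelta

# GCS caps signed URL validity at 7 days (604800 seconds).
MAX_SIGNED_URL_LIFETIME = timedelta(days=7)


def clamp_expiration(requested: timedelta) -> timedelta:
    """Clamp a requested lifetime to the 7-day maximum GCS allows."""
    return min(requested, MAX_SIGNED_URL_LIFETIME)


def generate_upload_url(bucket_name: str, object_name: str,
                        lifetime: timedelta = timedelta(hours=1)) -> str:
    """Generate a V4 signed URL allowing a single PUT of `object_name`.

    Requires `pip install google-cloud-storage` and credentials that can
    sign (e.g. a service account key).
    """
    from google.cloud import storage  # imported here so the sketch stays optional

    client = storage.Client()
    blob = client.bucket(bucket_name).blob(object_name)
    # The object name is baked into the signature; that's why it must be
    # known up front when the URL is created.
    return blob.generate_signed_url(
        version="v4",
        expiration=clamp_expiration(lifetime),
        method="PUT",
    )


if __name__ == "__main__":
    # Hypothetical names, purely for illustration.
    print(generate_upload_url("my-sap-bucket", "exports/report.csv"))
```

The client uploading with this URL then just does an HTTP PUT of the file body to it, no GCP credentials needed, but only for that one object name and only until the URL expires.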

Given what you need, the best way to go is to set up a secure proxy service on Google Cloud. This would sit between your SAP system and Google Cloud Storage, giving your SAP team a reliable place to send files—without them needing direct access or credentials for GCP.

  • Build a Proxy Service: You'd create a custom app running on Google Compute Engine, or on Cloud Run if it's set up for large uploads (the 32 MB limit applies to HTTP/1 requests, so streaming over HTTP/2 can get around it), though that may need some changes on the SAP side. This app will expose a secure HTTPS endpoint that SAP can send files to.
  • Handle Authentication: The proxy will take care of verifying who’s allowed in. You could use something like API keys managed by API Gateway, or go a step further with client certificates for mutual TLS, so only trusted SAP systems can connect.
  • Use a GCP Service Account for GCS Access: The proxy itself would use a service account with permission to upload files to your Google Cloud Storage bucket—specifically, it just needs the storage.objectCreator role.
  • Support Big File Uploads: For files over 100MB, you’ll want to use GCS’s resumable uploads. That way, files go up in chunks and can be resumed if the connection drops partway through.
  • Keep the Endpoint Stable: This endpoint will stay live for as long as you need—say, 6 months—so the SAP side doesn’t need to constantly update or regenerate anything.
  • Manage File Names: You can either let SAP include a filename when sending the file, or have the proxy auto-generate one.
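To sketch the upload side of such a proxy, here's how it could stream a large file to GCS with the resumable upload protocol. This uses the raw JSON API with `urllib` so the chunking is visible; in practice the `google-cloud-storage` client's `upload_from_file` does this for you. The bucket name is a placeholder, and getting the service account's access token (e.g. from the metadata server on GCE/Cloud Run) is outside this sketch.

```python
import io
import urllib.error
import urllib.request

# Chunk size must be a multiple of 256 KiB (except for the final chunk).
CHUNK = 8 * 1024 * 1024  # 8 MiB


def content_range(start: int, end: int, total: int) -> str:
    """Format the Content-Range header for one resumable-upload chunk."""
    return f"bytes {start}-{end}/{total}"


def start_resumable_session(bucket: str, object_name: str, token: str) -> str:
    """Open a resumable upload session and return its session URI."""
    url = (f"https://storage.googleapis.com/upload/storage/v1/b/{bucket}/o"
           f"?uploadType=resumable&name={object_name}")
    req = urllib.request.Request(url, method="POST", headers={
        "Authorization": f"Bearer {token}",
        "Content-Length": "0",
    })
    with urllib.request.urlopen(req) as resp:
        return resp.headers["Location"]  # session URI for the chunk PUTs


def upload_stream(session_uri: str, stream: io.BufferedIOBase, total: int) -> None:
    """PUT the stream to the session URI in CHUNK-sized pieces."""
    offset = 0
    while offset < total:
        data = stream.read(CHUNK)
        end = offset + len(data) - 1
        req = urllib.request.Request(session_uri, data=data, method="PUT", headers={
            "Content-Range": content_range(offset, end, total),
        })
        try:
            urllib.request.urlopen(req)
        except urllib.error.HTTPError as e:
            # GCS answers 308 ("Resume Incomplete") for every chunk except
            # the last one; anything else is a real error.
            if e.code != 308:
                raise
        offset += len(data)
```

If a chunk PUT fails partway through, the proxy can query the session URI with an empty `Content-Range: bytes */total` PUT to find out how much GCS already has, then resume from that offset, which is exactly what makes this approach robust for 100 MB+ files.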

By going with this kind of proxy setup, you're hiding all the tricky GCS details from SAP and giving them a simple, secure way to send files. It's flexible and designed to run for months without much maintenance.

Was this helpful? If so, please accept this answer as “Solution”. If you need additional assistance, reply here within 2 business days and I’ll be happy to help.




Hi,

Thank you, Joy. I will try this out and hopefully it will work for my use case.