I have a few files uploaded to my GCS bucket. I need to copy them to another bucket using Python, encrypting them with a customer-supplied encryption key (CSEK). The copy_blob method does not accept encryption keys, and I do not want to download the files, so upload_from_filename is useless. What I want is this: as soon as the files are uploaded, they should be encrypted with the CSEK and copied to a different bucket.
Thanks!
You can use the following to upload a local file to a bucket with a CSEK:
import base64
from google.cloud import storage


def upload_encrypted_blob(
    bucket_name,
    source_file_name,
    destination_blob_name,
    base64_encryption_key,
):
    """Uploads a file to a Google Cloud Storage bucket using a custom
    encryption key.

    The file will be encrypted by Google Cloud Storage and only
    retrievable using the provided encryption key.
    """
    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)

    # Encryption key must be an AES256 key represented as a bytestring
    # with 32 bytes. Since it's passed in as a base64 encoded string, it
    # needs to be decoded.
    encryption_key = base64.b64decode(base64_encryption_key)

    blob = bucket.blob(
        destination_blob_name, encryption_key=encryption_key
    )
    blob.upload_from_filename(source_file_name)

    print(
        f"File {source_file_name} uploaded to {destination_blob_name}."
    )
Hi Andre,
As mentioned in the post, my files are already in GCS and are large (multiple GBs). upload_from_filename does not take a GCS path as the source file name. My requirement is to copy the files to a different bucket with CSEK without downloading them.