
Drop column from large BigQuery table protected by CMEK

We are onboarding our data warehouse to BigQuery according to company direction. We are facing a problem that blocks our onboarding: "Cannot DROP COLUMN from tables protected by customer-managed encryption keys." I know DROP COLUMN is not supported by BigQuery on CMEK-protected tables, but per our company's policy we must use CMEK. I want to know the best practice for dropping a column under this restriction (the statement we are blocked on is sketched after the list below). Some more details about our use case:

  1. the table is large, with >100M new records per day.
  2. writes to the table are very frequent: data is inserted in micro-batches every 30 seconds.
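For reference, this is roughly the DDL we would like to run; the project, dataset, table, and column names below are placeholders for illustration only:

    -- The statement we are blocked on: it fails on the CMEK-protected table.
    ALTER TABLE `my_project.my_dataset.events`
    DROP COLUMN obsolete_column;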

Available solutions I'm aware of so far:

  1. CREATE OR REPLACE TABLE temp AS SELECT * EXCEPT (column_name). I understand this effectively copies almost all of the data and replaces the original table (see the first sketch after this list). Given the table's large data volume, will this block reads of the table and writes of new data for a long duration?
  2. Never drop the column from the table, but create a view in front of the table that controls which columns are visible (see the second sketch after this list). Upstream, we can insert a default NULL value for the columns that need to be dropped, while keeping them in the BigQuery table. Given BigQuery's columnar storage, can I assume a column containing only NULL values has very low storage cost? The con I see with this approach is that in the long term we may hit the column count limit.
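For approach 1, a minimal sketch of what I mean, using placeholder names (my_project.my_dataset.events, obsolete_column, and the key path) and assuming the CMEK key is re-applied explicitly via the kms_key_name table option:

    -- Rebuild the table without the unwanted column and re-apply the CMEK key.
    -- Partitioning and clustering specs would also need to be re-declared here.
    CREATE OR REPLACE TABLE `my_project.my_dataset.events`
    OPTIONS (
      kms_key_name = 'projects/my_project/locations/us/keyRings/my_ring/cryptoKeys/my_key'
    )
    AS
    SELECT * EXCEPT (obsolete_column)
    FROM `my_project.my_dataset.events`;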
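For approach 2, a sketch of the view-based workaround with the same placeholder names; downstream readers would query the view while upstream writers keep inserting NULL into the retired column:

    -- View that hides the logically dropped column from readers.
    CREATE OR REPLACE VIEW `my_project.my_dataset.events_v` AS
    SELECT * EXCEPT (obsolete_column)
    FROM `my_project.my_dataset.events`;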

Besides the above approaches, is there a better solution?
