
Assigning a BQ table Entry in Data Catalog to a custom EntryGroup

Hi,

We are trying to create custom EntryGroups in GCP Data Catalog and then populate them with different GCP and non-GCP assets.

I used the Python client API to create a custom EntryGroup and a custom Entry representing a Kafka topic, and everything worked well.
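For reference, this is roughly what I did. A minimal sketch, assuming `google-cloud-datacatalog` is installed and application-default credentials are configured; all IDs and names (`my_entry_group`, the Kafka cluster/topic, the location) are placeholders:

```python
# Sketch: create a custom EntryGroup and a custom Entry for a Kafka topic.
# Requires google-cloud-datacatalog and GCP credentials; all IDs below are
# placeholders.

def kafka_linked_resource(cluster: str, topic: str) -> str:
    """Build a linked_resource string for a non-GCP asset (format is up to us)."""
    return f"//kafka.example.com/clusters/{cluster}/topics/{topic}"


def create_kafka_entry(project: str, location: str = "us-central1"):
    # Imported inside the function so the pure helper above can be used
    # without the client library installed.
    from google.cloud import datacatalog_v1

    client = datacatalog_v1.DataCatalogClient()
    parent = f"projects/{project}/locations/{location}"

    # Custom EntryGroup.
    group = client.create_entry_group(
        parent=parent,
        entry_group_id="my_entry_group",
        entry_group=datacatalog_v1.EntryGroup(display_name="Streaming assets"),
    )

    # Custom Entry representing the Kafka topic.
    entry = datacatalog_v1.Entry(
        display_name="orders-topic",
        user_specified_system="kafka",
        user_specified_type="topic",
        linked_resource=kafka_linked_resource("prod", "orders"),
    )
    return client.create_entry(
        parent=group.name, entry_id="orders_topic", entry=entry
    )
```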

My issue is doing the same for an already existing BQ table in our GCP project. From the UI and the documentation, I can see that every BQ table automatically gets a Data Catalog Entry, which is assigned to the `BigQuery` EntryGroup.

Now, my question is: is there any way for me to take the existing Entry for the BQ table and assign it to my custom-made EntryGroup?

So far, I have not found a way to do this via the API. This causes two issues for us:

  1. We have to create a custom-made Entry for every BQ table in our project. This seems unnecessary and negates much of the integration value of using Data Catalog when we are already on GCP.
  2. When I created the custom Entry, even though I set `linked_resource` to point at the BQ table, I expected the Entry to be enriched with all the metadata available for that table, such as its schema. That did not happen, which again makes me question the level of integration.

Does anyone have any tips or workarounds to address this issue I am facing? Any help would be appreciated!

 

EDIT: I found that using the lookup method I can retrieve the Entry that represents the BQ table, but I still cannot assign that Entry to a custom-made EntryGroup.
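For anyone else hitting this, the lookup call looks roughly like this (a sketch; requires the Python client and credentials, and the project/dataset/table names are placeholders):

```python
# Sketch: look up the auto-created Data Catalog Entry for a BigQuery table.

def bq_linked_resource(project: str, dataset: str, table: str) -> str:
    """linked_resource format Data Catalog uses for BigQuery tables."""
    return (
        "//bigquery.googleapis.com/projects/"
        f"{project}/datasets/{dataset}/tables/{table}"
    )


def lookup_bq_entry(project: str, dataset: str, table: str):
    from google.cloud import datacatalog_v1  # local import: sketch only

    client = datacatalog_v1.DataCatalogClient()
    # Returns the auto-created Entry; its name sits under the Google-managed
    # "@bigquery" EntryGroup, and that parent cannot be changed.
    return client.lookup_entry(
        request={"linked_resource": bq_linked_resource(project, dataset, table)}
    )
```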

1 ACCEPTED SOLUTION

Hi @blueish_27,

Welcome to Google Cloud Community!

Data Catalog doesn't let you directly move those existing BigQuery Entries to different EntryGroups. Google automatically manages those entries.

Here are some workarounds that you can do:

  • Use Tags (Best for Organization): Tag your custom EntryGroup and the relevant BigQuery table Entries with the same keywords.
  • Manually Mirror Metadata (More Work): Create your own Entry in your custom EntryGroup and copy the BigQuery table's information into it (such as its schema). You'll need to keep this copy updated yourself if the BigQuery table changes.
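The tagging approach above could be sketched like this, assuming you have already created a TagTemplate with a string field named `keyword`; all IDs are placeholders:

```python
# Sketch: attach the same keyword Tag to a BigQuery Entry and your custom
# Entries so they can be found together via search. Assumes a TagTemplate
# with a string field "keyword" already exists; all IDs are placeholders.

def tag_template_name(project: str, location: str, template_id: str) -> str:
    """Resource name of a TagTemplate."""
    return f"projects/{project}/locations/{location}/tagTemplates/{template_id}"


def attach_keyword_tag(entry_name: str, template: str, keyword: str):
    from google.cloud import datacatalog_v1  # local import: sketch only

    client = datacatalog_v1.DataCatalogClient()
    tag = datacatalog_v1.Tag()
    tag.template = template
    tag.fields["keyword"] = datacatalog_v1.TagField(string_value=keyword)
    # Call this once with the BigQuery table's Entry name (from lookup_entry)
    # and once with each custom Entry's name.
    return client.create_tag(parent=entry_name, tag=tag)
```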

Google Cloud Data Catalog doesn't offer automatic metadata syncing for custom entries or moving BigQuery entries between EntryGroups.
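Since there is no automatic sync, the manual mirroring could be sketched like this: look up the auto-created BigQuery Entry, then copy its metadata into a custom Entry. This is an illustrative sketch (placeholder IDs; requires the Python client and credentials), and you would need to re-run it whenever the table changes:

```python
# Sketch: mirror a BigQuery table's Data Catalog metadata into a custom Entry.

def bq_table_resource(project: str, dataset: str, table: str) -> str:
    """linked_resource format Data Catalog uses for BigQuery tables."""
    return (
        "//bigquery.googleapis.com/projects/"
        f"{project}/datasets/{dataset}/tables/{table}"
    )


def mirror_bq_table(project: str, dataset: str, table: str,
                    group_name: str, entry_id: str):
    from google.cloud import datacatalog_v1  # local import: sketch only

    client = datacatalog_v1.DataCatalogClient()

    # Look up the auto-created BigQuery Entry to read its metadata.
    source = client.lookup_entry(
        request={"linked_resource": bq_table_resource(project, dataset, table)}
    )

    # Copy what you need into a custom Entry; re-run this (e.g. on a
    # schedule) to keep the copy in step with the table.
    mirror = datacatalog_v1.Entry(
        display_name=source.display_name,
        user_specified_system="bigquery_mirror",
        user_specified_type="table",
        linked_resource=source.linked_resource,
        schema=source.schema,
        description=source.description,
    )
    return client.create_entry(
        parent=group_name, entry_id=entry_id, entry=mirror
    )
```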

I hope the above information is helpful.

 
