I am trying to ingest data from GCS into BigQuery. It works when I run
it locally, but when I put my function inside a Vertex AI component and
run it in a pipeline I get the above error. Moreover, the project it
states is not even my project ID. ...
Is there a way to add custom statistics to my data profile scan
according to my own needs? Or is there a way to add custom SQL
queries that produce those custom stats within the data profile
scan itself?
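One possible workaround while waiting for an answer (a sketch only; the table, column, and statistic choices below are illustrative assumptions, not part of any Dataplex feature): compute the custom statistics yourself with a plain BigQuery SQL query against the same table, and keep the results alongside the profile scan output.

```python
def custom_stats_query(table: str, column: str) -> str:
    """Build a BigQuery SQL query that computes a few custom statistics
    (row count, null count, quartiles) for one column of a table.
    The table and column names here are caller-supplied examples."""
    return f"""
    SELECT
      COUNT(*) AS row_count,
      COUNTIF({column} IS NULL) AS null_count,
      APPROX_QUANTILES({column}, 4) AS quartiles
    FROM `{table}`
    """.strip()

# The resulting query string could then be run with the BigQuery client,
# e.g. client.query(custom_stats_query("my-project.my_dataset.my_table",
# "price")), and the rows stored next to the profile scan results.
```

This does not plug the stats into the profile scan itself; it only produces equivalent numbers with your own SQL.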
I created a basic pipeline run using managed notebooks as well as
instances in Workbench, but even my basic pipeline couldn't run; it
failed with the error: The DAG failed because some tasks failed. The failed
tasks are: [concat].; Job (project_id = practi...
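For reference, a minimal sketch of the kind of "concat" pipeline described above (assumptions: KFP v2 SDK, and the component and pipeline names are illustrative; this is not the poster's actual code). The import is guarded so the plain-Python body can be read and run without the SDK installed.

```python
try:
    from kfp import dsl  # pip install kfp (KFP v2 SDK)
except ImportError:  # allow reading/running the body without kfp installed
    dsl = None

def concat(a: str, b: str) -> str:
    """Plain-Python body of the component; testable locally."""
    return a + b

if dsl is not None:
    # Wrap the function as a lightweight component and wire a one-task DAG.
    concat_component = dsl.component(concat)

    @dsl.pipeline(name="basic-concat-pipeline")
    def basic_pipeline(a: str = "hello ", b: str = "world"):
        concat_component(a=a, b=b)
```

When a task like [concat] fails in Vertex AI Pipelines, the task's own logs (not just the DAG-level message) usually carry the underlying cause.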
Yes mary99, it's still not possible to run this on the free trial with
the quotas you've been given; previously, the quota you highlighted was
8 for free trials as well.
But when I ran it outside of a Vertex AI pipeline component, it worked
even without specifying the project context. That's what made it
difficult to pin down, since it shouldn't have run in both cases.
That's my feeling, at least.
Hey kolban, the problem was resolved just by using
bigquery.Client(project=project). Earlier I didn't specify
(project=project), but it worked when not using Vertex AI. So I
guess we need to specify the project for BigQuery when using Vertex AI,
or so...
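A minimal sketch of that fix (assumptions: the dataset/table names and CSV load configuration below are illustrative). Inside a Vertex AI pipeline the ambient default project can resolve to the pipeline's service project rather than your own, so the project ID is pinned on the client explicitly. The import is guarded so the helper stays importable without the library.

```python
try:
    from google.cloud import bigquery  # pip install google-cloud-bigquery
except ImportError:  # keep the sketch importable without the library
    bigquery = None

def table_ref(project: str, dataset: str, table: str) -> str:
    """Fully-qualified BigQuery table ID, e.g. 'proj.dataset.table'."""
    return f"{project}.{dataset}.{table}"

def load_gcs_to_bq(project: str, dataset: str, table: str, gcs_uri: str):
    """Load a CSV from GCS into BigQuery, pinning the project explicitly."""
    client = bigquery.Client(project=project)  # the fix: explicit project
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        autodetect=True,
    )
    job = client.load_table_from_uri(
        gcs_uri, table_ref(project, dataset, table), job_config=job_config
    )
    job.result()  # block until the load job completes
```

Without the explicit `project=` argument, the client falls back to whatever default project the runtime environment advertises, which in a pipeline is often not yours.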
Hey, I am facing this error: Invalid table-valued function EXTERNAL_QUERY
Failed to connect to MySQL database. Error: MysqlErrorCode(2059):
Authentication plugin 'mysql_clear_password' cannot be loaded:
/usr/lib/plugin/mysql_clear_password.so: cannot ...