Influence Looker’s Roadmap and join the BI Customer Council (BICC)
Would you like to influence Google Cloud’s BI roadmap more? I’d like to invite you to join the Google Cloud BI...
I have a table with a daily metric, and now I want to add aggregate awareness to have weekly metrics handy. Be...
The following setting has to be turned off in the GitLab repo in order to enable Dataform’s workspaces git integra...
Hi Dataform Community, I'm encountering issues dynamically passing variables (specifically dataset names) from ...
Hello, I'm just thinking about how to correctly present the data I got from this BQ extract. Especially I mean table publ...
Hello @ms4446, I have installed Dataform on my local machine (WSL) and ran the dataform deployments to GCP, ever...
Hello, I have created a Datastream from MySQL to BigQuery, and made the table in BigQuery partitioned. It looks like i...
Hi Team, I want to push data from BigQuery to Cloud SQL in batch mode. Please suggest the simplest way to d...
@pgstorm148 I am using the composer-2 [ composer-2.8.5-airflow-2.7.3 ] version, in which I am running min= 2 schedu...
Trying to migrate data from BigQuery to an S3 bucket using AWS Glue, but not able to configure the connection pro...
Hi everyone, I'm new to Dataform technology and I need your support to understand if I'm doing something wrong....
Hi Community folks, I've got some complex logic in a definitions/javascript file that iterates through a list of ...
I have two queries: 1. How can I add multiple YouTube channels' data at once in Looker Studio? 2. How can I add YouTube...
Hi there, I am looking for some advice here. I thought of the following solutions for exporting GA4 raw data fr...
Hi Everyone, I am using Data Fusion to extract data which is loaded into BigQuery. From the source table, the ...
Hello GCP community, I am using BigQuery on-demand pricing, and I want to find details about jobs and their cos...
To make the question simple: I created a dataset called test_dataset, and I can see it inside BigQuery after I cr...
I am running a PySpark streaming job on Google Cloud Dataproc to read data from Pub/Sub and write it to BigQue...
Issue: I'm getting an execution failure near the end of my scheduled Dataform workflow from time to time, and I...
Hello, I'm migrating some Python workloads from Data Catalog to Dataplex Catalog and I'm struggling with autom...
Hi there, We are currently evaluating how to migrate our analytics setup from Scheduled Queries to a Dataform r...
I have created a calculated field and a parameter in Looker Studio to filter my data based on options 12 mont...
Hi, I am using the Multitable Database Table plugin to load tables from Oracle to BigQuery. The pipeline works ...
I have noticed for a few days that I am not able to UPGRADE the Composer 2 image version. During my investigation I fo...
I am currently developing a function to retrieve data from a BigQuery table on a local server set up using Go....
Hi, I'm having trouble importing configurations into my sqlx. I created the dataform_cofig.json file {"database"...
Hi Team, I am planning to use the key inputs feature in Looker Studio Pro. I want to store this value in some place....
Dear Team, Any inputs on the below would be highly appreciated. We have set the below things in our scanscope in...
Hello. I have a case where I need to filter data using a view and a partition column. Table: CREATE TABLE `testin...
Mirror mirror on the wall, What do I do with my API sprawl, Where can I see them all, How to make sure they listen to my call, And to ensure their compliance doesn’t fall. API Hub is the one to rule them all!
We are thrilled to announce that Google is the first hyperscaler to provide selected customers with access to Apache Airflow 3 on our fully managed Cloud Composer 3 service. Discover how powerful new features like DAG Versioning, a modern React-based UI, and scheduler-managed backfills can help you innovate faster and more reliably.
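As a quick illustration of what a pipeline on this stack can look like, here is a minimal TaskFlow-style DAG sketch. The DAG id, schedule, and task bodies are hypothetical placeholders, and it assumes the familiar `airflow.decorators` authoring interface is available in your Airflow 3 environment (Airflow 3 also exposes the newer `airflow.sdk` module for DAG authors).

```python
# Minimal, illustrative DAG for a Cloud Composer 3 / Airflow 3 environment.
# The DAG id, schedule, and task logic are placeholders, not a prescribed setup.
import pendulum
from airflow.decorators import dag, task  # assumed authoring API; Airflow 3 also offers airflow.sdk


@dag(
    dag_id="daily_metrics_rollup",  # hypothetical DAG id
    schedule="@daily",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,  # historical runs can instead be launched as scheduler-managed backfills
    tags=["example"],
)
def daily_metrics_rollup():
    @task
    def extract() -> list[int]:
        # Placeholder for reading from a source system.
        return [1, 2, 3]

    @task
    def load(rows: list[int]) -> None:
        # Placeholder for writing to a warehouse such as BigQuery.
        print(f"loaded {len(rows)} rows")

    load(extract())


daily_metrics_rollup()
```

With DAG Versioning, each run of a file like this is associated with the DAG version it executed against, and backfills for it can be started from the UI or API and run by the scheduler rather than from an ad hoc CLI process.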
You can manage the order of automatic cluster upgrades across Google Kubernetes Engine (GKE) clusters in multiple environments using rollout sequencing. For example, you can qualify a new version in pre-production clusters before upgrading production clusters.