Automating Data Transfer from Source to Destination Table Based on Project ID

  1. I have a Source Table (read-only) with dynamic data, including columns for Category (ENUM) and Business Unit (ENUM_LIST).
  2. In my Project Table, I create new projects, each associated with a specific Category and Business Unit.
  3. I need to transfer data from the Source Table to a Destination Table based on the Project. 

The Source Table has a Project ID column that references the Project Table.

The goal is to automatically transfer relevant data from the Source Table to the Destination Table when a new project is created in the Project Table. Since the Source Table can contain a large number of entries for a given Category and Business Unit, I need to perform this data transfer in batches.

My question: What would be the most efficient row filter condition in my bot event to achieve this? Specifically, I need help configuring the bot to:

  1. Trigger when a new project is created in the Project Table.
  2. Identify the corresponding records in the Source Table based on the Project ID.
  3. Transfer those records to the Destination Table in batches, given the potentially large number of records in the Source Table.

I'm specifically looking for a solution using AppSheet Automation features.
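To make this concrete, the kind of row filter I'm imagining for the bot's data step looks something like the sketch below. The table and column names are from my setup above, but `[Source ID]` and the batch size of 500 are placeholders, and I'm not sure `TOP()` is the right way to batch:

```
TOP(
  FILTER(
    "Source Table",
    AND(
      [Project ID] = [_THISROW].[Project ID],
      NOT(IN([Source ID], Destination Table[Source ID]))
    )
  ),
  500
)
```

The idea would be that the bot fires on Adds to the Project Table, and each run copies up to 500 matching Source rows that aren't already in the Destination Table.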



@Keshava_p wrote:

The goal is to automatically transfer relevant data from the Source Table to the Destination Table when a new project is created in the Project Table. Since the Source Table can contain a large number of entries


I'd hate for you to go down a path that may not be required. AppSheet can handle a respectably large number of rows efficiently, even more so when the table is read-only. It would be most efficient if the app could populate the sourced data as each Project is created.

How large is this table - rows and columns?
Are ALL rows considered "active" or are there inactive/no longer needed records that can be filtered out?  About how many rows are considered active?

To provide some context, I have an app that is nearing 100,000 rows in one table, with at least a dozen other tables in the tens of thousands of rows. It runs on Google Sheets and still performs exceptionally well for Syncs and overall use.

 

The Source table (BigQuery) has 17 columns and 120,000 rows, and it may grow. Every row is considered active because the data regenerates and is updated each month, so before the source data is refreshed, the end user needs to provide inputs for every row.

You might find this video helpful

It shows how to move data from one table to another, both client side (using actions) and server side (using automation).
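As a rough sketch of the server-side route (names are placeholders; adjust to your schema): create a bot on the Project Table with a Data Change event set to Adds only, and a process step of type "Run a data action" → "Add a row to another table for each row in a table", with referenced rows along the lines of:

```
FILTER("Source Table", [Project ID] = [_THISROW].[Project ID])
```

The client-side equivalent would be an action of type "add a new row to another table using values from this row" on the Source Table, invoked for each matching row via an "execute an action on a set of rows" action.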

Hope it helps!  Happy apping!
