Most efficient way to move rows from preview table to permanent table

Summary

One of my app's functions involves regularly moving thousands of rows from one table to another (copy from Table A to Table B, then delete from Table A). I wouldn't be surprised to learn that this design is suboptimal for my use case to begin with, but it's the best I've come up with (if interested, details are in the full-story explanation below). However, the way I've implemented it takes a really long time for the user: each individual row addition or deletion seems to be a separate change that must be synced from the local app to the server. Is there a more efficient way to move that data? Maybe changing just one value in the app could trigger an automation to perform all the data moving server-side--or would that incur the same sync processing time, just in the other direction? Or maybe this is what the AppSheet API is for? (I've sketched my rough understanding of that option after the action outline below.)

Each table's data source is a worksheet in the same Google Sheets file. My action implementation is essentially:

  • Grouped: execute a sequence of actions - from the parent table, executes the following actions
    1. Data: execute an action on a set of rows - executes the following on the child table
      • Data: add a new row to another table by using values from this row - copies the rows from Table A to Table B
    2. Data: execute an action on a set of rows - executes the following on the child table
      • Data: delete this row - deletes the rows from Table A
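
Regarding the API idea above: as I understand it, the AppSheet API's table Action endpoint accepts a whole batch of rows per request, so the copy-then-delete could collapse into two server-side calls instead of one synced change per row. Here's a minimal, untested sketch in Python; the app ID, access key, table names ("PreviewTable" and "PermanentTable" standing in for Table A and Table B), and the "Row ID" key column are all placeholders, not my actual schema:

```python
import requests

# Untested sketch. APP_ID, ACCESS_KEY, the table names, and the
# "Row ID" key column below are placeholders, not my real schema.
APP_ID = "your-app-id"
ACCESS_KEY = "your-application-access-key"
BASE = f"https://api.appsheet.com/api/v2/apps/{APP_ID}/tables"
HEADERS = {"ApplicationAccessKey": ACCESS_KEY}

def move_rows(rows: list[dict]) -> None:
    """Copy a batch of rows to the permanent table, then delete them
    from the preview table: two requests total, not one per row."""
    # 1. Add the whole batch to the permanent table in one call.
    resp = requests.post(
        f"{BASE}/PermanentTable/Action",
        headers=HEADERS,
        json={
            "Action": "Add",
            "Properties": {"Locale": "en-US"},
            "Rows": rows,  # list of {column name: value} dicts
        },
    )
    resp.raise_for_status()

    # 2. Delete the same batch from the preview table, identifying
    #    each row by the table's key column.
    resp = requests.post(
        f"{BASE}/PreviewTable/Action",
        headers=HEADERS,
        json={
            "Action": "Delete",
            "Properties": {"Locale": "en-US"},
            "Rows": [{"Row ID": r["Row ID"]} for r in rows],
        },
    )
    resp.raise_for_status()
```

If that's a sound use of the API, I imagine I could invoke it from an automation (e.g., a bot's webhook task) when the user confirms the import, so the whole move happens server-side.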

As always, I appreciate any guidance.

Full story

Here's the context of my use case, in case anyone wants to dig in and advise how I might approach this differently.

One of my app's views includes a function, invoked from a parent table, that enables importing a .csv file to add records to a child table. It's somewhat complex under the hood (leveraging nifty techniques explained by @TeeSee1, @Kirk_Masden, @MultiTech, and others) and, unfortunately, the user has to perform the import in two steps:

  1. A "prepare for import" action that sets up data so the parent ID can be populated in the child records via the initial value of the child table's ref column to the parent ID
  2. An "import" action that imports the .csv and does some other data clean-up.

Nonetheless, the system-processing side has been efficient from the user's perspective--i.e., the import itself processed very quickly.

However, as my app has begun to be used more, I've observed some points of fragility in my approach. Multiple users have made errors in the upload file, resulting in unintentionally blank columns. It also occurred to me that, because I import directly into the table that is the data's final destination, all the columns not intended to be populated via .csv import are vulnerable to intentional or inadvertent corruption if they happen to be included in the .csv file. It would be great to manage this via properties that provide more control over the CSV import action, but that's not available.

To address these issues, I implemented a technique @TeeSee1 had recommended from the start: import the .csv into a preview table before copying the data to the permanent table. That enables me to accomplish the following:

  • limit the import to only the columns I want to be importable
  • easily display the imported data as read-only so the user can preview it and catch errors like empty columns before choosing one of the following actions:
    • Confirm import - this is where I copy rows from the preview table to the permanent table, and then delete the rows from the preview table
    • Discard import - this just deletes the rows from the preview table

Problematically, though, this revamped approach has increased the app's processing time for the import by many orders of magnitude.
