Bulk Update Issue for Weighted Score Calculation in AppSheet

I have a table with around 1 million rows, and I need to calculate and update the weighted score based on project category creation. For a single project, the maximum number of rows will be 100,000.

I have three tables involved:

1. User Table – End users provide ratings for each row based on material.

2. Total Impact Table – Admin calculates the total impact for each relevant column.

3. Weighted Score Table – This table contains user input columns and total impact columns. The weighted score is calculated based on these two inputs.

For example, I am using the following formula in the Weighted Score column for Supplier Spend:

IF(
  [SUPPLIER SPEND] = "error", "",
  IF(
    ISNOTBLANK([SUPPLIER SPEND]),
    SWITCH(TRUE,
      [SUPPLIER SPEND] <= 5, 1,
      [SUPPLIER SPEND] <= 10, 2,
      [SUPPLIER SPEND] <= 15, 3,
      [SUPPLIER SPEND] <= 20, 4,
      5
    ) * [101_IMPACT SUPPLIER SPEND],
    "n/a"
  )
)
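In plain terms, the expression buckets the spend into 1–5 and multiplies by the impact factor. Here is the same logic as a short Python sketch (column names are taken from the formula above; nothing here is AppSheet-specific):

```python
def weighted_score(supplier_spend, impact_supplier_spend):
    """Mirror of the AppSheet expression: bucket SUPPLIER SPEND into
    1-5 bands, then multiply by 101_IMPACT SUPPLIER SPEND."""
    if supplier_spend == "error":
        return ""
    if supplier_spend is None or supplier_spend == "":
        return "n/a"
    if supplier_spend <= 5:
        bucket = 1
    elif supplier_spend <= 10:
        bucket = 2
    elif supplier_spend <= 15:
        bucket = 3
    elif supplier_spend <= 20:
        bucket = 4
    else:
        bucket = 5
    return bucket * impact_supplier_spend
```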

The Issue:

The values in the Weighted Score column only update when I open a row and click the edit button. However, I need to apply this calculation to all 100,000 rows at once whenever there is a relevant change in data.

What I Have Tried:

I have tried using App Formulas and actions.

I don't want to use a virtual column for this because it significantly impacts performance.

My Question:

Is there a way in AppSheet to update all rows at once without requiring manual edits? Any suggestions or best practices would be greatly appreciated!


Given the amount of data you describe (up to 100,000 rows), there isn't anything on the AppSheet side that will re-compute all of these rows at once efficiently enough. You would either experience a non-responsive UI or automation timeouts.

So, you need to find an alternative approach.

If you are using a database, you could use some form of db processing or associated tool to perform the re-calcs in the database.  Then rely on Syncs to bring back the new values.
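For example, if the backend is a SQL database, a single set-based UPDATE can recalculate every row in one statement, which the next sync then brings back into the app. A minimal sketch using SQLite (the table and column names here are hypothetical, modeled on the formula in the question):

```python
import sqlite3

# Hypothetical schema modeled on the question's Weighted Score table.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE weighted_score (
    id INTEGER PRIMARY KEY,
    supplier_spend REAL,
    impact_supplier_spend REAL,
    score REAL)""")
conn.executemany(
    "INSERT INTO weighted_score VALUES (?, ?, ?, NULL)",
    [(1, 4, 10), (2, 12, 10), (3, 25, 2)],
)

# One set-based UPDATE recalculates every row on the database side;
# AppSheet picks up the new values on the next sync.
conn.execute("""
    UPDATE weighted_score
    SET score = impact_supplier_spend * CASE
        WHEN supplier_spend <= 5  THEN 1
        WHEN supplier_spend <= 10 THEN 2
        WHEN supplier_spend <= 15 THEN 3
        WHEN supplier_spend <= 20 THEN 4
        ELSE 5 END
""")
conn.commit()
```

A statement like this runs in seconds even over 100,000 rows, because the work never leaves the database.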

Similarly, you could use some off platform tools, such as scripts, to recalc the values.
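As one sketch of that approach: a script can push batched row updates through AppSheet's REST API "Edit" action. The endpoint and header below follow AppSheet's documented API shape, but the table name, key column, and row contents are placeholders you would replace with your own:

```python
import json
import urllib.request

API_URL = "https://api.appsheet.com/api/v2/apps/{app_id}/tables/{table}/Action"

def build_edit_payload(rows):
    """Build the request body for an AppSheet API 'Edit' action.
    Each row dict must include the key column plus the columns to update."""
    return {
        "Action": "Edit",
        "Properties": {"Locale": "en-US"},
        "Rows": rows,
    }

def post_edits(app_id, access_key, table, rows):
    """POST a batch of row edits to the AppSheet REST API."""
    url = API_URL.format(app_id=app_id, table=table)
    body = json.dumps(build_edit_payload(rows)).encode()
    req = urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            "ApplicationAccessKey": access_key,
        },
    )
    return urllib.request.urlopen(req)

# Example payload: placeholder key and column names.
payload = build_edit_payload([{"ID": "r1", "WEIGHTED SCORE": 30}])
```

You would compute the scores in the script and send the updates in chunks rather than all 100,000 at once.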

Another option is re-calculating as needed. I don't know your app, but I would assume the views are designed to show smaller sets of rows. The app could use a custom navigation Group action that re-calcs the rows about to be displayed and then navigates to the view.

A more drastic solution would be a data-structure re-design. It's possible that you don't have the best data structure in place to accommodate the app's re-calculation needs. This can happen as an app is expanded and new features are requested.

I hope this helps!

Very well said! What a great answer. 

But you don't think you could use automation for this? 🤔

  • I think you could 
  • Though depending on the situation it might be tricky to get it to go through. 

Take your formulas and put them inside an automation data change task, and try running it through there. Whenever I need to do mass updates, automation is one way to go (because everything runs on the server in parallel).

I have a video where I do something kind of like this. In the video scenario I'm moving values from one table to another, but the high-level situation is the same: I'm using automation to run an update on a bunch of records.

https://youtu.be/OY6UWen2XlM

Hope it helps!

As I was working, I came across an implementation that does this, so I thought I would show it:

Scenario:

  • We need to run a weekly processing update on each active patient in the system
  • This update involves:
    • Triggering a complex AI processing workflow (handled externally)
    • Then updating a QuickChart.io graph for some data points

---

  1. We've got a slice that holds all the patients that need to be updated
    MultiTech_0-1742484962308.png
  2. We created a reusable process for what we need to do
    MultiTech_1-1742485033680.png
  3. We then created 4 automations (one at each 15 minute mark) that run this process (if there are things to run)
    MultiTech_2-1742485091043.png
    MultiTech_3-1742485126418.png

     

  4. This then runs a couple of internal API calls
    1. One to send off the next 40 patients for a chain run
      • It does this by creating a record in another table (weekly analysis)
        MultiTech_4-1742485232448.png
        MultiTech_5-1742485277511.png
      • This takes the top 40 records and creates a record in the weekly analysis table for them
      • This then kicks off an automation over there, to do whatever needs to happen for that patient
        MultiTech_9-1742485808972.png
        MultiTech_10-1742485867559.png
      • This way everything is running in parallel 😉
        • More info about the AI processing platform I built coming soon! 👀

    2. Another to update the graph for the next 40 patients
      MultiTech_6-1742485393550.png
      MultiTech_7-1742485445900.png

       

      • You can see here I'm using a manual automation trigger to run a parallel operation of all 40 of the next patients
        MultiTech_8-1742485509997.png

         

      • This clears the trigger;
      • Runs a script to build out the quickchart.io code we need (it's a really complex chart we're building);
      • Then updates the patient record with the new value

We chose to limit things to 40 for stability, so that we could more easily track things in real time; we could easily remove this limit and just let it run for everything (the most it would run at a time would be 200 or so). We chose not to, because our users like to watch things process. (^_^)

I would limit your setup to running a small number of things at first; you can increase this number slowly to find the point where it becomes unstable, then build around that.
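The "next 40" batching described above reduces to a simple take-and-requeue loop. A minimal sketch (the batch size is configurable, matching the advice to start small and tune upward):

```python
def next_batch(pending_ids, batch_size=40):
    """Take the next batch of pending record IDs, mirroring the
    'top 40 records' step described above."""
    batch = pending_ids[:batch_size]
    remaining = pending_ids[batch_size:]
    return batch, remaining

# Example: 100 pending records processed 40 at a time.
pending = [f"patient-{i}" for i in range(100)]
batches = []
while pending:
    batch, pending = next_batch(pending)
    batches.append(batch)
```

Each batch would then be handed to an automation (or an API call) that fans the records out in parallel.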
