Best Practice for Data Recovery from "Recovery Mode" text files?

I recently had a user of an app manage to accrue 444 pending syncs on their app that had recently had a data schema change.

In an attempt to help this user avoid re-entering nearly 12 hours of work, I moved the app into “Recovery Mode”. According to the documentation, this should ignore sync errors for the user and send all pending data in a Recovery.txt file to the app owner’s file storage.

Recovery Mode did not allow the user’s app to sync, so we went the route of “Reset Changes”, which also creates a recovery file. However, in this case it created 444 recovery files, each containing the data for a single sync operation!

The files are JSON, which can be converted more or less easily to CSV / tabular format, or moved to the SQL database via Apps Script or something. But what I’m stuck on is how to process 444 individual files.

Anyone have any creative ideas to collate and parse the files? Maybe a Python boilerplate I can tinker with?

@Bellave_Jayaram @MultiTech_Visions @Grant_Stead @Jonathon @Derek @QREW_Ben @CJ_QREW

Solved

ACCEPTED SOLUTION

For the second half of the fix, I wrote a couple of JavaScript functions to parse the JSON data in the collated file and extract only the row data for each table that had sync data. I ran these functions in the Chrome browser console (since you can execute JS there), copied the console log output into json2table.com to get it into tabular format, and then pasted the result into a spreadsheet to clean for entry into the DB.

Interesting note: It turns out there were only 200 files (200 rows) in the Recovery File, not 444. I wonder if AppSheet caps the Recovery Data process at 200 sync items?

@praveen are you aware of any limit on the recovery file creation?

I suppose I could have written a few more lines of JS to just generate the HTML table, or even run the whole thing in Apps Script and used JDBC to go right into the DB… Next time
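For anyone who wants that extra step, here’s a minimal sketch of those “few more lines” — building an HTML table string from an array of row objects instead of round-tripping through json2table.com. It assumes every row shares the first row’s columns:

```javascript
// Build an HTML table string from an array of plain row objects.
function rowsToHtmlTable(rows) {
    if (rows.length === 0) return "<table></table>";
    // Assumption: all rows have the same columns as the first row.
    var headers = Object.keys(rows[0]);
    var head = "<tr>" + headers.map(h => "<th>" + h + "</th>").join("") + "</tr>";
    var body = rows.map(r =>
        "<tr>" + headers.map(h => "<td>" + (r[h] ?? "") + "</td>").join("") + "</tr>"
    ).join("");
    return "<table>" + head + body + "</table>";
}

console.log(rowsToHtmlTable([
    { ID: "1", Name: "Widget" },
    { ID: "2", Name: "Gadget" }
]));
```

You could paste the logged string into a .html file and open it, or copy the rendered table straight into a spreadsheet.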

Here’s the code for anyone who has the same issue.

function getTables(data) {
    // Collect the distinct table names present in the sync data.
    var tableNames = [];
    for (var i = 0; i < data.length; i++) {
        tableNames.push(data[i].tableName);
    }
    var distinctTableNames = [...new Set(tableNames)];
    console.log(distinctTableNames);
    return distinctTableNames;
}

function extractRows(data) {
    var tables = getTables(data);
    var numTables = tables.length;
    var betterData = [];
    for (var j = 0; j < numTables; j++) {
        var thisTable = tables[j];
        console.log("looping through this table's rows: " + thisTable);
        // Keep only the sync entries belonging to the current table.
        var filterData = data.filter(table => table.tableName === thisTable);
        betterData.push([]);
        console.log(`${thisTable} has ${filterData.length} rows`);
        for (var i = 0; i < filterData.length; i++) {
            betterData[j].push(filterData[i].row);
        }
        console.log(JSON.stringify(betterData[j]));
    }
    console.log(`Looped through data and extracted rows for ${numTables} tables: ${tables}`);
    return betterData; // return the per-table row arrays instead of just true
}

