Hey AppSheet Community,
I recently ran into a frustrating issue with AppSheet's "Call a Script" feature, where any script execution lasting more than 60 seconds was being canceled and restarted; this became a bottleneck for my platform that relies on long-running processes.
To work around this limitation, I built a system using Google Apps Script to run processes every 5 minutes without relying on AppSheet's built-in automation. Here's the breakdown of how I solved it, and I've provided the functions I used to help anyone facing similar issues.
1. Get and Cache Data
First, we need to gather the data that needs to be processed from the data source, then put it into the cache for the other functions to pick up. The easiest place to get this is directly from the app (via the API), but if you don't have an Enterprise license you won't have access to that; so I'll assume you've got your data in a Google Sheet and work from there.
function getRecordsToRun() {
var sheet = SpreadsheetApp.openById('YOUR_SHEET_ID').getSheetByName('SHEET_NAME');
var data = sheet.getDataRange().getValues();
var headers = data.shift(); // Extract column headers
var triggerIndex = headers.indexOf('Automation_Trigger');
var timestampIndex = headers.indexOf('TIMESTAMP_COLUMN');
var trigger_statuses = ["make_pdf", "send_email"]; // The trigger values your app writes to Automation_Trigger; match these to the switch cases in processRecord() below
var records_to_process = data
.filter(row => trigger_statuses.includes(row[triggerIndex]))
.sort((a, b) => new Date(a[timestampIndex]) - new Date(b[timestampIndex]))
.slice(0, 20)
.map(row => {
var row_object = {};
headers.forEach((header, i) => {
row_object[header] = row[i]; // Map each header to its corresponding row value
});
return row_object;
});
// Store data using the cache-splitting function
putCacheWithSplit('records_to_process', records_to_process, 300); // Cache for 5 minutes
Logger.log("Putting up the following records for a run:\n- " + records_to_process.map(record => record['ID_COLUMN']).join(", "));
let number_to_process = records_to_process.length;
if(number_to_process > 0) {
scheduleTriggers(number_to_process); // Creates the processRecord_ triggers (see section 4 below)
}
deleteUnnecessaryTriggers(number_to_process);
}
2. Caching Utility Scripts
To make things work smoothly, you'll need a few utility scripts.
function putCacheWithSplit(cacheKey, data, expirationSeconds) {
var cache = CacheService.getUserCache();
var jsonData = JSON.stringify(data);
try {
// Try to cache data as a single entry
cache.put(cacheKey, jsonData, expirationSeconds);
} catch (e) {
if (e.message.includes("Argument too large")) {
// If the data is too large, split it
splitAndCache(cacheKey, jsonData, expirationSeconds, cache);
} else {
throw e; // Re-throw any other errors
}
}
}
//~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
function splitAndCache(cacheKey, jsonData, expirationSeconds, cache) {
var maxLength = 100000; // Approximate max size for a single cache entry
var parts = Math.ceil(jsonData.length / maxLength); // Calculate how many parts are needed
for (var i = 0; i < parts; i++) {
var partKey = cacheKey + "_part" + i;
var partData = jsonData.substring(i * maxLength, (i + 1) * maxLength);
cache.put(partKey, partData, expirationSeconds);
}
// Store the number of parts and mark it as split
cache.put(cacheKey + "_parts", parts.toString(), expirationSeconds);
}
//~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
function getCacheWithSplit(cacheKey, cache) {
if (!cache) {
cache = CacheService.getUserCache();
}
// Check if the data was split
var parts = cache.get(cacheKey + "_parts");
if (parts) {
// If split, reassemble the parts
var jsonData = '';
for (var i = 0; i < parseInt(parts); i++) {
var partKey = cacheKey + "_part" + i;
var partData = cache.get(partKey);
if (partData) {
jsonData += partData;
} else {
return null; // Part is missing, return null
}
}
return JSON.parse(jsonData);
} else {
// If not split, return the single cache entry
var cachedData = cache.get(cacheKey);
return cachedData ? JSON.parse(cachedData) : null;
}
}
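If you want to sanity-check the split-and-reassemble logic before wiring it into the rest of the system, a small throwaway test like this works (the key name and dummy data here are just for illustration):
function testCacheSplit() {
  var dummy_records = [];
  for (var i = 0; i < 5000; i++) {
    dummy_records.push({ ID_COLUMN: 'rec_' + i, Automation_Trigger: 'make_pdf' });
  }
  // Roughly 250KB+ of JSON, which is large enough to force the split path
  putCacheWithSplit('test_records', dummy_records, 300);
  var round_trip = getCacheWithSplit('test_records');
  Logger.log("Round trip OK: " + (round_trip !== null && round_trip.length === dummy_records.length));
}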
3. Actual Processing Functions
Now you'll need to create a set of functions that serve as the entry points for triggering your actual work. Each one checks the cache for the record it's responsible for, and if it finds one, runs the appropriate function for that record.
function processRecord(index) {
var records_to_process = getCacheWithSplit('records_to_process');
if (records_to_process && records_to_process[index]) {
var actual_record = records_to_process[index]; // This is an object with column name/value pairs
var record_ID = actual_record['ID_COLUMN']; // Access values using the column names
var automationTrigger = actual_record['Automation_Trigger']; // Get the trigger word from the column
// Call the appropriate function based on the Automation_Trigger value
switch (automationTrigger) {
case 'TriggerA':
my_function_A(record_ID);
break;
case 'TriggerB':
my_function_B(record_ID);
break;
case 'TriggerC':
my_function_C(record_ID);
break;
default:
Logger.log("Unknown Automation_Trigger: " + automationTrigger + " for record ID: " + record_ID);
break;
}
} else {
Logger.log("No run data for index: " + index);
}
}
//----------------------------------------------------------------------------------
function processRecord_0() {
let index = 0;
processRecord(index);
}
function processRecord_1() {
let index = 1;
processRecord(index);
}
etc... (continue the pattern up to processRecord_19)
Why 20?
Google Apps Script has a limit of 30 "simultaneous executions per user" - meaning per account. So you can only have 30 executions running at any one moment under your account; you could squeeze out more by running the same script under a different account, but practically we're limited to 30 at a time - so capping each batch at 20 leaves some headroom under that ceiling.
___________________________________________________________________________________________________
4. Creating Timed Triggers
Now that you've got the scripts put together and everything is ready to rock, we need to create the triggers for each of the functions. Since there are so many of them, and they're all named essentially the same thing with just a different number, we can create them programmatically. Here are a couple of functions you can use to create and delete the triggers for the main processing functions:
function scheduleTriggers(totalRecords) {
if (!totalRecords) {
totalRecords = 20; // Default to 20 if not provided
}
// Check if 'getRecordsToRun' trigger exists
var existingTriggers = ScriptApp.getProjectTriggers();
var getRecordsTriggerExists = existingTriggers.some(trigger => trigger.getHandlerFunction() === 'getRecordsToRun');
// Only create 'getRecordsToRun' trigger if it doesn't exist
if (!getRecordsTriggerExists) {
ScriptApp.newTrigger('getRecordsToRun')
.timeBased()
.everyMinutes(5) // Set to run every 5 minutes
.create();
Logger.log("Created 'getRecordsToRun' trigger.");
// Wait time to ensure data retrieval script finishes before the processing triggers start
let wait_time_seconds = 10; // Specify the wait time in seconds
Utilities.sleep(wait_time_seconds * 1000); // Pause execution for the specified time
} else {
Logger.log("'getRecordsToRun' trigger already exists.");
}
var triggers_made = [];
// Create 'processRecord_' triggers up to the number of required records
for (var i = 0; i < totalRecords; i++) {
let functionName = 'processRecord_' + i;
// Check if a trigger for this process already exists
if (!existingTriggers.some(trigger => trigger.getHandlerFunction() === functionName)) {
ScriptApp.newTrigger(functionName)
.timeBased()
.everyMinutes(5) // Set to run every 5 minutes
.create();
triggers_made.push(functionName);
}
}
Logger.log("Created trigger(s) for: " + triggers_made.join(", "));
}
//~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
function deleteUnnecessaryTriggers(totalRecords) {
if (!totalRecords) {
totalRecords = 0; // Default to 0 if not provided
}
// Get all existing triggers
var triggers = ScriptApp.getProjectTriggers();
var removed_triggers = [];
triggers.forEach(function(trigger) {
var functionName = trigger.getHandlerFunction();
// Delete unnecessary 'processRecord_' triggers beyond the required number
if (functionName.startsWith('processRecord_')) {
var index = parseInt(functionName.replace('processRecord_', ''), 10);
// If the index is beyond the total number of records, delete the trigger
if (index >= totalRecords) {
ScriptApp.deleteTrigger(trigger);
removed_triggers.push(functionName);
}
}
if (totalRecords == 0 && functionName === 'getRecordsToRun') {
ScriptApp.deleteTrigger(trigger);
removed_triggers.push("getRecordsToRun");
}
});
Logger.log("Deleted triggers: " + removed_triggers.join(", "));
}
===================================================================================
This system will now begin monitoring your Google Sheet for anything that needs to be triggered; when it finds something, it puts that data into the cache, which the processing functions then pick up and act on.
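One note on getting it started: the very first 'getRecordsToRun' trigger has to come from somewhere, so run scheduleTriggers() once by hand from the script editor. If you ever want to shut the whole thing down, deleteUnnecessaryTriggers(0) removes every trigger, including 'getRecordsToRun'. A couple of optional wrappers (purely my own convenience, not required by anything above) make that explicit:
function startAutomation() {
  scheduleTriggers(); // Installs 'getRecordsToRun' plus the default 20 processRecord_ triggers
}
function stopAutomation() {
  deleteUnnecessaryTriggers(0); // Removes every processRecord_ trigger and 'getRecordsToRun'
}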
Hope it helps!
Thank you @MultiTech for a nice useful tip.
Thanks for your update, it's very helpful.
I have a list of case codes and a process that generates PDF forms based on the information from that list. However, the time limit is not enough to generate all the PDF files for the list. Please suggest the best way to do this. Thanks!
When you start running into issues where the script process takes over 6 minutes, you definitely have a problem to deal with. The solution for me involves keeping track of the processing time, and when things approach that limit, caching everything and re-triggering another run. That way the execution that's about to time out can save its state temporarily, and another execution can pick up where it left off.
This means you need not only a way to cache things, but also to change your main script so that it pulls from the cache at the start (if it can) and continues where it left off - roughly along the lines of the sketch below.
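To make that concrete, here's a rough sketch of the shape it takes, reusing the putCacheWithSplit / getCacheWithSplit helpers from the post above. The function and key names (generateAllPdfs, getCaseCodesToProcess, makePdfForCaseCode, 'pending_case_codes') are placeholders for whatever you already have, not code from my actual project:
function generateAllPdfs() {
  var start_time = Date.now();
  var max_run_millis = 5 * 60 * 1000; // Stop roughly a minute before the 6-minute ceiling

  // Pick up where a previous execution left off, if it cached anything
  var pending = getCacheWithSplit('pending_case_codes');
  if (!pending || pending.length === 0) {
    pending = getCaseCodesToProcess(); // Placeholder: however you build the initial list of case codes
  }

  while (pending.length > 0) {
    if (Date.now() - start_time > max_run_millis) {
      // Running out of time: cache the remainder and hand off to a fresh execution.
      // Clear the split marker first so an older, larger cached copy can't be
      // reassembled by mistake on the next run.
      CacheService.getUserCache().remove('pending_case_codes_parts');
      putCacheWithSplit('pending_case_codes', pending, 1800); // Keep for 30 minutes
      ScriptApp.newTrigger('generateAllPdfs')
        .timeBased()
        .after(60 * 1000) // Fire again in about a minute
        .create(); // One-off triggers like this stick around afterwards, so clean them up periodically
      return;
    }
    var case_code = pending.shift();
    makePdfForCaseCode(case_code); // Placeholder: your existing PDF-generation routine
  }

  // Finished everything: clear the cached remainder so the next manual run starts fresh
  CacheService.getUserCache().remove('pending_case_codes_parts');
  CacheService.getUserCache().remove('pending_case_codes');
}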
Building this sort of system requires a lot of moving parts; thankfully, if you're persistent (and use AI to help), you can get around this limitation.