CSV Import proving impossible

I'm using AppSheet database tables, and am running into constant errors trying to import data into them. Since the database UI only allows pasting 500 rows at a time, I'm trying to use the CSV Import action.

The table in question is called EmailValidation and is very simple, having only 3 columns (not counting _RowNumber and Row ID, which are created automatically):

  • Email (Text) - this is the key
  • Result (Text)
  • Date (Date)

I have a CSV file with the correct headings - in fact, I exported a CSV first and then modified it (adding values to the Date column) to try to make sure of that.

When I try to import, I get the error:

Unable to find the specified row from AppSheet database

Troubleshooting I've tried:

  • Checked locale of table and CSV file - both are set to English (United Kingdom)
  • Removed duplicates from the table
  • Tried with and without '_RowNumber' and 'Row ID' columns
  • Opened CSV in Notepad, removed an empty row and saved as UTF-8
  • Unticked 'Require' on '_RowNumber' column
  • Unable to untick 'Require' on 'Row ID' column, but it has an initial value of UNIQUEID which I can't edit
  • Imported a smaller subset of 9 rows successfully

I'm out of ideas now. Any help?

Thanks in advance.


How many rows are you trying to import?

6754, but I also tried with 500 and got the same error. I'm also getting this sometimes:

'Invalid request provided to AppSheet database'

Have you tried with 400..499 rows?

Testing on a different table now, it worked with 499 and took 1:36 to complete.

Hi @alexIGC, are you trying to add new rows into AppSheet databases or write to existing rows? It seems like the system is having trouble finding the right row to write to. Another simpler approach would be to copy the entire CSV table into an AppSheet database if it's already connected as a table in your app. See here: https://support.google.com/appsheet/answer/10105821?hl=en&sjid=939279987211939572-NA#configure-table

This would be my preference, but the limit of pasting 500 rows at a time is problematic. In this case, where the CSV only contained ~6000 rows, it was manageable to do it in chunks. But I also have another table to import, which has many more rows. That's the main reason I'm trying to import from CSV.

I've also tried Copy to New Source, but this seems to copy the data into a random new AppSheet Database file, rather than the one I have created and connected. I don't think there's a way to copy the table to the correct database file after creating it, is there?

Do I need to proceed on the basis of CSV Import simply not being a workable feature?

I played with this functionality.
#1 - When using gSheet with 1000 rows, it takes just a few seconds to import; with 5k rows, it takes 4 secs.
#2 - When using ASDB with my own key column specified in the app editor, the import fails with the error "ErrorDescription: Unable to find the specified row from AppSheet database".
#3 - When changing the key column to "Row ID", it imports the 1000 rows within 5 seconds. With 5k rows, it takes 1:34.

Thanks for taking the time to test this out! Based on your results, would you agree with the following?

  1. CSV Import to gSheet is more robust and works much more quickly than import to ASDB.
  2. If running CSV Import to ASDB, Row ID should be used as the key column. Perhaps this only applies to importing to existing entries or 'large' datasets, as I think I had an import of 499 records work fine yesterday, but those might all have been new records.

All in all, it looks like ASDB won't work in my use case, which is a shame, as I really like the UI and the Ref column feature. I need to use a website domain name or email address as my key column to avoid duplicates.

One workaround: add a dummy gSheet to your app and import the data into that table. When the import is done, use a Bot/Webhook to copy those rows. If you don't have any virtual columns in your target table, it should do it quickly.
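For reference, here's roughly what the Body of that copy Bot's webhook could look like, using the AppSheet API's Add action against the target table. This is just a sketch, not tested: 'ImportStaging' is a made-up name for the dummy sheet table, the columns are the ones from this thread's EmailValidation table, and the <<Start:...>>/<<End>> loop (trailing comma included) follows the pattern shown in the AppSheet API docs:

  {
    "Action": "Add",
    "Properties": {
      "Locale": "en-GB"
    },
    "Rows": [
      <<Start: SELECT(ImportStaging[Email], TRUE)>>
      {
        "Email": "<<[Email]>>",
        "Result": "<<[Result]>>",
        "Date": "<<[Date]>>"
      },
      <<End>>
    ]
  }

The webhook URL would point at the target table's endpoint (https://api.appsheet.com/api/v2/apps/{appId}/tables/{tableName}/Action), with the app's ApplicationAccessKey passed in the request header.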

Interesting solution indeed, thank you. That might be enough for the short-term fix.

Extending the idea further, is there any way I could check for duplicates with such a bot and modify the behaviour accordingly? I.e., update records when they already exist, and add new ones when they don't?

I'm thinking that the dummy sheet can act as an import area, and every time a new row is added, the bot is triggered to process it into the main DB and then remove the row from the dummy sheet.

Create two Bots, where the webhook does an update or an add depending on the real data. You can write the filter into your JSON.

EDIT: And then delete all data from the dummy table with the second Bot.
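To sketch that filter idea (untested, reusing this thread's table and column names): Bot A could trigger on staging rows where IN([Email], EmailValidation[Email]) is TRUE and send "Action": "Edit", while Bot B uses NOT(IN([Email], EmailValidation[Email])) and sends "Action": "Add". The Edit body would look something like:

  {
    "Action": "Edit",
    "Properties": {
      "Locale": "en-GB"
    },
    "Rows": [
      {
        "Email": "<<[Email]>>",
        "Result": "<<[Result]>>",
        "Date": "<<[Date]>>"
      }
    ]
  }

Edit matches rows by the table's key column (Email here) and updates the remaining values, so the same body with "Action": "Add" covers the new-record case.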
