I’m excited to announce the public beta launch of our overhauled CSV import system (the one in the “Data” tab). This overhaul lets you import much, much larger files than before, and most of the work is now done on our servers, meaning you don’t have to keep the tab open while your data is processed!
The upload flow is similar to the old one: select the data type, select the file, and map your fields. Validation is now much faster; instead of checking the entire file, it checks only the first couple of rows to make sure the file looks correct. When you hit upload, your file is first uploaded to our servers and then analyzed in the background. Once that starts, you are free to close the page, and you will receive an email when the upload completes. If the upload fails (for instance, because your file is missing a quote on row 900), you will get an email telling you exactly which row it failed on.
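For anyone curious what a “first couple of rows” check amounts to, here is a minimal local sketch in Python. It is a rough approximation for illustration, not the platform’s actual validator; the function name, `expected_columns` list, and `sample_rows` count are all made up for the example.

```python
import csv

def quick_csv_check(path, expected_columns, sample_rows=5):
    """Sanity-check only the first few rows of a CSV before a large upload.

    A rough local approximation of a "first couple of rows" check, not the
    platform's actual validator; expected_columns and sample_rows are
    illustrative parameters.
    """
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        header = next(reader)
        missing = [c for c in expected_columns if c not in header]
        if missing:
            return f"Header is missing expected columns: {missing}"
        for i, row in enumerate(reader, start=2):  # row 1 is the header
            if len(row) != len(header):
                return f"Row {i} has {len(row)} fields, expected {len(header)}"
            if i - 1 >= sample_rows:
                break
    return "First rows look OK (the full file is still processed server-side)"

# Example: print(quick_csv_check("users.csv", ["name", "email"]))
```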
Here are some things to note about the new system:
You can only have one upload in progress at a time per application. This includes “Modify data via CSV” uploads as well, since those go through the same system.
Uploads consume your application’s own server capacity, so you may see a small bump in usage while an upload is in progress.
If an upload fails on, say, row 900, rows 1–899 will already have been uploaded successfully (you can verify this in the Data tab). So when you fix the issue, delete those earlier rows from your file so they are not uploaded twice (see the sketch after this list).
These changes only apply to uploads in the Data tab, not the workflow action.
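On the partial-upload point above, here is a minimal Python sketch of one way to trim the already-imported rows from your file before re-uploading. It assumes the failure email counts data rows starting at 1 (header excluded); the file names and helper name are hypothetical.

```python
import csv

def trim_imported_rows(src_path, dst_path, failed_row):
    """Write a copy of the CSV containing the header plus only the rows that
    were not yet imported (the failed row and everything after it)."""
    with open(src_path, newline="", encoding="utf-8") as src, \
         open(dst_path, "w", newline="", encoding="utf-8") as dst:
        reader = csv.reader(src)
        writer = csv.writer(dst)
        writer.writerow(next(reader))           # copy the header row
        for i, row in enumerate(reader, start=1):
            if i >= failed_row:                 # rows 1..failed_row-1 were already imported
                writer.writerow(row)

# Example: if the failure email reported row 900,
# trim_imported_rows("products.csv", "products_remaining.csv", failed_row=900)
```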
Please try it out, and let us know if you have any feedback! Thanks to all of our alpha testers for trying this feature out. If you encounter a bug, please file a bug report.
While the upload is in progress, you will see a progress bar and an ETA for when it will finish. You can also open the Data tab and watch the data arrive in real time.
Thanks @cal and @allenyang for working hard to develop this important upgrade!
I was part of the alpha testing and can attest to uploading over 1 million records on multiple occasions. The CSV file took under a minute to upload (depending on your connection speed), and the ETA was, on average, about 1.1 days for the server to process the million rows. You can close your app or continue working on it while the backend processes your CSV file. You’ll receive an email once the process is complete. We can now upload millions of records in a reasonable amount of time!
I just pushed a fix for a seemingly similar bug - would you mind trying again and seeing if it works now? If it continues to fail, please submit a bug report so I can take a closer look.
Hi @andyestridge. If the file is in UTF-8 but is giving you that error, then it is probably a bug. Could you submit a bug report, sending in the CSV file you are having issues with, so I can take a look at it? Thanks!
Any chance you’ve looked at it? Sorry to bug you, it’s just that I had it all tested last night and was planning on applying things to the live app today.
From investigating the logs, I can see that it triggered the ‘invalid UTF-8 file’ error because there was a lone invalid character far down in your file. I’ve made this check smarter, so it should now allow for things like that (they do crop up in otherwise valid data), and the change should be going live later today. In the meantime, if you need the upload before then, you can find and remove that character from your file.
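If anyone else hits the ‘invalid UTF-8 file’ error before that change ships, here is a rough Python sketch (not an official tool; the file names are placeholders) that reports which lines contain invalid bytes and writes a cleaned copy with those bytes dropped.

```python
def clean_utf8(src_path, dst_path):
    """Report lines containing invalid UTF-8 bytes and write a cleaned copy."""
    with open(src_path, "rb") as f:
        raw = f.read()
    # errors="replace" turns each invalid byte into U+FFFD so we can report where it was...
    for line_no, line in enumerate(raw.decode("utf-8", errors="replace").splitlines(), start=1):
        if "\ufffd" in line:
            print(f"Invalid byte(s) on line {line_no}")
    # ...while errors="ignore" simply drops the invalid bytes in the cleaned copy.
    with open(dst_path, "w", encoding="utf-8") as f:
        f.write(raw.decode("utf-8", errors="ignore"))

# Example: clean_utf8("contacts.csv", "contacts_clean.csv")
```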