I’ve been using Eli’s awesome plugin for a while now to bring in some complex CSV files on a web page.
Our client (bless 'em) has gone a little nuts and jumped from regularly importing around 3,700 records last month to trying to bring in over 5,000, and now the import seems to time out.
The fields are ID'd correctly, but when we process records, I see the first pass of processing go through and then it stops. Usually I'd see additional passes (one for each file it needs to create in 100-record chunks), but instead the "updating file" icon just keeps spinning until I refresh.
I can't see anything in the logs, and there are no errors in the Chrome console. My only fix is to break the file into two sets, import each, then run a merge afterwards. The original CSV is 373 KB; I can successfully bring in 4,000 records, which works out to 276 KB. Not sure if the limit is row-based, size-based, or something else.
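For reference, here's roughly how I'm doing the split at the moment — just a quick Python sketch (the `split_csv` helper and `rows_per_file` cutoff are my own, not part of the plugin), which repeats the header row in each chunk so every piece imports on its own:

```python
import csv


def split_csv(path, rows_per_file=4000):
    """Split a CSV into numbered chunk files, repeating the header in each.

    Returns the number of chunk files written.
    """
    with open(path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)  # keep the header so each chunk is importable
        chunk, part = [], 1
        for row in reader:
            chunk.append(row)
            if len(chunk) == rows_per_file:
                _write_chunk(path, part, header, chunk)
                chunk, part = [], part + 1
        if chunk:  # flush any leftover rows into a final chunk
            _write_chunk(path, part, header, chunk)
    return part


def _write_chunk(path, part, header, rows):
    # e.g. data.csv -> data_part1.csv, data_part2.csv, ...
    out = f"{path.rsplit('.', 1)[0]}_part{part}.csv"
    with open(out, "w", newline="") as dst:
        writer = csv.writer(dst)
        writer.writerow(header)
        writer.writerows(rows)
```

It works, but having to merge the results back together afterwards is the painful part.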
I'm running on the Professional plan. Any suggestions for where to look?