Import processing failing on unrecognized address

I have a table of 2.7 million rows that I’ve split into three CSV files, each maxed out at Excel’s row limit. I’m trying to import it into Bubble.io. I can get it to upload, but during processing (which also says it’s going to take 7 days…but that’s another issue!) it errors out on an unrecognized address. I can accept that some of these addresses will be incomplete or wrong as far as Google’s APIs are concerned, but I’d rather get it all imported and then deal with an incorrect address when it comes back from a specific query, where the API is called for that ONE address, versus having to do it for all 2.7 million (and take 7 days). Any insights? Thanks!
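To illustrate what I mean by handling a bad address only when that one record is actually queried, here is a rough Python sketch (outside of Bubble entirely). The endpoint is Google’s standard Geocoding REST API; the `GOOGLE_MAPS_API_KEY` variable and the helper name are just placeholders for illustration. The point is that an unrecognized address comes back as `None` instead of erroring out the whole batch.

```python
import os
import requests

GEOCODE_URL = "https://maps.googleapis.com/maps/api/geocode/json"
API_KEY = os.environ["GOOGLE_MAPS_API_KEY"]  # placeholder key for illustration

def geocode_one(address: str):
    """Geocode a single address; return (lat, lng) or None if Google can't match it."""
    resp = requests.get(GEOCODE_URL, params={"address": address, "key": API_KEY}, timeout=10)
    resp.raise_for_status()
    data = resp.json()
    if data.get("status") != "OK" or not data.get("results"):
        # e.g. status == "ZERO_RESULTS": treat it as "fix later" instead of failing the import
        return None
    loc = data["results"][0]["geometry"]["location"]
    return loc["lat"], loc["lng"]
```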


I had a recursive workflow set up to process addresses from a text value into a geographic address… some of the addresses were not recognized, and the recursive workflow kept trying to convert that same address and never moved on to the next entry.

I don’t know how you are processing your addresses, or anything else about your setup, but the way I solved my problem was to add a yes/no data field. I set it to change to yes when the workflow was triggered, so every entry that started processing had that field flipped to yes, basically marking that processing had started.

Then, when an address failed to get translated into a geographic address, the recursive workflow didn’t get stuck repeating it, because my condition for which entries to begin processing when the workflow triggered again was based on that yes/no value.

So anything that had started processing never had a second attempt initiated, and the workflow just moved through the entire list one by one.

Then I parsed the data to find the entries whose geographic address was empty.
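Outside of Bubble, the pattern looks roughly like the sketch below. This is a minimal Python illustration, assuming a made-up `Entry` record and `geocode()` helper (the yes/no field here is `processing_started`); the key point is that each entry is marked before the geocoding attempt, so a failed address can never be picked up again on the next pass.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Entry:
    raw_address: str
    processing_started: bool = False   # the yes/no field
    geo_address: Optional[str] = None  # stays None when geocoding fails

def geocode(raw: str) -> Optional[str]:
    """Stand-in for the geocoder; returns None when the address isn't recognized."""
    ...

def run(entries: List[Entry]) -> None:
    # Each pass through the loop stands in for one run of the recursive workflow.
    while True:
        # The trigger condition: only entries that have never been attempted.
        pending = [e for e in entries if not e.processing_started]
        if not pending:
            break
        entry = pending[0]
        entry.processing_started = True                 # mark BEFORE attempting
        entry.geo_address = geocode(entry.raw_address)  # may stay None on failure

# Afterwards, the failures are simply the entries whose geographic address is still empty:
# failures = [e for e in entries if e.geo_address is None]
```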


@boston85719 I believe I have this set up correctly now based on your suggestion. Would you be willing to take a look at what I have and confirm? I’ve run it on a sample of 200 and it seems to work, but when I run it on 95K it doesn’t. Really having some issues getting my 2.7MM rows imported! Thanks, Mark

If it is working on 200, it should work for 2.7 million.

The reason it might not get beyond that point could be that your app’s capacity maxes out at some point and the whole system crashes on you.

Check your logs to see what kind of capacity was used during the times you were running the workflows. If capacity is the issue, you may want to space out the recursive trigger more: not trigger at the current date/time, but at the current date/time plus 5 seconds or so.

Obviously that would add time to completion but it might be what needs to be done.

I ran mine on 5,000 or so and, after hitting an issue with capacity reaching max and then spacing out the recursive flows, it worked all the way through.
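For reference, the spacing idea is just a fixed delay between runs. Below is a minimal Python sketch with a made-up `process_one_entry()` standing in for a single workflow run; in Bubble itself the equivalent is scheduling the next workflow at the current date/time plus 5 seconds rather than sleeping.

```python
import time

DELAY_SECONDS = 5  # tune this against what your capacity logs show

def process_one_entry(entry) -> None:
    """Stand-in for one run of the recursive workflow (mark the entry, attempt the geocode)."""
    ...

def run_spaced(entries) -> None:
    # Equivalent to scheduling the next workflow at "Current date/time + 5 seconds"
    # instead of "Current date/time".
    for entry in entries:
        process_one_entry(entry)
        time.sleep(DELAY_SECONDS)  # space runs out so capacity can recover between them
```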