I’m trying to upload a CSV with 4,089 rows and 6 columns. One column holds a unique address, formatted in the cell as ‘4308 Victory Ave, Columbus, OH 45208’; the other columns are just text and numbers related to the address.
I’m aware there’s a Google query limit for addresses, but I can’t get the upload to work at any reasonable chunk size (I’ve tried 900, 500, 250, and 100). I’d love to stay closer to 900, since chunks of 100 would mean roughly 45 days of uploading just to finish the full ~4K-row data set. After I click ‘validate data’, I run into these 3 issues (a sketch of how I’m splitting the file follows the list):
1. I initially tried breaking the ~4K entries into chunks of 900; this either hit the query limit or ran into one of the 2 issues below
2. When I submitted chunks of 100, 250, or 500, I hit one of these 2 issues:
a. “There was an error analyzing your data: Error: u.run_once timed out after 30 seconds”
b. It tells me a particular entry is not an address, even though it is one and I can submit that same entry individually without issue. Validation won’t continue past that point. I’ve tried both the address format shown above and lat/long coordinates to rule out formatting as the cause.
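For reference, this is roughly how I’m producing the chunks before upload, in case the splitting itself is part of the problem. A minimal Python/pandas sketch; the file name and chunk size are placeholders for my actual setup:

```python
# Minimal sketch of how I'm splitting the CSV into chunks before upload.
# "addresses.csv" and CHUNK_SIZE are placeholders for my real file/settings.
import pandas as pd

CHUNK_SIZE = 500  # I've tried 900, 500, 250, and 100
df = pd.read_csv("addresses.csv")  # ~4,089 rows, 6 columns

for i, start in enumerate(range(0, len(df), CHUNK_SIZE)):
    chunk = df.iloc[start:start + CHUNK_SIZE]
    chunk.to_csv(f"addresses_chunk_{i:02d}.csv", index=False)
```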
Any strategies for getting my ~4K entries uploaded in reasonable chunks? As a side note, I also tried uploading 900 rows with the address in a field typed as ‘text’ instead of ‘geographic address’, and that worked…but when I then added a workflow step to set the previously used geographic address field equal to the ‘text’ version, that breaks the code.
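In case it helps diagnose issue 2b, this is the kind of local sanity check I’ve been running on the address column before uploading. A rough sketch only; the column name and the loose US-address pattern are assumptions about my own data, not anything the validator documents:

```python
# Rough local check that every address looks like "street, city, ST ZIP"
# before upload. The "address" column name and regex are assumptions
# about my data, not the platform's actual validation rules.
import csv
import re

# Loose US-style pattern, e.g. "4308 Victory Ave, Columbus, OH 45208"
ADDR_RE = re.compile(r"^.+,\s*.+,\s*[A-Z]{2}\s+\d{5}(-\d{4})?$")

with open("addresses.csv", newline="") as f:
    for row_num, row in enumerate(csv.DictReader(f), start=2):  # row 1 is the header
        addr = row["address"].strip()
        if not ADDR_RE.match(addr):
            print(f"Row {row_num}: possibly malformed address: {addr!r}")
```

Every row passes this check, which is why the “not an address” rejection has me stumped.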