Import from Google Sheet using Blockspring

Ok, so I can get Google Sheets data into a Repeating Group.

Now I am thinking I would use this to import into a table.

Harder than I thought. I can put a button on each row, but that is long-winded.

I could, maybe, have a button I can press many times, using #itemnumber and incrementing it each time.

Am I missing something obvious?

Hi @NigelG I’m doing the same this morning.
My present use case is that I have to import more than 465 products and I can't use the Import CSV feature. For some products I have to link them to new Categories and create new Characteristics according to each type of product.
@Emmanuel, would you think about adding this loop feature (or an equivalent) to your roadmap? A new feature request, maybe?


I did wonder if you could use a “list of things” somehow, and update them.

We’re thinking about adding a way to run bulk operations in the editor, using API endpoints that have been defined (and that we haven’t made public yet, sorry, the break isn’t helping). That should help with this (though letting workflows run in loops is a bit more of an issue for now).


Thanks @emmanuel

I can see, I think, the issue with looping workflows.

But there does seem to be a well-defined use case here that Blockspring could handle well.

Anyway… here is a possible way to do this…

I have an import page with a counter on it. Every 5 seconds it imports row #itemno (the counter’s value) from the Google Sheet, then increments the counter and displays the imported row on the screen (so I can see it working).


Ok, it is a bit slow. But it works.
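Bubble workflows are configured visually rather than coded, but the counter-based import loop described above can be sketched in Python to make the logic explicit. Everything here is a hypothetical stand-in: `create_record` represents Bubble’s "Create a new thing" action, and `sheet_rows` stands in for the data Blockspring pulls from the Google Sheet.

```python
import time

def import_rows(sheet_rows, create_record, delay_seconds=5):
    """Counter-based import loop, mirroring the page described above:
    every `delay_seconds`, import row #itemno, then increment the counter."""
    itemno = 0
    while itemno < len(sheet_rows):
        row = sheet_rows[itemno]      # fetch the counter's row from the sheet
        create_record(row)            # stand-in for Bubble's "Create a new thing"
        itemno += 1                   # update the counter on the page
        if delay_seconds:
            time.sleep(delay_seconds) # the 5-second workflow interval
    return itemno                     # number of rows imported

# Usage (delay set to 0 so the run is instant):
imported = []
import_rows([{"name": "A"}, {"name": "B"}], imported.append, delay_seconds=0)
```

With a 5-second interval this works out to roughly 12 rows a minute, which matches the "a bit slow, but it works" verdict above.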


Thanks a lot @NigelG. I hadn’t thought of doing it that way, and I’ll try it.

@emmanuel. Looking forward to seeing this, because I can’t quite figure out how it’s going to work. Another use case I wonder about is personalizing the content of an email sent to several people.

Another very important use case is simply the ability to import client data from other sites/databases.

One site I am using for testing with Bubble has 300,000 case records. There is also an event table that has 6 million event records (average of 20 events per case).

The current CSV upload only allows an agonizing 15,000 records at a time, with a wait of about half an hour to bring them in.

Plus, as you have pointed out, there is no way to run workflows as each record is brought in.

I do think that having a workflow that performs bulk operations on a set of data is critical. Other tools call these batch programs or processes. Then we could use them for all kinds of future operations, like printing and reporting as well…
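The batch idea above boils down to splitting a large dataset into fixed-size chunks and running an operation per chunk, so a 6-million-row table never has to be handled in one go. A minimal sketch, with `handle_batch` as a hypothetical stand-in for whatever bulk operation (import, print run, report) is applied to each chunk:

```python
import csv
import io

def import_csv_in_batches(csv_text, batch_size, handle_batch):
    """Read a CSV and hand it off in fixed-size batches, instead of
    loading every record at once. Returns the number of batches sent."""
    reader = csv.DictReader(io.StringIO(csv_text))
    batch, batches_sent = [], 0
    for row in reader:
        batch.append(row)
        if len(batch) == batch_size:
            handle_batch(batch)       # e.g. one bulk API call per chunk
            batches_sent += 1
            batch = []
    if batch:                         # flush the final, partial batch
        handle_batch(batch)
        batches_sent += 1
    return batches_sent
```

At a 15,000-record batch size, the 300,000-case table above would take 20 batches, and each batch is small enough to run workflows against as it arrives.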

Any updates regarding how this should currently be done?
I have Blockspring connected to Google Sheets. I’d like to first display the data in a repeating group, then, after correcting/validating the data, press an ADD button to add the records to the Bubble DB.
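The display-then-validate-then-add flow asked about above separates cleanly into a validation pass and a write pass. As a sketch only (Bubble would express this as conditions on the ADD button’s workflow; `is_valid` and `add_record` are hypothetical stand-ins):

```python
def validate_and_add(rows, is_valid, add_record):
    """Only rows that pass validation are written to the database;
    the rest are counted so the user can correct them first."""
    added = skipped = 0
    for row in rows:
        if is_valid(row):
            add_record(row)   # stand-in for "Create a new thing" on ADD
            added += 1
        else:
            skipped += 1      # leave invalid rows in the repeating group
    return added, skipped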