Create large list of things

Does anyone have recommendations on how to create new items for large lists of things (each with 1 field) quickly (even under 30 min would work 😂)?

We've tried both the recursive route and an API workflow on a list, but the list sizes are 8-12k and users will be doing this a couple (2-5) times per day.

A backend workflow with even a 1-second delay (not scalable) would take an expected 166 minutes, or almost 3 hrs, to create 10k single-field text records.

We've tried splitting it into a 4-bucket setup, where 25% of the total list went to each bucket to in theory cut the time down by 4x, but we destroyed our Professional plan's capacity within seconds of the workflow starting.

2 Likes

I wonder if the Data API might be worth a try - up to 1000 items per request. I've never had a need to do anything like that, so let me know if/how it works out.
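For anyone trying this route, Bubble's Data API has a bulk endpoint that accepts up to 1000 newline-delimited JSON objects per request. A minimal sketch in JavaScript - the app URL, data type name, and field names below are placeholders, not from this thread:

```javascript
// Sketch: bulk-create records via Bubble's Data API bulk endpoint.
// Endpoint URL, type name, and token handling are illustrative assumptions.

// Split a big array into chunks of at most `size` items (max 1000 per bulk call).
function chunk(items, size) {
  const out = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

// The bulk endpoint expects one JSON object per line (NDJSON), not a JSON array.
function toBulkBody(records) {
  return records.map((r) => JSON.stringify(r)).join("\n");
}

async function bulkCreate(records, apiToken) {
  for (const batch of chunk(records, 1000)) {
    const res = await fetch("https://yourapp.bubbleapps.io/api/1.1/obj/item/bulk", {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiToken}`,
        "Content-Type": "text/plain",
      },
      body: toBulkBody(batch),
    });
    if (!res.ok) throw new Error(`Bulk call failed: ${res.status}`);
  }
}
```

So a 10k list would come down to ten requests instead of 10k workflow runs.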

-Steve

2 Likes

Will definitely test this out and see if it's any different from going through a backend workflow, and post an update on my findings.

Thanks.

I'd like to know how also. Given it's just one field, could you try an alternative approach of just storing the list as a single delimited text field, CSV or JSON? Maybe not, but worth thinking about. I do it with "scores", e.g.
1,2,3,4,5,1,2,3,4,5,

Rather than create 10 objects in the DB.
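The delimited-string idea above can be sketched in plain JavaScript (Bubble's :join and :split by operators do the equivalent inside the editor):

```javascript
// Store many single-field values as one delimited text field
// instead of thousands of DB records.
const scores = [1, 2, 3, 4, 5, 1, 2, 3, 4, 5];

// Serialize to a single comma-delimited string for the text field.
const stored = scores.join(",");

// Parse it back into a list when needed (Bubble's :split by does this).
const parsed = stored.split(",").map(Number);
```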

Hmmm, that would work. I guess I forgot to clarify: it's the 1 text field but also linked to a user, so 2 fields total. But in theory we may be able to get something like that to work.

A field on the User, "my list delimited", containing

red|orange|green|blue|yellow|red …
:thinking:

In my app I have 1000s of users, all of whom have 100s of scores (which could mean 100,000 DB objects), so instead I have a string of hundreds of numbers on the User (more or less).

And then I use regex to edit the string to change individual scores - it's kind of mental, but I couldn't find a better way.

Nowadays I'd probably store it as JSON and use JS functions rather than regex.
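To make both approaches concrete - a hypothetical sketch, not the poster's actual code - here is editing one score in a delimited string with a regex, versus a parse/edit/re-serialize round trip:

```javascript
// Approach 1: edit the Nth value in a "1,2,3,..." string with a regex.
// The replacer callback counts matches and swaps only the target position.
function setScoreRegex(stored, index, value) {
  let i = 0;
  return stored.replace(/[^,]+/g, (m) => (i++ === index ? String(value) : m));
}

// Approach 2: parse to an array, edit, and re-serialize (the JSON-style route).
function setScoreJson(stored, index, value) {
  const arr = stored.split(",").map(Number);
  arr[index] = value;
  return arr.join(",");
}
```

Both return the same result; the second is easier to read and extend, which is presumably why the JSON route wins nowadays.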

2 Likes

Hmmm, will try this. For flexibility in the future we'd prefer an individual data type in case we add more fields, but that's 100% an option if we never expand the fields.

Yeah, good idea. A large delimited string might work, depending on the requirements. It's especially easy with the recently added :split by operator. I've used a similar approach as a convenience when a custom data type would be overkill and/or speed was a priority.

-Steve

EDIT

Just keep in mind that if the string is really large, it will consume more browser resources. A "proper" and more flexible approach would be a custom data type for sure.

3 Likes

Have you considered a different backend for this use case? Today I created at least 600 records in 1 second in Google's Firestore.

We can make it there and bring it back fast!


3 Likes

Hi - I'm curious about the use case for a process like this (I hope I don't run into it since it sounds like a real challenge).

This has been by far the fastest route, with 1k items taking <60 seconds. It came with a major downside, though: list sizes of 100/500/1000 (the max per call) seem to always throw an error at the end of every single call, thus stopping any later actions. Also, after multiple tests, on occasion it fails partway through and doesn't finish the entire upload. Since these fail midway, there actually isn't an error being returned, due to it being a system failure. Worth noting: I've kept it under 20% capacity on the Professional plan with this method. I'm actually formatting 100s of rows of the data on-page using a :findandreplace + dynamic fields, so I don't have to manually set hundreds of items conditionally.
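Since the calls fail intermittently, one generic workaround (a sketch, not something tested in this thread) is to wrap each batch call in a small retry helper with growing back-off:

```javascript
// Retry an async operation a few times before giving up.
// Useful when individual bulk-create calls fail intermittently.
async function withRetry(fn, attempts = 3, delayMs = 1000) {
  let lastErr;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      // Wait a bit longer after each failure before retrying.
      await new Promise((resolve) => setTimeout(resolve, delayMs * (i + 1)));
    }
  }
  throw lastErr;
}
```

Pairing this with a record-count check against the Data API after each batch would also catch the silent mid-upload failures described above.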

Let me know if you want me to dive deeper into the findings.

1 Like

I really appreciate the follow up, Chris. Bummer about the errors, but hopefully, you manage to come up with a way to handle or avoid them. I suspect every use case will be a bit different depending on the amount of data and processing involved. Faster but harsher on capacity seems to be the take-away.

Thanks for sharing the results of your findings.

-Steve

1 Like

When you say "bring it back", do you mean it lives in Firestore permanently and you retrieve it when needed from there? Or do you mean you integrate it back into the Bubble database later, and if so, how?

-Steve

Thanks for the question @sudsy . I suppose it really depends on your needs. But as an example,

Using a Firebase function, I update daily a list of ~1400 objects in Firebase (not Firestore, though I use that for other things). I can create that and then retrieve the list in Bubble in seconds.

Using a custom-built plugin, I get the data to the plugin via the Firebase API. In the plugin I create an API call with a defined data type that matches my return data from Firebase. I then create an element, set an input field to be an app type, and an exposed state to be that app type as a list. In the code I get the list from Firebase and 'forEach' the returned data into a list that I publish as the exposed state.

So in my example, it does live there but can be retrieved at any time. If I needed things to run in the backend, I would create actions instead of elements to get and process the data.
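For anyone unfamiliar with Bubble's plugin API, the element flow described above looks roughly like this - the Firebase URL, field name, and state name are placeholders, and `instance.publishState` is how a plugin element publishes an exposed state:

```javascript
// Sketch of a Bubble plugin element's update function: fetch a prepared
// list from Firebase's REST API and publish it as an exposed state.
function update(instance, properties, context) {
  fetch("https://your-project.firebaseio.com/items.json")
    .then((res) => res.json())
    .then((data) => {
      // "forEach" the returned key/value map into a flat list...
      const list = [];
      Object.values(data).forEach((item) => list.push(item.name));
      // ...and publish it so the Bubble page can read it as a list.
      instance.publishState("item_list", list);
    });
}
```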

1 Like

Thanks, Jared.

Yeah, for sure. I was thinking if the requirements called for the data to be captured and accessible as quickly as possible, it could be stored in a "fast" external system (such as Firebase); but then it could be brought back into Bubble during "off-peak" hours via a capacity-friendly recursive workflow, for example.

Lots of variables to consider, but generally speaking, I like the simplicity and flexibility afforded by having Bubble be the long-term data store.

-Steve

1 Like

Also a good option! I find that sometimes having it all in one system can be really, really nice. Sometimes it makes much more sense to split it up or to offload the backend completely, but then we get away from the convenience that is Bubble. The nice thing about Bubble is that we have that option, IMO.

3 Likes