Creating large data sets (Looping)

Morning All

I'm after help with creating a large number of records (or something similar). I have been using Bubble for a long time now, but I've never got this over the line, and hopefully with the forum's help I can.

My app consists of a product list of, say, 4,000 items (product).
The products all belong to a specific category (category), and they are either a standard product or a rules-based product.
I have a series of (questions) that a user fills in; when answered, these questions provide a way to gather the rules-based products.
Once they're filled in, I need to create the order and place products onto the (ordered products) table.

So, the easier one to start with: a user starts a new order. As part of this order they select which (category)s they want, then click Submit Order to get things moving.

What I need here is to go through the (products), and if the product is a standard-qty product and its category is in the selected categories, create/copy the product to the (ordered products) table.

How do I accomplish this reliably? From past experience, anything over 100 items caused missing records when using a scheduled workflow on a list. I'm not sure if a recursive workflow can be used (or how to set one up), but any advice here would be really appreciated.

Also, when testing with a simple 50 products it caused “server capacity issues”. I'd never had these before, so I assumed my workflow logic was flawed.

If I can understand this and get it working, I can move on to the next part, which is the questions and adding those to the workflow.

Many Thanks for reading

Yes, a recursive workflow can be used for this. It's easier on capacity because you can process X items at a time with set delays in between, depending on how hard you want to hit your capacity.

This is the basic idea of a recursive workflow
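Bubble's recursive workflows are set up in the editor rather than written as code, but as a rough sketch of the pattern (the function name, batch size, and delay below are purely illustrative):

```python
import time

BATCH_SIZE = 20      # illustrative: items handled per pass
DELAY_SECONDS = 2    # illustrative: pause between passes to ease capacity

def create_ordered_products(order_id, remaining_products):
    """One pass of the loop: handle a batch, then re-run on whatever is left."""
    batch = remaining_products[:BATCH_SIZE]
    rest = remaining_products[BATCH_SIZE:]

    for product in batch:
        # In Bubble this would be a "Create a new Ordered Product" action
        print(f"Creating ordered product {product} for order {order_id}")

    if rest:
        # In Bubble: the workflow's last step schedules the same API workflow
        # on the remaining list, with a short delay
        time.sleep(DELAY_SECONDS)
        create_ordered_products(order_id, rest)

create_ordered_products("ORD-001", [f"P{i:04d}" for i in range(1, 101)])
```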


This sounds like the simplest thing possible. Can I assume you are using a lot of crazy filters or something (which killed your capacity)?

But even then, most filters are client-side, so what you are describing doesn't make a lot of sense.

Perhaps you can offer up some concrete examples so we can help you better. But I can say that if you have a list of products which also denote their category, a very simple Search for Products should be able to deal with it quickly. Now, if you are using a lot of different types of filters, intersects, etc., then that might be an issue.

I’m sure we can help you easily, but we need a lot more details.

To be fair, the “Schedule API Workflow on a list” action is straight garbage when it comes to lists of any decent size and is really slow (it should be reworked to do its own recursive workflow with a % complete value).


You might also be able to use ‘copy list’.


Hi Nigel

I've started to look into this, but it does say copying anything over 100 items is an issue. Performance-wise, I'm assuming this would be quicker than creating individual records? To use it in my scenario, would I have to create batches, or some way of identifying the first 99 from the list, then the next 99, etc.?

thanks


Thanks Tyler

I think this may be the way to go. I am curious, though, how to establish the list to start with.

Let's say a user selects all categories; this means around 3,500 products would need to be created. Looking at the speed of Bubble (using my scheduled workflow on a list), it is adding around 20 products per minute, which means around 3–4 hours to generate the data.

It might just be me, but that is far too long (considering we have been using an Excel-based system that took about 2 minutes to generate). Am I missing a trick here? Is there an easier way to link the product to the order?

It might just be me, but that is far too long (considering we have been using an Excel-based system that took about 2 minutes to generate).

Have you looked at using the Data API Bulk Create endpoint?

In my experience it’s considerably faster than using a recursive workflow to create database entries…

In a simple test I just ran, creating 1,000 database entries with a recursive workflow took just over 9 minutes… with the Bulk Create API it took around 50 seconds.

Obviously, how long it takes for you will depend on exactly what it is you're adding to the database, but the API method is definitely faster in my experience, even with more complex data.
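For reference, here's a minimal sketch of hitting the Bulk Create endpoint from a script; the app URL, data type, field names, and API key are all placeholders, so check them against your own Data API settings. The body is one JSON object per line, sent as plain text:

```python
import json
import requests

# Placeholders: your app's URL, data type, and API key
BULK_URL = "https://yourapp.bubbleapps.io/version-test/api/1.1/obj/orderedproduct/bulk"
API_KEY = "YOUR_API_PRIVATE_KEY"

ordered_products = [
    {"product_name": "Hammer", "category": "Hardware", "qty": 2, "order": "ORD-001"},
    {"product_name": "Extension lead", "category": "Electrical", "qty": 1, "order": "ORD-001"},
]

# The bulk endpoint expects one JSON object per line, sent as plain text
body = "\n".join(json.dumps(item) for item in ordered_products)

response = requests.post(
    BULK_URL,
    data=body,
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "text/plain",
    },
)

# One result line comes back per item created
for line in response.text.splitlines():
    print(line)
```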

Am I missing a trick here? Is there an easier way to link the product to the order?

I can’t quite work out exactly what you’re trying to do from your original question… but I can’t help but feel that this might not be the best way to go about it…

Do you really need to create these new DB things at all? Can’t you just create an order and add them to the order?


Thanks for the reply Adam

I've not looked into the Bulk API; I'll take a look now. The time it took for you is hopefully achievable for my data.

In essence, yes, you're right: the user creates an order and I need to link the products to that order.
Example: a user creates a new order to be delivered to a store, and a workflow creates simple metadata such as store number, order number, date of order, etc.

They then select which “categories” are within the store. Imagine a big shopping centre: the user might pick “Hardware” and “Electrical” as categories. Each selected category then relates to a list of products contained in that category, e.g. Electrical might have 200 products, all of which need to be part of the order because that category was selected.

For the end output I just need to be able to generate a list of products for that store, which then needs to be checked off prior to despatch.

So I was trying to create new records in an “ordered products” table; in here are a few extra columns that allow me to check off that I have added the product to the pallet(s). A further curveball later is that the user would review the data and potentially increase the amount needed: all products have a standard qty amount, but in some larger stores they might need to double this.
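If it helps to picture it, the structure is roughly this (the field names here are made up, not my actual Bubble fields):

```python
from dataclasses import dataclass, field

# Rough sketch of the data types described above (field names are made up)

@dataclass
class Product:
    name: str
    category: str            # e.g. "Hardware", "Electrical"
    standard_qty: int        # default quantity sent to a store
    rules_based: bool = False

@dataclass
class OrderedProduct:
    product: Product         # link back to the master product
    qty: int                 # starts at standard_qty, may be doubled for big stores
    picked: bool = False     # ticked off once it's on the pallet(s)

@dataclass
class Order:
    store_number: str
    order_number: str
    categories: list[str] = field(default_factory=list)
    items: list[OrderedProduct] = field(default_factory=list)
```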

Sounds like it's the typical shopping-cart system where you need to make new database entries with the quantities and a field linking back to the original product. It's just that the size is much larger.

It seems like no matter what process you go with, expect it to take 30 seconds or more to make all these items, so make sure you have something on the front end showing your users it's generating for a little bit.

The Copy a list action won't work here because you are jumping data types.

I did a test with a recursive workflow, having it process 20 items on each loop instead of 1, and it took about 3 minutes to create 1,000 items from another data type. But it hit my Personal plan capacity probably more than I would like. Use that as your backup plan and see what the Data API can do for you.

I totally forgot about the Data API, as @adamhholmes suggested; that sounds like it would be very useful here.

And why are your items in a list? That's a bad plan. I use small lists in my app, but not 4,000 varying things.

Hi Adam

I've had a look into this and think it is the right way to do things. I've looked through various tutorials, but I'm really stuck trying to essentially “copy data” from one table to the other.

Would you be able to show your API call(s) and the workflow?

When I try, only 1 record is created; the GET API works fine, as it populates a repeating group with all the products.

Many Thanks

So, I managed to get this working, in a fashion.

The issue I have is around taking the ID from one table and putting it into another.

In this instance, I want the “product” ID to be placed into the ordered products table. I can get text fields and number fields all pulling through, but I can't get the ID; I've tried it as a text field and a number field, but nothing.

I've read a few other issues similar to this on the forum and would be interested in finding the solution. As Adam said, it took around 1 minute to do the best part of 2,000 records.

Screenshot what you have so far. I'm assuming you're making the API call by using the :formatted as text operator to craft each JSON object? What do you mean by ID: the unique ID, or is this another text field?

Hi Tyler

Yeah, I'm using :formatted as text.

All fields pull through apart from the unique ID (when I omit that from the workflow).

Screenshots below. I have added a “thing” called test; it has a couple of fields, with one field called poschk being the product. This should contain the Product ID?






Your “pschk” field is of type Positem, so you have to give it the unique_id of the Positem; then it will link up properly in your database automatically.

So when you gave it “NSFEL0010”, it had no idea what that was, since it's not a Positem's unique ID in your database.
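So each line of the bulk body wants to look something like this (the values are made up; the point is that pschk carries the Positem's unique id rather than a readable product code):

```python
import json

# Made-up values; the linked field holds the Positem's unique id,
# not a readable code like "NSFEL0010"
line = json.dumps({
    "name": "Test item",
    "qty": 2,
    "pschk": "1668523488952x934876234987123456",  # Positem unique id (example format)
})
print(line)
# {"name": "Test item", "qty": 2, "pschk": "1668523488952x934876234987123456"}
```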

Also, the “DataGrAB”: is that another internal call you are doing, or is that an external API?

I have been trying that. I changed it to pick up the _ID and chose to “Get” this as a text field and as a number, but no success.



Below are the options to pull through for the _ID; it doesn't reference the Thing “positem”. I don't know if the ID by default is a text string or a number, but either way it is not working.

The ID should just be text.

But also, I'm a little confused: what's the DataGrAB API you're calling? It seems like you're using its response to fill in values for the Data API bulk creation, but can't you just directly search for your items and bulk create from that list?

Normally, I think you just form your body for the call to the Data API (from a search or a known list of Things), and it creates those things, returning a status for each item indicating whether or not it was successfully created, along with its unique_id.
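If I remember right, the response comes back the same way, one JSON object per line in the same order you sent the items; roughly like this (treat the exact keys as an assumption and check your own response):

```python
import json

# Hypothetical example of a bulk-create response body: one result per line
# (exact keys may differ in your app)
response_text = """\
{"status": "success", "id": "1668523488952x934876234987123456"}
{"status": "success", "id": "1668523488953x111222333444555666"}
{"status": "error", "message": "Invalid data for field qty"}"""

for line_no, line in enumerate(response_text.splitlines(), start=1):
    result = json.loads(line)
    if result.get("status") == "success":
        print(f"item {line_no}: created as {result['id']}")
    else:
        print(f"item {line_no}: failed ({result.get('message', 'unknown error')})")
```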

Thanks Tyler, really appreciate the input

Just in case anyone else searches for this and wants to know what I did to get it working:

Using the API to insert the ID from a different table just didn't work. I'm not sure of the reasoning, but it didn't run no matter what I tried; I have a support ticket open, so I will post if I get an answer.

Instead, base it on a search and ensure you put the ID inside speech marks (double quotes).
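In other words, the :formatted as text expression has to wrap the unique ID in double quotes so that each line is valid JSON. A quick illustration (the field name is just an example):

```python
import json

unique_id = "1668523488952x934876234987123456"  # example Bubble-style unique id

# With the ID inside speech marks: valid JSON
good_line = '{"pschk": "%s"}' % unique_id
json.loads(good_line)          # parses fine

# Missing the quotes: not valid JSON, so the line gets rejected
bad_line = '{"pschk": %s}' % unique_id
try:
    json.loads(bad_line)       # raises json.JSONDecodeError
except json.JSONDecodeError as err:
    print("bad line rejected:", err)
```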


Good work at the bleeding edge of Bubble backend data manipulation.
