Need suggestions for the most performant way to do filtering on a repeating group

I have been going through and rebuilding all my pages with flex. So far I've seen a big performance increase just from not having to use a lot of groups and elements to get things pixel perfect. I'm trying to stay above a 75% GTmetrix score, but I'm having a bit of an issue when it comes to filtering. The old way I went about it doesn't seem to be very performant, as my GTmetrix score dropped by almost 15% when I finished setting it up.

I have several repeating groups, like "Brands", for filtering; each one sets a custom state when the current cell's thing is selected. That state is then used as a constraint on the main repeating group's search.

My database setup looks like this:

[Product] datatype
Field: Brand
Field: Finish
Field: Color
etc.

[Price] datatype
Field: Price
Field: Stock Status
Field: URL
Field: [Merchant]

[Merchant] datatype (I have multiple merchants now, more than 10)
Field: Price
Field: Stock Status
Field: URL

Repeating group for Filtering

Main Repeating group to be filtered
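
There's no actual code in Bubble, of course, but here's a rough sketch of those datatypes as TypeScript interfaces just to show how they relate - the field types are my best guess:

```typescript
// Sketch only: Bubble datatypes expressed as TypeScript interfaces.
// Field types are assumptions; in Bubble these are just fields on things.

interface Merchant {
  // As currently set up, these fields also live on the Merchant datatype.
  price: number;
  stockStatus: string;
  url: string;
}

interface Price {
  price: number;
  stockStatus: string;
  url: string;
  merchant: Merchant; // link to a [Merchant] thing (more than 10 merchants now)
}

interface Product {
  brand: string; // what the "Brands" filtering repeating group keys off
  finish: string;
  color: string;
  // etc.
}
```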

Filtering speed and efficiency is a bit of a rabbit hole, but one that's worth going down. In addition to a lot of great discussion on the forum, I highly recommend the book The Ultimate Guide to Bubble Performance - the new edition is out (now 210 pages!).

From your description I can't exactly make out how you've set up your RGs, but here are some general guidelines I follow that make things work faster for the user and minimize the Bubble capacity I'm using. I'm not sure whether this translates 100% to page score metrics, though.

  1. Keep the filtering server-side (which it looks like you've done).
  2. Use a scrolling RG so that Bubble is only pulling a few records at a time.
  3. Keep datatypes lightweight.
  4. Set up the user interface so that the user selects filter options and then hits "search", which kicks off a workflow to set custom states that the RG's filter refers to. This limits the number of searches and retrievals you are asking Bubble to do (see the sketch below).
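
To make point 4 concrete, here's a rough sketch of that pattern in TypeScript terms. In Bubble this is just custom states plus a constrained "Do a search for", not hand-written code, and all the function and field names below are made up for illustration:

```typescript
// Illustrative sketch only: the point is that the search runs once,
// when the user hits "Search", using whatever filters have been picked.

interface Filters {
  brand?: string;
  finish?: string;
  color?: string;
}

// Pending selections the user is still fiddling with (like custom states
// before the Search button is pressed).
const pendingFilters: Filters = {};

// Selecting a filter option only updates state; no search is fired yet.
function selectBrand(brand: string): void {
  pendingFilters.brand = brand;
}

// Only when "Search" is clicked does a single server-side query run.
async function onSearchClicked(): Promise<void> {
  const results = await searchProducts(pendingFilters);
  renderRepeatingGroup(results);
}

// Hypothetical stand-ins for Bubble's "Do a search for" and the RG itself.
declare function searchProducts(filters: Filters): Promise<unknown[]>;
declare function renderRepeatingGroup(rows: unknown[]): void;
```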

There are also scenarios where, if the total amount of data isn't that huge and the user is going to do a lot of filtering and fiddling with it, it can make sense to bring all the data client-side and then filter it there with ":filter". But I've avoided that because it doesn't scale.
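
For comparison, the client-side approach is essentially the following (again just an illustrative sketch; in Bubble it's ":filtered" applied to a list that has already been downloaded):

```typescript
// Client-side filtering sketch: the whole list is downloaded once and then
// narrowed in the browser. Fine for small datasets, but every user pays the
// cost of downloading everything, which is why it doesn't scale.

interface ProductRow {
  brand: string;
  finish: string;
  color: string;
}

function filterClientSide(
  allProducts: ProductRow[], // everything already pulled to the browser
  brand?: string,
  color?: string
): ProductRow[] {
  return allProducts.filter(
    (p) => (!brand || p.brand === brand) && (!color || p.color === color)
  );
}
```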


Thank you, much appreciated! After doing some A/B testing by deleting groups, it seems the slowdown comes from the repeating groups themselves, not so much from how the data is set up in them. I'm going to try to think of some workarounds. Also, I'm on the Personal plan; lately I've been thinking about upgrading to the Pro plan, as that should help a bit.

Hey @mack2580!


In that case, the system loads all products on the client side (taking into account the server-side constraints you've set up, of course). So, if you have 1,000 products, it loads all of them and only then makes the list unique by Brand.
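
In other words, it's roughly equivalent to this (a TypeScript sketch only, just to show where the work happens; the function name is a stand-in):

```typescript
// What ":unique elements" by Brand over Products effectively costs: every
// matching product comes down to the browser first, and only then is the
// list reduced to its distinct brands.

interface ProductRow {
  brand: string;
}

// Hypothetical stand-in for the repeating group's data source.
declare function fetchAllProducts(): Promise<ProductRow[]>;

async function distinctBrandsFromProducts(): Promise<string[]> {
  const products = await fetchAllProducts(); // e.g. all 1,000 products
  return Array.from(new Set(products.map((p) => p.brand))); // de-duplicated client-side
}
```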

@mack2580 you can also open the browser inspector in Google Chrome and take a look at what is being downloaded and the download time for the various parts - that will help you catch issues like the one @lottemint.md saw.


Is there a better way to do this, maybe via a backend workflow?

Thanks for this interesting information!

I ended up using List Shifter for the filtering and custom states for the main repeating group. This seemed to get the best GTmetrix scores: my old scores went from the high 50s and low 60s up to around the mid-80s, with a sub-1.6s load time. Using the new responsive engine certainly helped too.

The best way is to have a separate table for Brands. So, you’ll need to link each Product (thing) to its Brand (thing).
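
Conceptually, that changes the brand filter to something like this (again only a TypeScript sketch; in Bubble it's just a "Do a search for Brands"):

```typescript
// With a dedicated Brand datatype, the filter list only needs the small
// Brand table, never the full Product list.

interface Brand {
  name: string;
}

interface ProductRow {
  brand: Brand; // each Product now links to a Brand thing
}

// Hypothetical stand-in for "Do a search for Brands".
declare function searchBrands(): Promise<Brand[]>;

async function brandFilterOptions(): Promise<string[]> {
  const brands = await searchBrands(); // only as many records as there are brands
  return brands.map((b) => b.name);
}
```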
