Creating a Search that is fast and easy on WUs without external services

In this thread, I just wanted to share how I created a blazingly fast Customer search for my app’s CRM without any external services. The CRM is used at front desks daily, so speed and real-time data will always be a priority, even with WU looking over my shoulder every day.

It comes in a few different parts:

  1. How the search works
  2. Creating and maintaining a search index
  3. How I load and store an index client-side
  4. How I keep an index updated real-time client side

How the search works
It’s actually pretty simple: I have a Repeating Group that displays any Regex matches from a list of texts, which is stored in a ListShifter (you can use a state if you prefer).

Each text in the list will contain something like this:
full name|contact number|email address|unique id of the customer

This basically contains the search parameters that I want to allow my user to use as search criteria (all of this runs from 1 search input). It starts with a customer’s full name because I want to be able to sort search results alphabetically. It ends with the UID of the Customer record because I will use it in other parts of the CRM, like loading all data on a Customer and their interaction records.
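The entry format above can be sketched in plain JavaScript. The field names (`fullName`, `phone`, `email`, `uid`) are my own illustration, not Bubble’s:

```javascript
// Hypothetical sketch: building one pipe-delimited index entry per customer.
function buildIndexEntry(customer) {
  // Full name first so the resulting list of texts sorts alphabetically,
  // UID last so it can be split off to load the full Customer record.
  return [customer.fullName, customer.phone, customer.email, customer.uid]
    .join("|")
    .toLowerCase(); // lowercased up front to match the lowercased search input
}

const entry = buildIndexEntry({
  fullName: "Jane Doe",
  phone: "555-0100",
  email: "jane@example.com",
  uid: "1688000000000x123",
});
// entry → "jane doe|555-0100|jane@example.com|1688000000000x123"
```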

The Regex that checks is:
\binputvalue\b(?!@)

My RG points to the ListShifter’s list, and the Regex is added as an advanced filter since it’s client side (:filtered). Now if my user enters a customer’s name, contact number, or email, my RG will always return the correct results.

Here are some things to note:

  • I have a :lowercase operator added to the input’s value when building my Regex as a Bubble expression, to reduce any potential headaches
  • My input feeds an “Instant Text” plugin element, which acts as my Regex’s input value. I do this so the RG doesn’t have to wait for the value, as is the case with native Bubble inputs
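Putting the regex and the :lowercase note together, the whole client-side filter looks roughly like this in plain JavaScript. The escaping step is my own addition (so input like “.” can’t break the pattern), and `searchIndex` is a made-up name:

```javascript
// Sketch of the client-side filter, assuming the index is a plain array of
// pipe-delimited strings (as the ListShifter holds it in this setup).
function searchIndex(indexList, rawInput) {
  // Escape regex metacharacters in the user's input before building the pattern
  const escaped = rawInput.toLowerCase().replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
  // Same shape as the Bubble expression: \binputvalue\b(?!@)
  const pattern = new RegExp(`\\b${escaped}\\b(?!@)`);
  return indexList.filter((entry) => pattern.test(entry));
}

const index = [
  "jane doe|555-0100|jane@example.com|uid-1",
  "john roe|555-0199|john@example.com|uid-2",
];
searchIndex(index, "doe"); // → jane's entry only
```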

Creating and maintaining a search index
I have an Index - CRM datatype in my DB to store each Organization’s index. The reason I do this is so that I do not have to rebuild the index client side every time my user loads their CRM.

[Screenshot: the list of search criteria]

All indexes are maintained using backend workflows that look out for and add new entries, but I handle deletes differently:

  • When a user “deletes” a customer profile > that Customer record has its “Mark Deleted” field set to ‘yes’ > a trigger will add that Customer record’s UID to a list-of-texts field called “Marked for Delete” on the organization’s Index - CRM

The “Marked for Delete” field doubles as a constraint client side so it won’t show any results that have been “deleted”.

I do a batch delete during downtime while updating each organization’s Index - CRM record appropriately. This serves two purposes:

  1. To reduce the chance that an Index gets unnecessarily inaccurate
  2. To better control my WU
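The client-side “Marked for Delete” constraint can be sketched like this. The function and variable names are my own stand-ins, not Bubble’s:

```javascript
// Sketch: drop index entries whose trailing UID appears in the
// "Marked for Delete" list before they reach the Repeating Group.
function excludeDeleted(indexList, markedForDelete) {
  const deleted = new Set(markedForDelete); // O(1) lookups per entry
  return indexList.filter((entry) => {
    const uid = entry.split("|").pop(); // UID is always the last segment
    return !deleted.has(uid);
  });
}

const index = [
  "jane doe|555-0100|jane@example.com|uid-1",
  "john roe|555-0199|john@example.com|uid-2",
];
excludeDeleted(index, ["uid-2"]); // keeps only jane's entry
```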

How I load and store an index client-side
The general principle is that when the app needs to load a User’s CRM index, it does a quick search to match an Index based on their Organization (I do have Privacy rules attached, but I’m extra weird). This greatly reduces the amount of client-side searching by:

  1. Not having to run and load large searches on Customer records just for a simple search feature
  2. Searching the Index - CRM datatype instead, a satellite datatype that only has as many records as I have onboarded organizations, which is far fewer than searching through the Customer records directly.

Because I am a WU scrooge, a user’s client stores the index in browser storage using the “Floppy” plugin. So if that user’s organization’s index hasn’t been updated, then no matter how many times they load into the CRM, it won’t run a search for an Index. This further reduces the need for unnecessary search workflows and allows instant search availability when a user loads the CRM.
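A rough sketch of that cache-first behaviour, written against plain `localStorage`-style storage rather than Floppy’s actual API (which I’m not reproducing here); `loadIndexFromDb` is a stand-in for the Bubble search:

```javascript
// Sketch: only hit the database for the index when we don't already
// have a copy in browser storage.
function getIndex(storage, loadIndexFromDb) {
  const cached = storage.getItem("crmIndex");
  if (cached !== null) return JSON.parse(cached); // no search workflow fired
  const fresh = loadIndexFromDb(); // the one-time "quick search" for the Index
  storage.setItem("crmIndex", JSON.stringify(fresh));
  return fresh;
}

// Minimal in-memory stand-in for browser storage, for illustration:
const store = new Map();
const storage = {
  getItem: (k) => (store.has(k) ? store.get(k) : null),
  setItem: (k, v) => store.set(k, v),
};
let dbCalls = 0;
const load = () => { dbCalls++; return ["jane doe|555-0100|jane@example.com|uid-1"]; };

getIndex(storage, load);
getIndex(storage, load);
// dbCalls is still 1 — the second call was served from browser storage
```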

How I keep an index updated real-time client side
Organizations do have multiple users updating the CRM at the same time, so keeping the client-side index updated is very important. The other problem I wanted to solve was reducing the amount of real-time data my app uses. The solution I found is also a simple one:

I created another satellite datatype called Global to store single dates and single texts. Each organization has their own Global record. This includes an “Index Last Modified” date field that gets updated in the backend whenever an Index - CRM gets updated.

So now I only have to watch 1 record’s date field in real time. Client side, a workflow looks for a change in that date and triggers a workflow to update the client’s browser-stored Index. Since I can keep the Global datatype very light, I don’t have to worry about WU usage too much.
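The date comparison can be sketched like this. `syncIndex`, the storage keys, and `loadIndexFromDb` are all my own illustrative names; the real setup watches the Global record through Bubble:

```javascript
// Sketch: compare the "Index Last Modified" date on the Global record
// against the date stored alongside the cached index, and refresh only
// when the server-side date is newer.
function syncIndex(storage, globalLastModified, loadIndexFromDb) {
  const cachedAt = Number(storage.getItem("crmIndexModified") || 0);
  if (globalLastModified <= cachedAt) {
    return JSON.parse(storage.getItem("crmIndex")); // cache is current
  }
  const fresh = loadIndexFromDb(); // index changed on the server: reload once
  storage.setItem("crmIndex", JSON.stringify(fresh));
  storage.setItem("crmIndexModified", String(globalLastModified));
  return fresh;
}

// In-memory stand-in for browser storage, for illustration:
const store = new Map();
const storage = {
  getItem: (k) => (store.has(k) ? store.get(k) : null),
  setItem: (k, v) => store.set(k, String(v)),
};
let dbLoads = 0;
const load = () => { dbLoads++; return ["jane doe|555-0100|jane@example.com|uid-1"]; };

syncIndex(storage, 1000, load); // first visit: loads and caches
syncIndex(storage, 1000, load); // date unchanged: served from cache
syncIndex(storage, 2000, load); // Global date moved: refresh once
// dbLoads is 2
```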

Possibly Asked Questions
I thought advanced filters are taboo?!
Since we are filtering a (preloaded) list of texts, it’s not going to load any additional data. As a side note, the biggest index holds around 8,000 records.

Can I apply this to my app?
I honestly dunno, it works for me. Using something like Algolia is kinda overkill and not worth the additional overhead IMO for something this simple.

Can you keep on using this forever and ever?
The good thing about this is that if I wanted to I could just offload the index to my Cloudflare workers and storage and not worry about WU at all.

I thought you didn’t have a real app or any users?! You have no skin in the gaemme and you haven’t been here since 2018!
Silly me!

Wow! Were you high when you thought this out?
High on life and cigarettes! I have been optimizing this for months, actually, and it was Bubble’s WU livestream that helped me work out the kinks.

It’s not “a best method” of doing what I need in my CRM but it’s what works for me so far. I am more than open to suggestions or alternatives but I hope this helps someone.

Credit goes to @keith for Floppy which gives that extra oomph to my ERP. Highly recommended!


Congratulations! This is a really creative/cool approach to this problem. And it touches all bases.
I have implemented something similar on a few apps, and I have been blown away by how fast extracting with regex on ridiculously huge text fields actually is. In my head, it’s another thing altogether from filtering on things; they should not be lumped together. But the use of Floppy and the ‘Global’ datatype is something I hadn’t thought of. Really great stuff.

Do you have any updates/comments on this setup 8 months down the line?


Thanks for the comment. Sorry it took so long to reply.

So here’s my takeaway after a few months:
  • I underestimated my CRM users and the indices grew too large (but that’s also due to my initial implementation of which records should go into the index)
  • I had to break each index into lists of 5k things because of the 10k list-field limit
  • With very large lists (12k) there is visible lag when the regex runs, and an increase in load time
  • Though I did discover that using a JS function (run with Toolbox) to do the regex matching is way faster than using Bubble’s operators

I still use text and regex for fast searching, but for very large lists it’s better to use properly constrained searches, since the WU cost from loading large lists can drastically reduce (or even reverse) the WU savings.
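For anyone curious, the Toolbox-style approach mentioned above is roughly the following: compile the regex once and sweep the whole list in a single JS pass, instead of Bubble evaluating the advanced filter per item. The function shape is my assumption, not the exact expression:

```javascript
// Sketch: one compiled regex applied in a single pass over the index list.
function fastMatch(indexList, input) {
  const escaped = input.toLowerCase().replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
  const re = new RegExp(`\\b${escaped}\\b(?!@)`); // compiled once
  const out = [];
  for (let i = 0; i < indexList.length; i++) {
    if (re.test(indexList[i])) out.push(indexList[i]); // tested 12k times, cheaply
  }
  return out;
}

const idx = [
  "jane doe|555-0100|jane@example.com|uid-1",
  "john roe|555-0199|john@example.com|uid-2",
];
fastMatch(idx, "John"); // → john's entry only
```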
