So, @rpetribu, it’s basically for situations where a count that is “close enough” will do, and you want the count to return much faster than an exact count would. It makes me think of something like a dashboard with large counts on it (say, company-wide metrics displayed as 5.25M), where nobody is going to know (or likely care) if the counts are off by 2%, but they might care if the dashboard takes a long time to load because every count is 100% accurate. I assume it could also be used in conditions where an approximate count is good enough, so the condition evaluates much faster.
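To make the trade-off concrete, here’s a minimal Python sketch of the idea (all numbers and names here are hypothetical, not from any real backend): a dashboard formats counts coarsely anyway, so what matters is whether the estimate stays within some tolerance of the exact value, not whether it matches digit for digit.

```python
def fmt(n: int) -> str:
    """Dashboard-style formatting, e.g. 5.25M."""
    return f"{n / 1_000_000:.2f}M"

def within_tolerance(exact: int, approx: int, tol: float = 0.02) -> bool:
    """True when the approximate count deviates from the exact one by at most tol (2% here)."""
    return abs(exact - approx) <= tol * exact

# Hypothetical example values: the estimate is off by ~1.3%.
exact, approx = 5_250_000, 5_180_000

print(fmt(exact), fmt(approx))          # 5.25M 5.18M
print(within_tolerance(exact, approx))  # True
```

The point of the sketch is that “acceptable” is a product decision: you pick a tolerance, and an approximate count is useful only while it stays inside it.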
Anyway, that’s the way I’ve been thinking about it… maybe others have different mindsets or exact use cases, though.
Yes, yes… but it seems odd to me that a dashboard would give me different results every time I load the page, or that two users looking at the same dashboard would see different results. Especially for big companies, where a deviation of 2–5% over “millions of items” makes a difference…
You’ve asked quite a few things. I’m a bit short on time to do all of that, but to answer your specific question about how reliably I can reproduce this: it reproduces quite reliably, and the inaccuracy is quite high.
I have a table where I show numbers based on some chosen filters. To showcase your use case, I added two text elements for the same count: one shows the exact count, the other shows the approximate count.
In a few different tabs, that same table shows results like this.
In each example below, the first line is the “approximate” number and the second is the exact count.
Example 1
Example 2
Example 3
Needless to say, it’s not of much use to us at the moment. Filing a bug report is a lot of effort (steps to reproduce, login, etc.), so maybe I’ll do it when I have more time or this becomes critical for us. For now, I’m happy moving back to the exact count instead of the approximate one.
I’m storing data in a dedicated datastore item on every page, also using a refresh state to update all entries.
I tried to use the “caching” feature, but as far as I can see, it changes the data type:
What you are trying to do, I guess, is store events so they don’t have to be loaded again.
That doesn’t work as of now.
What you can do instead is store a list of events in a state: do a search for events once, then refer to that state, so that when a different group/repeating group is loaded it can refer to that precalculated state instead of calculating something new on the spot, which takes longer.
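The expressions above are no-code, but the pattern maps to something like this Python sketch (the `events_db` data and function names are made up for illustration): run the expensive search once, keep the result in a “state”, and let every group read that same cached list.

```python
# Hypothetical in-memory stand-in for the events table.
events_db = [{"id": i, "type": "meeting" if i % 2 else "call"} for i in range(10)]

def search_events(event_type: str) -> list[dict]:
    """Stands in for an expensive database search."""
    return [e for e in events_db if e["type"] == event_type]

# Precalculate once: this plays the role of the custom state holding a list of events.
cached_meetings = search_events("meeting")

# Every group then refers to the cached state instead of searching again.
group_a = cached_meetings
group_b = cached_meetings

print(len(group_a), len(group_b))  # 5 5
```

The design point is simply that the search runs once per page load instead of once per group, so adding more groups doesn’t add more queries.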
If you have a huge database, what you can do is save frequently run searches in another data type that is leaner.
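A rough sketch of that “lean” data type idea (the table, categories, and field names are hypothetical): a scheduled job precomputes the numbers frequent searches need and writes them to a small summary record, so the dashboard never touches the big table directly.

```python
# Hypothetical big table with 1000 rows.
big_table = [{"id": i, "category": "a" if i % 3 else "b"} for i in range(1000)]

# A scheduled job writes just the numbers the frequent searches need
# into a lean summary record.
summary = {
    "count_a": sum(1 for row in big_table if row["category"] == "a"),
    "count_b": sum(1 for row in big_table if row["category"] == "b"),
}

# Readers query the tiny summary record, not the big table.
print(summary)
```

The trade-off is staleness: the summary is only as fresh as the last job run, which is usually acceptable for the dashboard-style counts discussed above.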
Or use Algolia, which uses something similar to be faster.