Performance issues and questions

I’m a bit of a latecomer to the native mobile testing scene, but I’ve recently done a bit of informal benchmarking and wanted to share some results and ask a question.

I’m using the API Connector to retrieve a list of 10,000 items from a public API endpoint. The total payload size is ~3MB, and it’s being served from a caching layer - i.e. it’s fast.

For reference, when the call is made directly from a browser (not going through Bubble’s servers), it typically takes well under a second to receive the entire payload and another fraction of a second for Bubble to “process” the data and start rendering it to an RG. In other words, the total elapsed time between initiating the call and the data appearing on screen is typically around one to one and a half seconds. (It’s also worth noting that the Safari mobile browser on iOS usually outperforms desktop browsers - even on an older iPhone.)
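For anyone who wants to reproduce that browser-side measurement, here’s a rough sketch of the kind of timing I’m doing (the endpoint URL is a placeholder, and it assumes the payload is a JSON array):

```ts
// Rough sketch for timing a direct browser call: network round trip vs. JSON
// parsing (the "processing" step). The URL is a placeholder, and it assumes
// the endpoint returns a JSON array.
async function benchmarkDirectCall(url: string): Promise<void> {
  const t0 = performance.now();
  const response = await fetch(url);   // direct call, no proxy in between
  const text = await response.text();  // full payload received
  const t1 = performance.now();

  const items: unknown[] = JSON.parse(text); // the "processing" step
  const t2 = performance.now();

  console.log(
    `fetch: ${(t1 - t0).toFixed(0)} ms, parse: ${(t2 - t1).toFixed(0)} ms, items: ${items.length}`
  );
}

benchmarkDirectCall("https://example.com/api/items"); // placeholder endpoint
```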

Native Mobile

For comparison, following are the results of my native mobile testing where only the first 10 of the 10,000 items are actually being displayed in the UI element.

| UI Element | Context | Render Lag |
| --- | --- | --- |
| ShortList | Browser Preview | 2-3 sec |
| ShortList | BubbleGo | 5-7 sec |
| SelectableList | Browser Preview | 46-48 sec |
| SelectableList | BubbleGo | crash after ~20 sec |

Needless to say, the SelectableList performance is abysmal. Just curious if this is representative of what we can expect in the deployed app.

I realize 10K is a lot of items, but only 10 are actually being displayed, and 3MB is not a huge payload by today’s standards. :face_with_raised_eyebrow:

-Steve

Interesting stats. Bubble has noted in the manual that the ShortList isn’t best suited for long lists (and 10k greatly exceeds what it’s intended for), so it’s not surprising to see it take that long to load, which is a really long time by mobile standards.

I’d be curious to know how these stack up against the VerticalList view.

Point 1: If you take a second look, you’ll see that (ironically) the ShortList performs acceptably. In fact, the ShortList can display all 10K without issue. It’s the SelectableList that takes exceedingly long and crashes.

Point 2: Only 10 items are actually displayed - not the entire 10K; so UI resources need to be allocated for only 10 items.

You are correct that this doesn’t represent a “typical” use case, but I was doing a bit of “stress testing”, and I would expect native mobile to perform better than a browser on an actual mobile device. It doesn’t. :confused:

So use ShortList for long lists. Got it.

To your first point, I think that’s expected. By design (and for most use cases), Bubble would expect SelectableList to be fed a fairly small list.

How’s the VerticalList stacking up against these other lists?

…unless you need interactivity. As I understand, ShortList is for output only - no tappy swipey stuff.

An app unexpectedly quitting is, by definition, not expected. :smirking_face: And 10 items seems pretty low.

I didn’t know this. You can’t put buttons or forms in it? Why? Is this ever going to change, or is it meant to be permanent?

Just tried and indeed you can. I reckon it’s like a RG in that respect - i.e. a container with repeating content. Maybe it’s just gestures that aren’t supported for that element. I honestly haven’t experimented much with the various elements. I’ve been on a mission to test performance for a particular use case. Anyway, sorry for the misinfo. :neutral_face:

Ah, ok. But good to know that there’s something weird happening in there with… something. If I ever have a problem with an event, now I know to check.

Definitely odd that the mobile browser outperforms the native mobile app.

Wonder if there’s any slowdown while going through Bubble Go?

Yeah, that was exactly my main question, which I guess I didn’t actually phrase as a question (my bad). So I’d be interested to hear from the Bubble team or anyone who’s published an app as to whether the performance seen in BubbleGo is representative of a published app. I’m guessing there’s some performance overhead, but I don’t know for sure. Either way, the app should surely not crash.

Yes, I can confirm. The performance you see in Bubble Go on iOS and Android is what end users will see when you publish the app.

Short List and Selectable List are not meant to be used for more than 25-30 items because they are all loaded at once, even if only 10 are being shown in the UI. Selectable List specifically is an input element, which means it’s best used for creating different select flows, not long lists of data. As other users have mentioned, List Views are meant to show long lists of data because they use a virtualized list under the hood (FlashList), so the performance should be quite good.
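For anyone curious what “virtualized” means in practice, here’s a minimal FlashList sketch in React Native. It’s illustrative only; the item type and component below are placeholders, not Bubble’s internal implementation. Only the rows near the viewport are actually rendered, no matter how long the data array is:

```tsx
// Minimal sketch of a virtualized list using FlashList (the library mentioned
// above). Only rows near the viewport are rendered, so a long data array
// stays cheap. The Item type and component here are illustrative only;
// they are not Bubble's internal implementation.
import React from "react";
import { Text } from "react-native";
import { FlashList } from "@shopify/flash-list";

type Item = { id: string; name: string };

export function ItemList({ items }: { items: Item[] }) {
  return (
    <FlashList
      data={items}                                        // can be thousands of rows
      renderItem={({ item }) => <Text>{item.name}</Text>} // rendered lazily
      keyExtractor={(item) => item.id}
      estimatedItemSize={44}                              // helps size the render window
    />
  );
}
```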

Would be quite curious as well how the Vertical List view stacks up in your performance benchmarking though :slight_smile:

Ok, thanks for the info, @nick.carroll. I will familiarize myself with more of the elements and test again.

Might be good to note that in the docs.

Ok, I clearly need to better understand the differences between the various elements with “list” in the name. :neutral_face:

Haha yeah, it’s not the most straightforward. I believe we do mention this in the docs, but I’ll double check to make sure it’s clearer.

Ah ok, I get it now. (I was just too eager to do some testing and didn’t spend enough time consulting the docs and experimenting.)

The root level in mobile design mode IS a view (roughly analogous to a page in web mode), and it has a View type property. The VerticalList element is automatically added to the view and can’t be deleted. (No wonder I couldn’t find it in the elements list.) Now I understand what @ayfolut was saying.

Follow up…

Ok, so just did a quick test, and the performance of the Vertical List view is 2-3 seconds - basically, on a par with ShortList when fetching all items but displaying just the first few. :+1:

It’s still not as good as a browser, but I suspect that’s due to the fact that I have Attempt to make the call from the browser enabled in the API Connector; and with a mobile app, the call is going through Bubble’s servers, which add some lag (about a second or so).

@nick.carroll, can you confirm whether my theory is correct? And if so, is there any reason an API call couldn’t be made from a mobile app directly to the API endpoint? Not only are direct calls more performant and lighter on Bubble’s servers, but they also consume no WU. Just as for web apps, it would apply only to data requests with no headers and no private params. Might this be an optimization you could make? (Maybe rename that option to Attempt to make the call directly from user device or some such.)
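To make the suggestion concrete, this is roughly what such a direct device-side call would look like - just a sketch of the requested behavior, not something Bubble exposes today, and the URL is a placeholder for a public, unauthenticated endpoint:

```ts
// Sketch of the requested behavior: the device calls the public endpoint
// directly and times it, with no proxy in between. Not a Bubble feature
// today; the URL is a placeholder for a public, unauthenticated endpoint.
async function fetchDirectFromDevice(): Promise<unknown[]> {
  const start = Date.now();
  const response = await fetch("https://example.com/api/items"); // placeholder
  if (!response.ok) {
    throw new Error(`Request failed: ${response.status}`);
  }
  const items: unknown[] = await response.json(); // no auth header, no private params
  console.log(`Direct call took ${Date.now() - start} ms for ${items.length} items`);
  return items;
}
```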

Interesting stat!

I’m also curious about the theory of calls going through the browser first. I hope @nick.carroll can shed some light.

I also tested on my end. I’m using an API call to return results in a Vertical List.

I’m getting roughly 1.5 to 2 seconds of delay before returning results.

I’m using the Algolia REST API to return 100 results.

I do not have “make call directly in browser” checked.

The Google Maps API is also extremely slow. Would love to hear about ways to improve API speed.

Address results take around 3-5 seconds :sob:

That’s about what I would expect, @brad.h. Of course, there are a number of factors that affect the time to render. Item count is one of them, but the following also factor into it:

  • Source of data (cache or real-time look-up)
  • Network activity
  • Overall load on the API provider’s infrastructure
  • Size of each item (and thus total size of the payload)

And even if you did have that option checked, it would likely make no difference on mobile, as I’m almost certain that setting simply isn’t relevant (it’s ignored) there. (I’m hoping, though, that the mobile equivalent of it can be implemented by the Bubble team at some point.)

However, not all requests could benefit from it anyway. If Bubble’s servers are in the loop for any reason, that option’s not available. For example, if the call’s “Use as” field is “Action” instead of “Data”, or an authorization header is used, or a param is marked “private”, then it must go through Bubble’s server.
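Expressed as a quick conceptual check (just a restatement of the conditions above, not Bubble’s actual logic):

```ts
// Conceptual restatement of the conditions above; not Bubble's actual code.
// A call can only run client-side when it's a data request with no
// authorization header and no private params.
type ApiCall = {
  useAs: "Data" | "Action";
  headers: Record<string, string>;
  params: { key: string; value: string; isPrivate: boolean }[];
};

function canRunClientSide(call: ApiCall): boolean {
  const hasAuthHeader = Object.keys(call.headers).some(
    (name) => name.toLowerCase() === "authorization"
  );
  const hasPrivateParam = call.params.some((p) => p.isPrivate);
  return call.useAs === "Data" && !hasAuthHeader && !hasPrivateParam;
}
```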

In my case, I’m pulling directly from a static cache, and the data is for public consumption, so no auth and no private params. Thus, the call is made from the browser, and it’s quite fast. Here’s a typical result from a browser once the cache is “primed”…

It’s actually pretty impressive for 10K items weighing about 3MB, and it’d be great if I could get similar results on mobile. (Of course, the response is compressed, so only 650K is coming “across the wire”.)
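If you want to verify the compression figure yourself, the browser’s standard Resource Timing API reports both the over-the-wire size and the decoded size (the path filter below is a placeholder):

```ts
// Check how much actually came "across the wire" vs. the decoded payload,
// using the browser's Resource Timing API. The path filter is a placeholder.
const entries = performance.getEntriesByType("resource") as PerformanceResourceTiming[];
const apiEntry = entries.find((e) => e.name.includes("/api/items")); // placeholder path

if (apiEntry) {
  console.log(`transferred (compressed): ${Math.round(apiEntry.transferSize / 1024)} KB`);
  console.log(`decoded payload:          ${Math.round(apiEntry.decodedBodySize / 1024)} KB`);
}
```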

In your Algolia case, I’d guess a good chunk of the latency is the result of going through Bubble’s servers, as I’m sure Algolia has some optimized caching in place.

As for Google Maps, you could look into implementing a caching strategy of some sort. Mapbox, for instance, recommends doing so for their API since it can also save on resource and egress fees.
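As a rough sketch of what such a caching strategy could look like (an in-memory cache with a TTL in front of a hypothetical geocode call; the function and endpoint are illustrative, not the actual Google Maps client):

```ts
// Sketch of a simple caching strategy for geocoding lookups: an in-memory
// cache with a TTL in front of a hypothetical geocode() call. The geocode
// function and its endpoint are illustrative, not the actual Maps client.
type GeoResult = { lat: number; lng: number };

const CACHE_TTL_MS = 24 * 60 * 60 * 1000; // cache results for a day
const cache = new Map<string, { value: GeoResult; expires: number }>();

async function geocode(address: string): Promise<GeoResult> {
  // Placeholder endpoint standing in for whatever geocoding API you call.
  const res = await fetch(`https://example.com/geocode?q=${encodeURIComponent(address)}`);
  return res.json();
}

export async function geocodeCached(address: string): Promise<GeoResult> {
  const key = address.trim().toLowerCase();
  const hit = cache.get(key);
  if (hit && hit.expires > Date.now()) {
    return hit.value; // served from cache: no API call, no extra latency
  }
  const value = await geocode(address);
  cache.set(key, { value, expires: Date.now() + CACHE_TTL_MS });
  return value;
}
```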

Anyway, I’ll have more to share in a separate post about something I’ve been working on; but for now, I’m excited for native mobile and the big announcement today. :smiley: