I’m working on a mobile app and have two different approaches I could choose from.
Approach 1: The user’s data is loaded directly from the DB whenever the repeating group makes the call, and any create, delete, or change to a data type happens on the DB at that moment. In this case, what would be the impact of many users performing operations at the same time?
Approach 2: I could load all the user’s data at login and save it to a custom state, then from that point make all changes to the data in the custom state while simultaneously sending the new data or changes to the DB via a backend workflow to be updated. Could this approach lead to a faster UX and avoid slowdowns during peak usage?
What would be the difference, if any, and is it worth using one approach over the other? Would one approach use more capacity than the other? Or am I just wasting my time overcomplicating things?
Approach 2 is an interesting idea - I wonder if others on the forum have experience implementing that approach!
My first reaction is that Approach 2 would indeed help you get a snappier UX. The tradeoff is that data might appear inconsistent for different users (or even for the same user, e.g. if they change pages before the backend workflow saves the data), so you’ll want to be careful juggling the different states and backend workflows to make sure the experience is what you want. Also, note that backend workflows consume capacity too.
(In general, I think you’ll find that Approach 1 is easier to think about and to build, since that’s the “normal” approach we have in mind. Tough to say how much you’d be capacity-limited at certain usage thresholds without having a sense of what kinds of operations you’re doing or, frankly, empirically trying it out.)
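For anyone who thinks better in code, the pattern behind Approach 2 is basically “write to a local cache immediately, persist in the background.” Here’s a minimal TypeScript sketch of the idea - not anything Bubble-specific; `localCache`, `pendingWrites`, and `saveToDB` are just hypothetical stand-ins for the custom state and the backend workflow:

```typescript
// Minimal sketch of the "cache locally, sync in the background" pattern
// behind Approach 2. All names are hypothetical stand-ins: `localCache`
// plays the role of the custom state, `saveToDB` the backend workflow.

type Task = { id: string; title: string; done: boolean };

const localCache = new Map<string, Task>();   // the "custom state"
const pendingWrites: Task[] = [];             // changes not yet persisted

// Hypothetical call that would hit the database (the backend workflow).
async function saveToDB(task: Task): Promise<void> {
  // ...network call to persist the task...
}

// Approach 1: every edit goes straight to the DB, so the UI waits on it.
async function updateTaskDirect(task: Task): Promise<void> {
  await saveToDB(task);          // UI feels as slow as the round trip
}

// Approach 2: update the local copy immediately, persist asynchronously.
function updateTaskCached(task: Task): void {
  localCache.set(task.id, task); // UI reads this instantly
  pendingWrites.push(task);      // DB write happens later
}

// Background flush; anything still queued when the page reloads is lost,
// which is the consistency risk mentioned above.
async function flushPendingWrites(): Promise<void> {
  while (pendingWrites.length > 0) {
    const next = pendingWrites.shift()!;
    await saveToDB(next);
  }
}
```

The flush loop is where the “data might appear inconsistent” risk lives: anything still sitting in the pending queue when the user changes pages or refreshes never reaches the DB.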
Yeah I’d like to know this too…
Do we have any idea how much stuff can be saved in states?
Thank you, Allen, for your response.
Yes, you are right. I had my app set up both ways to see if it would work, and in the second approach I had to make sure the backend first checked whether the item being stored hadn’t already been created, or that the one being stored was the one with the same ID being bounced around in the custom states. This second approach made my front end faster but more than doubled my backend workflows.
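In plain code terms, that duplicate check amounts to an idempotent save keyed on the item’s ID. A rough TypeScript sketch of what I mean - `findByClientId`, `insert`, and `update` are hypothetical helpers, not real Bubble APIs:

```typescript
// The backend looks the item up by the ID that was bounced around in the
// custom state and only creates it if it doesn't exist yet.

type Item = { clientId: string; payload: string };

interface Db {
  findByClientId(clientId: string): Promise<Item | null>;
  insert(item: Item): Promise<void>;
  update(item: Item): Promise<void>;
}

async function saveIdempotently(db: Db, item: Item): Promise<void> {
  const existing = await db.findByClientId(item.clientId);
  if (existing === null) {
    await db.insert(item);   // first time we see this ID: create it
  } else {
    await db.update(item);   // already created: just apply the change
  }
}
```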
So if backend workflows also take capacity, this approach would only be worth considering in certain cases. I thought about loading everything at the beginning, working with it, and saving it all to the DB at the end, but if the user loses connection and the page refreshes, all the data would be lost.
Once again Thanks!
I think using the autobinding feature makes the front end faster at saving data, as it doesn’t require all data to be saved at once; it’s a save-as-you-go type of experience. If you create the thing first, use that thing as the data source for the page or a container group, and enable autobinding on the parent group’s thing, you should get a faster experience when users save data.
Also, it would eliminate the risk of data loss that comes with using custom states and backend workflows. Plus, when you create the thing you can add a status such as “draft”, so if the user closes the browser the data they entered is not lost and they can return to it later.
Another benefit is that it eliminates a lot of backend workflows - at least from the perspective of building the workflows - though autobinding still uses capacity to save the data. But my assumption is that with lots of users at the same time, the writes would be naturally staggered, as not all users will be entering info at the same speed.
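Stripped of the Bubble specifics, the save-as-you-go idea with a draft status is roughly: create the record up front as “draft”, write each field as the user edits it, and flip the status when they finish. A hypothetical TypeScript sketch - `createRecord` and `saveField` stand in for creating a thing and autobinding a field to it:

```typescript
// Create the record up front in "draft" status, persist each field as it
// is edited, and mark it complete at the end. If the user closes the
// browser mid-way, the draft and its saved fields are still in the DB.

type Status = "draft" | "complete";
type DraftRecord = { id: string; status: Status; fields: Map<string, string> };

let nextId = 0;

async function createRecord(): Promise<DraftRecord> {
  // ...create the thing in the DB with status "draft" and get its id back...
  return { id: String(++nextId), status: "draft", fields: new Map() };
}

// Called on every edit, like autobinding: one small write per field change
// instead of one big save at the end.
async function saveField(record: DraftRecord, field: string, value: string): Promise<void> {
  record.fields.set(field, value);
  // ...persist just this field to the DB...
}

async function markComplete(record: DraftRecord): Promise<void> {
  record.status = "complete";
  // ...persist the status change; until then the user can resume the draft...
}
```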