My app is B2C and I do all sorts of analysis in my admin pages. I built them entirely inside Bubble and it's going well, but there are two downsides:
It's expensive - some things, such as cohort analysis, use a lot of WU
It's slow - if I need to download a lot of data, it takes time
I'm trying to optimize these processes by downloading all of the data at once and then just filtering it on the front end, without asking the server for more data
So I wonder - how do other people in the community approach this? Do you download your data outside of Bubble and analyze it there? Any other ideas?
From a general standpoint, the approach will depend on a few things.
How much data do you have?
How is the data structured?
What are you looking to do with it?
Plus: which tools, like Amplitude, could you use to gather data on your behalf?
My approach has been to do it all in a separate admin dashboard SPA and start mining the data there. If your data is well normalised you can write efficient queries to get the information you need. Alternatively, I’ve been using database triggers to denormalise data into new tables which are easier to work with.
Many of the challenges arise from not being able to link tables the way you might with SQL, so taking it outside can make a lot of sense. It really depends on your aims and objectives, as well as the answers to the rhetorical questions above.
Sometimes I just use the editor database… I’ve been doing this for some clients to save on dev costs: when they need to see a specific piece of data, an email is sent out with a direct link to the correct data type and the unique ID pre-populated in the search. You can do the same thing for more than one entry if needed.
Other times, I just build out an admin dashboard with all the bells and whistles and let the client know that the more they look at it, the more it costs, so as to advise them to only look when needed (like monthly).
I’ll try to be more specific - I’m trying to understand whether an analysis such as cohort analysis would still make sense when my DB is 10X or 100X bigger than now (the calculation is already a bit slow).
Further details:
I have 2 data tables: Users (a couple of thousand) & Session (tens of thousands)
The data is structured: Each session is assigned to a specific user and has a date
I want to create a cohort analysis where all users are grouped by their week of creation, and then, for each of the following weeks, see what % of that cohort was active during the week
To do this analysis I’m searching, per user, for whether they had a session during a specific week. If they had at least one session during that week, I count them as active for that week
At the moment it does work for me, but it’s a bit slow and costly. This is how I built it:
Main RG (rows) for the weeks since 01/01/23 - each row holds the entire list of users that were created during that week
Inner RG (horizontal) for the weeks that come after the first week in the main RG
Inside each and every week cell (as seen in the image) I calculate how many users from this row’s cohort were active and divide it by the entire cohort’s user count
It does work, but it costs a lot of WU and is rather slow as well
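To get a feel for whether this survives a 10X-100X bigger database, the same logic can be sketched outside Bubble in plain Python. The `users`/`sessions` data below is hypothetical sample data standing in for the Users and Session tables, and `week_start`/`retention` are illustrative names, not anything in the app:

```python
from collections import defaultdict
from datetime import date, timedelta

# Hypothetical sample data standing in for the Users and Session tables:
# each user has a creation date, each session has a user id and a date.
users = {
    "u1": date(2023, 1, 2),
    "u2": date(2023, 1, 3),
    "u3": date(2023, 1, 9),
}
sessions = [
    ("u1", date(2023, 1, 4)),
    ("u1", date(2023, 1, 11)),
    ("u2", date(2023, 1, 5)),
    ("u3", date(2023, 1, 10)),
]

def week_start(d):
    """Round a date down to the Monday of its week."""
    return d - timedelta(days=d.weekday())

# Group users into cohorts by their week of creation.
cohorts = defaultdict(set)
for uid, created in users.items():
    cohorts[week_start(created)].add(uid)

# For each user, the set of weeks in which they had >= 1 session
# (i.e. the weeks in which they count as "active").
active_weeks = defaultdict(set)
for uid, d in sessions:
    active_weeks[uid].add(week_start(d))

def retention(cohort_week, weeks_after):
    """% of the given cohort active N weeks after the cohort week."""
    cohort = cohorts[cohort_week]
    target = cohort_week + timedelta(weeks=weeks_after)
    active = sum(1 for uid in cohort if target in active_weeks[uid])
    return active / len(cohort)
```

The point of the sketch: once all sessions are loaded, each cohort/week cell is a set lookup rather than a fresh search, so the cost is one pass over the sessions plus cheap per-cell work.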
I have an idea that might solve it better, but it raises a question:
I’m thinking about just downloading ALL users and ALL sessions upon page load and then filtering them on the front end using Advanced filters - but would this option be feasible once I have a 10X or 100X bigger database?
Here’s how I would do this. Add a “last logged in” field to User.
When building your report, just load those values, either using a Data API call or by keeping the last login in a separate data type.
To save on WU when updating the last login - since you are only tracking the week they logged in - your front-end WF can be something like:
When user is logged in, and the last-login value is less than the current date rounded down to the week > update the value.
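That condition can be sketched in Python - the function name and the None-for-never-logged-in convention are assumptions for illustration, but the logic matches the workflow above:

```python
from datetime import date, timedelta

def week_start(d):
    """Round a date down to the Monday of its week."""
    return d - timedelta(days=d.weekday())

def maybe_update_last_login(stored_last_login, today):
    """Return the new last-login value, writing at most once per week.

    stored_last_login is None (never logged in) or a date previously
    rounded down to a week start. The write (and its WU cost) only
    happens when the stored week is older than the current week.
    """
    current_week = week_start(today)
    if stored_last_login is None or stored_last_login < current_week:
        return current_week       # this branch is the DB write
    return stored_last_login      # same week: no write, no WU spent

# Two logins in the same week trigger only one "write".
first = maybe_update_last_login(None, date(2023, 3, 15))
second = maybe_update_last_login(first, date(2023, 3, 17))
```

A login later in the same week finds the stored value equal to the current week start, so the condition is false and no workflow action fires.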
Another alternative is to create a weekly report data type. Add a user to a list field on the report if they logged in during that week. Be wary if you know each list can get large in data size; break the weekly reports down into parts to manage sizes.
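The weekly-report idea could be sketched like this - the session data is hypothetical, and `split_report`/`part_size` are illustrative names for the "break into parts" step:

```python
from collections import defaultdict
from datetime import date, timedelta

def week_start(d):
    """Round a date down to the Monday of its week."""
    return d - timedelta(days=d.weekday())

# Hypothetical session log: (user id, session date) pairs.
sessions = [
    ("u1", date(2023, 1, 4)),
    ("u2", date(2023, 1, 5)),
    ("u1", date(2023, 1, 11)),
]

# One "weekly report" per week: the set of users active that week.
reports = defaultdict(set)
for uid, d in sessions:
    reports[week_start(d)].add(uid)

def split_report(user_ids, part_size=1000):
    """Split one report's user list into fixed-size parts so no single
    record's list field grows unbounded."""
    ids = sorted(user_ids)
    return [ids[i:i + part_size] for i in range(0, len(ids), part_size)]
```

Reading one small report record per week is far cheaper than re-searching sessions for every cohort cell.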
I always have a Log data type which holds the content, the person who did it, when, and the type of log (option set).
That makes it dead easy to view a bar graph of logs for a certain action vs date or for a certain user.
Product analytics are best kept outside of Bubble: 1. tracking is cheaper, and 2. those are purpose-designed tools for it.
That said, I think that all Bubble apps that are products MUST have an admin dashboard with KPIs. The client needs to define these KPIs so they can measure their project success, and see that in real-time in their app.
Would you create a log data type for a B2C app as well?
I hope to have 10X or 100X more users in my app, and I’m afraid of the WU price of creating such a log.
Try using Mixpanel to gather analytics. It’s easy to implement, and you can get a lot done on the free version initially.
Ideally I would recommend having some sort of log in Bubble as well, but full analytics should be handled by Google Analytics and Mixpanel/Amplitude.
My recommendation is that admin dashboards should only cover metrics related to your database and user base.