For general speed and flexibility, would it be better to have a database set up like option A:
- 50 more granular custom data types
- fewer fields per custom data type
(some data duplication between data types is likely - for example the same image might be uploaded on multiple records in different data types)
- ~100 entries/records per day, per custom data type over time
Or would it be better set up like option B:
- 5 very broad custom data types
- more fields per custom data type
(to allow subcategorization & relationships/linking with other data types)
- 1,000 entries/records per day, per custom data type over time
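For anyone who thinks in relational terms, the two options above can be sketched roughly like this (a hypothetical SQLite example; the table and column names such as `site_visit` and `category` are invented for illustration, not from my actual app). Option A means many small tables, each queried whole; option B means one big table filtered on a category column, which a traditional backend would keep fast with an index – an optimization a no-code tool may not let you control directly.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Option A: many narrow, granular types -- one table per type,
# so every query already targets a small dataset.
cur.execute("CREATE TABLE site_visit (id INTEGER PRIMARY KEY, visited_at TEXT)")
cur.execute("CREATE TABLE phone_call (id INTEGER PRIMARY KEY, called_at TEXT)")

# Option B: one broad type with a category field for subcategorization;
# queries filter the larger table instead.
cur.execute("""
    CREATE TABLE activity (
        id INTEGER PRIMARY KEY,
        category TEXT,      -- e.g. 'site_visit', 'phone_call'
        occurred_at TEXT
    )
""")
# In a SQL backend, an index on the filter column is what keeps
# option B fast as the table grows.
cur.execute("CREATE INDEX idx_activity_category ON activity (category)")

cur.executemany(
    "INSERT INTO activity (category, occurred_at) VALUES (?, ?)",
    [("site_visit", "2024-01-01"), ("phone_call", "2024-01-01"),
     ("site_visit", "2024-01-02")],
)
visits = cur.execute(
    "SELECT COUNT(*) FROM activity WHERE category = ?", ("site_visit",)
).fetchone()[0]
print(visits)  # 2
```

In other words, my question is really about whether Bubble's backend behaves more like the indexed case or the unindexed one.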
Considering performance over time, which option would you go with? Either way the total volume is the same – roughly 1.8m records per year – but option A splits it into ~36.5k records per year in each of 50 data types, whereas option B concentrates ~365k records per year in each of just 5 data types.
In non-Bubble data systems, I have previously built databases following option B, using relationships and filters to pull out the data I need. That worked because the slowest part of those systems was always the display layer – the database backend itself was lightning-fast.
However, the Bubble tutorials and examples I've come across all seem to go with option A, and my gut says to do the same in Bubble to avoid filtering a large dataset.
What do you think? Thanks for helping me clear this up!