Initially, I plan to run the Supabase DB in parallel (synced daily, not realtime) and use it mainly for BI queries - running analytics, internal dashboards, etc. Once I've done the learning and validation, I'll start using it as the DB for the app.
Any tips on how to migrate the initial data? I have tested with CSVs. Is it better to export Bubble's unique IDs to Supabase and use those as primary keys? I need to retain the original creation and modification timestamps. Am I correct in understanding that if I export the DB while my machine is set to PST, the timestamps will be in PST?
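One way to handle the timestamp issue is to normalize the exported dates to UTC before importing, so Postgres `timestamptz` columns store them unambiguously. Below is a minimal sketch assuming the export was made on a machine set to US/Pacific; the column names (`Creation Date`, `Modified Date`) and the date format string are assumptions - check them against your actual Bubble CSV export before using this.

```python
# Sketch: convert naive Pacific-time timestamps in a Bubble CSV export
# to ISO-8601 UTC strings, ready for a Postgres timestamptz column.
# ASSUMPTIONS: column names and the "%b %d, %Y %I:%M %p" format are
# guesses - verify against your own export.
import csv
from datetime import datetime
from zoneinfo import ZoneInfo

PACIFIC = ZoneInfo("America/Los_Angeles")
UTC = ZoneInfo("UTC")

def to_utc_iso(local_str: str) -> str:
    """Parse a naive local timestamp (e.g. "Jan 05, 2023 03:14 pm")
    and return an ISO-8601 UTC string."""
    naive = datetime.strptime(local_str, "%b %d, %Y %I:%M %p")
    return naive.replace(tzinfo=PACIFIC).astimezone(UTC).isoformat()

def convert(in_path: str, out_path: str) -> None:
    """Rewrite the CSV with timestamp columns converted to UTC."""
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            for col in ("Creation Date", "Modified Date"):
                if row.get(col):
                    row[col] = to_utc_iso(row[col])
            writer.writerow(row)
```

On the Supabase side you could then keep Bubble's unique ID as a plain `text` primary key, which makes re-syncs and cross-referencing back to Bubble straightforward.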
Also, if I understand correctly, it is not possible to export user passwords from Bubble, as they are stored separately? If that's the case, I presume user auth is best retained in Bubble (short of forcing a password reset for all users)?
No. For now, I am just doing weekly dumps for BI purposes. I don't plan to use it as a back-end in the near term. The work involved seems significant, and for now Bubble can handle it well. For BI purposes, though, it has been a time-saver: the Retool + Supabase combo is much faster for creating queries and reports.
Mainly because it's open source and can scale with any code stack. Even before the price change announcement, I had been thinking about scale: if my product does get good traction, how do I ensure that we can scale? Now open source is more important to me than ever.
With Supabase, we can migrate to any code stack - say a React front-end with Node or Python on the back-end. My guess is that if I were to hire a developer tomorrow, they aren't going to complain about having to work with Supabase - it's a Postgres DB everyone is familiar with.
Supabase is fantastic, but it is just the database (unless you can/want to write Edge Functions). If you need codeless back-end workflows, you'll have to look at another service such as Xano or Backendless.