Data Analysis and BI Tools - HELP!

How do I get my data into any tool that lets me do ad hoc analysis? I’m getting desperate here; there doesn’t seem to be any good way to get data into another system so that my team can run their own reports.

The only solution seems to be building the metrics into Bubble itself, but that’s so archaic it reminds me of the 1990s. We should be able to push this to some sort of data warehouse or BI tool with all the data table relationships intact.

So you’ve got two options:

  1. Use the Data API to query the data from your own BI or tooling.

  2. Use an analytics solution like PostHog to capture the events that matter for later analysis, as they happen.

In either case, there are a lot of posts on the forum about how people export data from Bubble for analysis, but Bubble specifically doesn’t let you connect directly to the shared Postgres instances. Unfortunately, you have to be on a Dedicated plan to get read-only access to your own Postgres instance.
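For option 1, each data type in your app gets its own REST endpoint under the Data API. A minimal pull looks something like this (the app name, type name, and key are placeholders you’d swap for your own):

```python
import json
import urllib.parse
import urllib.request

def data_api_url(app, data_type, cursor=0, limit=100):
    """Build a Bubble Data API URL with cursor-based paging params."""
    query = urllib.parse.urlencode({"cursor": cursor, "limit": limit})
    return f"https://{app}.bubbleapps.io/api/1.1/obj/{data_type}?{query}"

def fetch_page(app, data_type, api_key, cursor=0):
    """Fetch one page (max 100 records) from the Data API."""
    req = urllib.request.Request(
        data_api_url(app, data_type, cursor),
        headers={"Authorization": f"Bearer {api_key}"},  # key from Settings > API
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        # the payload nests results under "response", alongside
        # "cursor", "count", and "remaining" for paging
        return json.load(resp)["response"]
```

From there you can point whatever scheduler you like at `fetch_page` and land the results wherever your team runs reports.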

What platform does your team use to run reports?

I’ve wired mine into Supabase. It was a one-off exercise initially, but now it publishes key data into my analytics repositories alongside my other warehouse data. It’s also step 1 of getting off Bubble eventually, but for now it works.

Trying to hook it up to Power BI, but it seems the Data API only allows so many rows at a time per query?

I think this is the way I’m going to have to go, but it seems like quite a bit of work to get it all wired up. Makes me want to migrate away from Bubble now; I didn’t realize they had such lock-in with their data.

Under the hood, Bubble is a pretty standard Postgres database.

The only real limit I can think of is that you can’t write SQL inside the Bubble environment itself.

I set up a standalone analytics tool for BI and dashboarding just with the Bubble Data API key. AI wrote the queries to generate the data visualization that I needed. Took about 15 minutes total.

Feels like BI software is so 2010s? :sweat_smile:

A few options that have worked for me or others in the community:

  1. Bubble’s API — you can pull data out via the Data API and feed it into tools like Google Sheets, Airtable, or a data warehouse using something like n8n or Make (Integromat).
  2. Scheduled API workflows — set up a backend workflow that pushes data to an external endpoint on a schedule.
  3. Direct DB access — if you’re on a higher plan, there’s a SQL database connector option worth looking into.

Once the data is in something like BigQuery or even Google Sheets, most BI tools (Looker Studio, Metabase, Power BI) can connect to it pretty cleanly, with relationships intact if you set up your schema right on the receiving end.
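On the receiving end, even a local SQLite file is enough to prototype that schema before committing to BigQuery. A rough sketch, with made-up columns standing in for whatever your Data API pull returns:

```python
import sqlite3

def load_orders(rows):
    """Load pulled Bubble records into a local table for ad hoc SQL."""
    con = sqlite3.connect(":memory:")  # swap for a file path to persist
    con.execute("CREATE TABLE orders (uid TEXT PRIMARY KEY, total REAL)")
    con.executemany(
        "INSERT INTO orders VALUES (:_id, :total)",
        rows,  # each row is a dict like the Data API's "results" entries
    )
    return con

# fake two-record pull to show the shape
con = load_orders([{"_id": "a1", "total": 10.0}, {"_id": "b2", "total": 5.5}])
print(con.execute("SELECT SUM(total) FROM orders").fetchone()[0])  # 15.5
```

Once the tables and keys look right locally, recreating them in the warehouse is mostly mechanical.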

What BI tool is your team trying to use? That’ll narrow down which extraction method makes the most sense.

Thanks for the make.com-to-Google-Sheets suggestion, I’ll see if that works just for a basic data pull.

The issue I ran into with Supabase is the FK relationships that are one-to-one or many-to-many, since the field extracted via the Data API is just comma-separated values of the unique IDs.

For example, an order has line items, which are then tied to products. Multiple orders could roll up to one invoice, and a payment is tied to an invoice. I’d want to be able to see which products are returned the most by seeing which invoices or payments are returned, and tying it all the way back down to the products.

Is that possible with the data API?
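One way to handle those list fields on the receiving end is to explode the delimited IDs into a separate link table, so normal joins work from orders down to products. A sketch, assuming the field arrives as a comma-separated string as described above (field names here are hypothetical):

```python
def explode_links(rows, id_field, list_field):
    """Turn a comma-separated ID list into (parent_id, child_id) link rows."""
    links = []
    for row in rows:
        for child in (row.get(list_field) or "").split(","):
            child = child.strip()
            if child:  # skip empties from trailing commas or null fields
                links.append((row[id_field], child))
    return links

orders = [{"_id": "o1", "line_items": "li1,li2"}, {"_id": "o2", "line_items": "li3"}]
print(explode_links(orders, "_id", "line_items"))
# [('o1', 'li1'), ('o1', 'li2'), ('o2', 'li3')]
```

Load those pairs into a proper junction table and the invoice → order → line item → product chain becomes ordinary SQL.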

You need to pull your tables from Bubble into your data environment and then do your data engineering, visuals, etc. We pull all of our Bubble data across multiple apps using the API, including looping through pages as necessary. We also run apps with embedded Power BI using the app-owns-data configuration.