[New Plugin] Supabase.js - With Auth, Data and Storage

Worth investing in as well, but I think it'll come down to a combination of how Supabase streams the data and how it's handled. It can also be connected to Bubble, as the plugin really acts as middleware.

Hi @petersas

Just a quick one: does the full-text search functionality support partial inputs?
I can see from the Supabase docs that we can do this using :*, but when I try to pass it into the value it gets URL-encoded and makes no difference to my end result.

I have tried each type (plain, phrase, websearch); websearch generally gives me the best results, but no partial matches. I appreciate it isn't a full fuzzy search, but having partials would be the closest I can get for now and would be a huge help.
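For anyone else hitting this: Postgres prefix matching with :* only works when the query goes through to_tsquery, i.e. the plain fts type; plainto_tsquery and websearch_to_tsquery escape the operator. Here is a minimal sketch of a helper that turns user input into a prefix-matching tsquery string (the helper name is illustrative, not part of the plugin):

```javascript
// Hypothetical helper: append the ":*" prefix operator to every word
// and join the lexemes with "&", producing a to_tsquery-compatible string.
function toPrefixTsquery(input) {
  return input
    .trim()
    .split(/\s+/)
    .map((word) => `${word}:*`)
    .join(' & ');
}

console.log(toPrefixTsquery('supa plug')); // supa:* & plug:*
```

With supabase-js this string would be passed as, e.g., supabase.from('posts').select().textSearch('title', toPrefixTsquery(term)); whether the plugin's value field forwards it unescaped is exactly the open question here.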

I am wondering if anyone has recently encountered the issue of data types resetting within the Bubble API Connector. This seems to be something new. In the example, both the id and created_at fields have reset themselves to invalid data types; they should be number and date respectively. I have had this specific API call in place for at least six months, and I am finding the same for all my other API calls to Supabase when I reinitialize them. This was not the case previously.

I have been focused on the front end for a few weeks, so I only noticed this when trying to add a field to a separate table and then reinitializing the call to pull in the new field.

I am trying to figure out whether this is something that has changed with Bubble or Supabase.

It is a bit disconcerting for this to have randomly changed. Any ideas would be appreciated.

Will definitely take a look @shaun for next release.

I have for sure. I reinitialized an API call because I had made changes to the table, and every value was text again; I also got a bunch of errors in the editor. It has happened multiple times. I guess it's an issue with the API Connector.


Thanks for the confirmation. This is actually the first time it has happened to me. It's pretty annoying because it looks like it has impacted every call.

@petersas Any update on the values returned from functions? Right now they only return a "function value". It would be super helpful if they could format the data like the Data element does.

Some new updates for [2.22.0]

  • Custom headers for Edge functions
  • Edge Functions can now return structured data. When you use an Edge Function to query your table and return data, the plugin can now parse it as Bubble objects. This applies both to the action within the Data element and to the standalone Edge Function element, although the latter only has the new field for the default JSON response.
    Just set the Data type and you're good to go. In the Data element it will inherit from "Data Type".
    How the plugin distinguishes simple types from objects: it expects the data returned from an Edge Function to look like {"data": [array of objects from your database]}, which is pushed to the Output List state; anything else is treated as a simple object and pushed to the Data state.
  • Fixed some errors with Large File Uploader
  • Some optimization for Sentry Addon
  • Some fixes for the Create Anon User action, where it failed to return data.
  • Prepared the Postgres function to also be able to return typed data like Edge Functions; this will be released soon.
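The {"data": [...]} contract described above can be sketched as a tiny response builder; this is an illustration of the expected shape, not the plugin's actual code:

```javascript
// Sketch of the response shape the plugin parses: an object with a "data"
// key holding an array of rows is pushed to the Output List state;
// anything else is treated as a simple object and pushed to the Data state.
function buildPluginResponse(rows) {
  if (Array.isArray(rows)) {
    return JSON.stringify({ data: rows }); // parsed into Bubble objects
  }
  return JSON.stringify(rows); // simple object path
}

console.log(buildPluginResponse([{ id: 1, created_at: '2024-01-01' }]));
```

In an Edge Function you would return this string in the Response body with Content-Type: application/json.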

Since we're calling Edge Functions from the browser in both cases, CORS needs to be handled. For those struggling with that, here's a quick cheat sheet to use within your function (adjust as needed; this is just a template):

const corsHeaders = {
  'Access-Control-Allow-Origin': '*', // Replace '*' with your domain in production
  'Access-Control-Allow-Methods': 'GET, POST, OPTIONS, PUT, DELETE',
  'Access-Control-Allow-Headers': 'Authorization, X-Client-Info, Apikey, Content-Type',
}

Deno.serve(async (req) => {
  // Preflight request: reply with the CORS headers only
  if (req.method === 'OPTIONS') {
    return new Response('ok', { headers: { ...corsHeaders }, status: 200 })
  }

  const origin = req.headers.get('Origin') // available if you want to echo a specific origin instead of '*'
  const data = "myData"
  // rest of your code

  // The CORS headers must be sent with the actual response as well
  return new Response(
    JSON.stringify({ data }),
    { headers: { ...corsHeaders, "Content-Type": "application/json" } },
  )
})

Hi Peter,

I am having an issue with the "Column is in an Array" filter using the Any filter. When I try to pass the query string, it appears to be constructed properly in Bubble.


However, the output in the console payload shows the following. It appears that the array is not parsed properly.


I have not had this issue with the "Column contains every element in a value" or "Column is equal to a value" filters, so it seems specific to this one.
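For context on what the plugin ultimately has to send: PostgREST's "in" operator expects a parenthesized, comma-separated list in the query string (e.g. id=in.(1,2,3)), not a JSON array. A hypothetical serializer illustrating the expected form (the helper name is mine, not the plugin's):

```javascript
// Hypothetical serializer: PostgREST parses `column=in.(a,b,c)`.
// Passing a raw JSON array such as id=in.[1,2] is not understood,
// which matches the "array not parsed properly" symptom above.
function toInFilter(column, values) {
  return `${column}=in.(${values.join(',')})`;
}

console.log(toInFilter('id', [1, 2, 3])); // id=in.(1,2,3)
```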



Hi, @petersas

I love this plugin, but I have encountered an issue in my newly created app that I cannot resolve.

My app has a single SupabaseData element that is watching a table with approximately 1K records. These 1K records are updated every minute by other users, and the element has Realtime enabled.

When I checked the browser memory immediately after displaying this page, it was consuming 834MB of memory :fearful:.

A few minutes later, I checked the memory again, and the consumption had increased to 2.5GB. :cold_face:

Then, a few seconds later, my browser froze :skull:.

Additionally, this issue occurs not only on the app screen but also on the editor screen…

My Mac's CPU is an Apple M1 Max and it has 64GB of RAM, but this app will be used on PCs with lower specs. Is there a way to avoid this excessive memory consumption? I would appreciate any ideas.

Thank you.


Hi @petersas, is the event "New Thing has been created" triggered when using UPSERT, or is it only for INSERT?

Also, Realtime v2 seems not to work in my case when I have two SupabaseData elements with Realtime v2 activated on both.

Hi @petersas,

As and when you get a chance, could you include the ability to order by foreign table columns, please? This would be hugely helpful for us :slight_smile:

Just pushed an update to fix this.

This is a known issue when lots of data are being exchanged, e.g. thousands of records being updated frequently. Realtime V2 aims to solve this, but it's not 100% production-ready yet; you can experiment with it to see if it helps, though.

Only for INSERT, but I can add separate events for those actions too.
Thanks, I'll take a look at Realtime v2 to find the cause.

I think you should be able to do this already. For example, when you query the columns *,project!inner(*), you can add project(name) to the Order By field and it should order based on that. Or are you thinking of something else?


Perfect, I wasn’t aware of the syntax difference between filtering project.name and sorting project(name). Works perfectly, thanks :slight_smile:


Thanks for pushing this update. The "Column is in an Array" filter with the Any filter works perfectly now.

@petersas, Thank you for the reply!

I tried using Realtime V2, but unfortunately, the issue was not resolved. I will share some other issues occurring within my application.

In my application, about 1,000 records in the Supabase table are subject to Realtime. When the SupabaseData element is placed on the page, it generates many "doapicallfromserver" requests during page actions, making the page response terrible.

I hope these will be improved when Realtime V2 is officially released.


Do you guys have a strategy to migrate an existing Bubble app to Supabase?

@lindsay_knowcode is pretty far I think.

I have experience with migrating in steps. At this stage, the user table is the only one we still use in Bubble. Most data in the app is retrieved from Supabase with a call to a current_user_view; this view gathers all the fields that are used often or are handy to have at hand. We added a group_datasource, and inside that group another group, current_user_data. On page load we check if this group is empty; if so, we call the API and populate it. Next we populate custom states with the same names as the fields in current_user_data. This last step makes data retrieval more robust: we can reference those custom states without worrying about future data-structure changes.

Updating data is done via a custom workflow with parameters such as "api_refresh" of type boolean. Anywhere in the app, when needed, we call that custom workflow with api_refresh = yes to make sure the API cache is bypassed and a new API call is made to fetch the latest data.
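The api_refresh idea generalizes to a cached loader with a force-refresh flag. A minimal sketch in JavaScript (names are illustrative; the real implementation lives in Bubble workflows, not code):

```javascript
// Sketch of the "api_refresh" pattern: serve the cached result unless
// the caller explicitly forces a fresh fetch. fetchFn stands in for the
// actual API call to Supabase.
function makeCachedLoader(fetchFn) {
  let cache = null;
  return function load({ apiRefresh = false } = {}) {
    if (cache === null || apiRefresh) {
      cache = fetchFn(); // bypass the cache and refetch
    }
    return cache;
  };
}

let calls = 0;
const loadUser = makeCachedLoader(() => ({ user: 'demo', fetchCount: ++calls }));
loadUser();                     // hits the "API"
loadUser();                     // served from cache
loadUser({ apiRefresh: true }); // forces a new call
```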



How did you get the data in Supabase in the first place?

I was thinking of first setting up a mirror, meaning pretty much recreating the Bubble db in Supabase, and using Bubble's db triggers to fire any time there's a change to a Bubble row… but then I realized the amount of WUs that this would entail. :scream:

@rico.trevisan I've some beta functionality in Plan B that one-click creates the tables and copies the data from Bubble into Supabase. It uses the Data API to get the data out, but everything else runs outside of Bubble (AWS Lambda), so it is not WU-impacting.

Once the data is all in Supabase, you can do whatever. :slight_smile:

(You could alternatively use Plan B to overcome the 50k Data API constraint and get all the data into CSV files, but then you would need to create your own Supabase import process.)

DM’d you also


Maybe it is cheaper to do an API call to Bubble and fetch all rows? Not sure how much WU that would cost.

Perhaps @lindsay_knowcode can explain the difference between his Plan B Bubble backup solution (and from there to Supabase) and using Supabase to fetch the rows from Bubble.
