A super-powerful AI model connected to Bubble?

Hi everyone,

Do you think it’s possible to create a plugin that would use an AI that is able to base its answers on the data of the connected user?

A chatbot able to dig into the data linked to the user via “Created by” and propose answers and advice tailored to that user?

Curious to hear your thoughts on the subject!

1 Like

Yes. You will have to expose that data via Bubble’s Data API, and hook the LLM up to those endpoints as tools, essentially managing the control flow somewhere else.
Or you can try momen.app/ai, basically Bubble + AI with native RAG / tool invocation.
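To make the “expose data via the Data API” part concrete, here is a minimal sketch of the request your backend would build when the LLM invokes a tool. The app URL, data type, and user ID are placeholders; Bubble’s Data API filters results through a JSON `constraints` query parameter and expects a bearer token:

```python
# Hedged sketch: building a Bubble Data API URL that returns only the
# records "Created By" one user. App name and data type are hypothetical.
import json
import urllib.parse

BUBBLE_BASE = "https://yourapp.bubbleapps.io/api/1.1/obj"  # placeholder app URL

def build_user_data_request(data_type: str, user_id: str, limit: int = 50) -> str:
    """Build a Data API URL constrained to things created by one user."""
    constraints = [
        {"key": "Created By", "constraint_type": "equals", "value": user_id}
    ]
    query = urllib.parse.urlencode({
        "constraints": json.dumps(constraints),
        "limit": limit,
    })
    return f"{BUBBLE_BASE}/{data_type}?{query}"

# The LLM never calls this URL itself: your backend executes it (with an
# Authorization: Bearer <api_token> header) when the model requests the tool,
# then feeds the JSON response back into the conversation.
print(build_user_data_request("invoice", "1680000000000x123"))
```

The point is that the filtering by “Created by” happens server-side in the constraint, so the model only ever sees the current user’s data.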

2 Likes

An LLM is not magic.
Injecting a whole data model schema plus its data into the AI engine is unlikely to produce accurate results: first, you will likely blow past the engine’s context window, and second, your database and field names would have to be very descriptive for the AI engine to understand what they represent.

As always, instead of expecting magic, I would start from the use cases: what is the most basic data the customer will ask for? That gives you data “primitives” the user can run single or compound queries on.
Then define function/tool calls to feed that data into the AI engine upon customer requests.
The AI engine will figure out what to call, and when, depending on the user’s query.
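The “primitives as tool calls” idea above can be sketched like this. The function name, schema, and stubbed data are illustrative only; in a real app the function would query your database, and the schema would be passed to an OpenAI-style chat completion so the model can decide when to invoke it:

```python
# Hedged sketch: one data "primitive" exposed to an LLM as a tool.
import json

def get_invoice_total(supplier: str) -> dict:
    # Stub: a real implementation would query your Bubble database.
    fake_db = {"Acme": 1250.0, "Globex": 830.5}
    return {"supplier": supplier, "total": fake_db.get(supplier, 0.0)}

TOOLS = {"get_invoice_total": get_invoice_total}

# OpenAI-style tool schema the model sees; it picks the tool and arguments.
TOOL_SCHEMAS = [{
    "type": "function",
    "function": {
        "name": "get_invoice_total",
        "description": "Total amount invoiced by one supplier",
        "parameters": {
            "type": "object",
            "properties": {"supplier": {"type": "string"}},
            "required": ["supplier"],
        },
    },
}]

def dispatch(tool_call: dict) -> str:
    """Run the function the model asked for; return JSON for the next turn."""
    fn = TOOLS[tool_call["name"]]
    args = json.loads(tool_call["arguments"])
    return json.dumps(fn(**args))

# When the model answers with a tool call, you execute it and send the
# result back to the model as a "tool" message:
print(dispatch({"name": "get_invoice_total",
                "arguments": '{"supplier": "Acme"}'}))
```

That dispatch loop is the “control flow managed somewhere else” mentioned earlier in the thread.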

See Connect OpenAI to Bubble Database? - #2 by redvivi .

2 Likes

I’ve been advised to use Pinecone to push information into a vector database. What do you think?

1 Like

Very interesting. So for each request I should define a perimeter of data on which the AI will base its response?

That’s what I would do.

For instance, say you are a car dealer and want a chatbot for customers. I would write tools/functions matching your DB tables:

  1. Get all the car makers’ names.
  2. Get all the car models, optionally filtered by maker name.
  3. Get the available car options, optionally filtered by make and model.

With these 3 tools/functions, the LLM will be able to answer any query about which car brands you sell, which models exist, and which options are available.

Of course it has to be refined based on your use case.

1 Like

Pinecone would work fine.
Or you can just use Momen, as it supports vector similarity search out of the box.
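For intuition, this is what a vector similarity search does under the hood, in miniature. In production you would embed text with an embedding model and store the vectors in Pinecone; here the “embeddings” are tiny hand-made vectors just to show the retrieval step:

```python
# Toy vector search: cosine similarity over a hand-made, in-memory "index".
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

INDEX = {  # doc id -> fake 3-dimensional embedding
    "invoice-tips": [0.9, 0.1, 0.0],
    "car-options":  [0.1, 0.8, 0.3],
}

def top_match(query_vec: list[float]) -> str:
    """Return the document whose embedding is closest to the query."""
    return max(INDEX, key=lambda doc: cosine(query_vec, INDEX[doc]))

print(top_match([0.85, 0.2, 0.0]))  # 'invoice-tips'
```

In a RAG setup, the matched documents are then pasted into the LLM’s context so it can answer from them, which is exactly what Pinecone (or Momen’s built-in search) automates at scale.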

1 Like

Thank you for asking this question. I’ve been wondering that myself.

1 Like

Ok thanks, that’s good to know !

Ok, I understand a little better.

I’m using your Microsoft Azure OCR plugin to let my customers analyze their invoices, extract the data, and get access to graphs, comparisons, etc.

So from what you’re telling me, if I set up the right filters (invoice supplier, product price, etc.) and list my “data primitives” correctly, I can build an effective LLM assistant.

Are we talking about an LLM that will be able to give advice to the user, or just let them access information in a different way (like “get me data X”)?

Have you developed any plugins that could facilitate this kind of functionality?

1 Like

As long as the LLM is creative enough, it should be able to be conversational, e.g. giving advice based on both its training corpus and the retrieved data.

The following plugins support streaming with function/tool calling; you may choose any of them depending on the LLM. The demo also shows how to send data back from Bubble to the LLM.

This one is a bit different, as it lets you leverage documents stored on Azure, so the LLM answers based on a document database:

1 Like