Yes. You will have to expose that data via Bubble’s Data API and hook the LLM up to those endpoints as tools, essentially managing the control flow somewhere else.
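As a rough illustration (not plugin code), here is a minimal Python sketch of one such tool: a plain function that queries Bubble’s Data API, which you would then register with the LLM as a callable tool. The app name, data type, and constraint keys are placeholders, and the Data API must be enabled in your app’s API settings:

```python
import json
import os

import requests

# Placeholder app name and data type; substitute your own Bubble app values.
BUBBLE_APP = "yourapp"
DATA_TYPE = "product"
API_TOKEN = os.environ["BUBBLE_API_TOKEN"]  # a Bubble API token with Data API access


def search_products(constraints: list[dict]) -> dict:
    """Query the Bubble Data API; the LLM invokes this as a tool.

    `constraints` follows Bubble's search format, e.g.
    [{"key": "name", "constraint_type": "equals", "value": "Widget"}]
    """
    url = f"https://{BUBBLE_APP}.bubbleapps.io/api/1.1/obj/{DATA_TYPE}"
    resp = requests.get(
        url,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        params={"constraints": json.dumps(constraints)},
    )
    resp.raise_for_status()
    return resp.json()["response"]  # contains "results", "count", "remaining"
```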
Or you can try momen.app/ai, basically Bubble + AI with native RAG / tool invocation.
LLMs are not magic.
Injecting your whole data model schema plus data into the AI engine is unlikely to produce accurate results: first, you will likely blow way past the engine’s context window; second, your database tables and field names would have to be very descriptive for the AI engine to understand what they are.
As always, instead of expecting magic, I would define the use cases: what are the most basic pieces of data the customer will ask for? You then have “primitive” data the user may run single or compound queries on.
Then define function/tool calls that feed that data to the AI engine upon customer request.
The AI engine will figure out what to call and when depending on the user query.
For instance, say you are a car dealer and want a chatbot for customers. I would write tools/functions matching your DB tables:
Get all the car makers’ names.
Get all the car models, optionally filtered by maker name.
Get the available car options, optionally filtered by make and model.
With these three tools/functions, the LLM will be able to answer any query about which car brands you sell, which models you carry, and which options are available.
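Wired up with the OpenAI Python SDK, the three declarations might look like this. This is only a sketch: the model name, function names, and parameters are illustrative and should match your actual tables:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Illustrative tool declarations mirroring the three functions above;
# names and parameters should match your real DB tables.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_car_makers",
            "description": "Return the names of all car makers the dealership sells.",
            "parameters": {"type": "object", "properties": {}},
        },
    },
    {
        "type": "function",
        "function": {
            "name": "get_car_models",
            "description": "Return all car models, optionally filtered by maker name.",
            "parameters": {
                "type": "object",
                "properties": {
                    "maker": {"type": "string", "description": "Maker name to filter by."}
                },
            },
        },
    },
    {
        "type": "function",
        "function": {
            "name": "get_car_options",
            "description": "Return available options, optionally filtered by make and model.",
            "parameters": {
                "type": "object",
                "properties": {
                    "make": {"type": "string"},
                    "model": {"type": "string"},
                },
            },
        },
    },
]

response = client.chat.completions.create(
    model="gpt-4o-mini",  # any tool-calling-capable model works
    messages=[{"role": "user", "content": "Which BMW models do you have in stock?"}],
    tools=tools,
)
# The model decides which tool(s) to call:
print(response.choices[0].message.tool_calls)
```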
Of course it has to be refined based on your use case.
I’m using your Microsoft Azure OCR plugin to let my customers analyze their invoices, extract data, and get access to graphs, comparisons, etc.
So from what you’re telling me, if I set up the right filters (invoice supplier, product price, etc.) and list my “data primitives” correctly, I can build an effective LLM-based assistant.
Are we talking about an LLM that will be able to give advice to the user, or just let them access information in a different way (like “get me data X”)?
Have you developed any plugins that could facilitate this kind of functionality?
As long as the LLM is creative enough, it should be able to be conversational, e.g. giving advice based on both its training corpus and the retrieved data.
The following plugins support streaming with function/tool calling; you may choose any of them depending on the LLM. The demo also shows how to send data back from Bubble to the LLM.
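Outside of Bubble, the equivalent round trip looks roughly like this in Python (reusing the `tools` list from the earlier sketch, leaving streaming aside for clarity; the tool result is hard-coded here, whereas in practice it would be the data Bubble sends back):

```python
import json

from openai import OpenAI

client = OpenAI()

messages = [{"role": "user", "content": "What options are available for the X5?"}]
first = client.chat.completions.create(
    model="gpt-4o-mini", messages=messages, tools=tools
)
msg = first.choices[0].message

if msg.tool_calls:
    messages.append(msg)  # keep the assistant turn that requested the tools
    for call in msg.tool_calls:
        args = json.loads(call.function.arguments)
        # In the real workflow this is where Bubble returns the data;
        # hard-coded here so the sketch stands alone.
        result = {"options": ["M Sport package", "panoramic roof"]}
        messages.append(
            {
                "role": "tool",
                "tool_call_id": call.id,
                "content": json.dumps(result),
            }
        )
    # Second request: the LLM answers using the data sent back from Bubble.
    final = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    print(final.choices[0].message.content)
```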