If you’re looking for an all-in-one package to build LLM apps on Bubble, check out ChatGPT/LLM Toolkit. We’ve recently added tons of new features and improvements, and have just added support for 100+ models, so you can use models from OpenAI, Anthropic, Mistral, Meta, Perplexity, etc.
The plugin also includes streaming, features for uploading/parsing files, searching the web, embedding text, vector search, and more!
Just added support for JSON mode. If you need data back as structured JSON, just set this field to “yes”, and make sure to mention JSON in either your system message or user message (the API requires this).
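For reference, in the raw OpenAI API this corresponds to the `response_format` parameter on the Chat Completion request; a sketch of what the underlying body looks like (the model name and message contents here are just examples):

```json
{
  "model": "gpt-4-turbo-preview",
  "response_format": { "type": "json_object" },
  "messages": [
    {
      "role": "system",
      "content": "You are a helpful assistant. Respond in JSON with keys \"answer\" and \"confidence\"."
    },
    { "role": "user", "content": "What is the capital of France?" }
  ]
}
```

Note that the system message explicitly mentions JSON; OpenAI returns an error if `json_object` is requested without the word “JSON” appearing somewhere in the messages.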
As a Bubble beginner, I’m struggling to get the Assistants API with an attached file working on Bubble. I’ve watched a couple of your YouTube videos, and it might be better to buy this plugin so I save space, etc.
For context, I want to build a Calories In, Calories Out calculator (as a starting point). For example, I say I’ve had 500g of beef today, and the AI refers to the attached file (a Google Sheet with beef nutritional info) to tell me how many calories are in 500g of beef.
This should definitely be doable with the Assistants API. Are you able to tell whether the file is being read at all? What specifically is giving you issues?
For more detailed help, feel free to send me a DM, or come over to our Discord server; we do lots of discussion over there!
Just added support for a Custom Body. This gives you full control over the body of the Chat Completion request using standard JSON. It allows more flexibility for anything missing from the plugin, and works in conjunction with Custom Endpoints so you can call arbitrary services (I’ll continue to improve this as well).
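As an illustration, a custom body could pass standard Chat Completion parameters that a plugin UI might not expose directly, such as `seed` or `logit_bias` (the specific values below are made up for the example):

```json
{
  "model": "gpt-4-turbo-preview",
  "seed": 42,
  "logit_bias": { "1734": -100 },
  "messages": [
    { "role": "user", "content": "Summarize the latest entry." }
  ]
}
```

Since it’s just standard JSON, anything the target API accepts in its request body can go here.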
Looking forward to your server mini-course video(s) as well; found “(coming soon)” on GitHub. Keep rolling! Your in-depth tutorials have my full attention.
Just published a super fast walkthrough of spinning up your own ChatGPT alternative in 2 minutes.
ChatGPT and the OpenAI API have been down today, which can be a massive pain if you’re using them for work. Thought it might be helpful to see how quickly you can create your own fallback system.
If you use the “Chat Completion (non-streaming) v2” action in a backend workflow (like the screenshot below), you can write the response to the database.
The Bubble editor is admittedly not great for long JSON strings. What I typically do is copy/paste the skeleton into an external text editor, then go back into the Bubble editor and add the dynamic fields. It’s not ideal, but it works.
Another thing I find really helpful is using placeholder strings in your prompt and then running “:find and replace” on them. For example, you could use a JSON structure containing a --question-- placeholder and run “:find and replace” on the --question-- string. This makes the prompt easier to edit externally.
Just make sure, if you’re doing this and using line breaks for readability in your editor, that you also run “:find and replace” on newline characters (click into the find box, press Enter, and there will be an empty line at the top). I don’t think “:format as JSON-safe” will catch/fix this.
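Bubble operators aside, the transformation itself is simple; here’s a Python sketch of the equivalent logic (the template and the --question-- placeholder are illustrative, not from the plugin):

```python
import json

# A prompt template edited externally, with line breaks for readability
# and a --question-- placeholder (illustrative names).
template = '''{
  "instruction": "Answer the user question.",
  "question": "--question--"
}'''

def fill_template(template: str, question: str) -> str:
    # Equivalent of running ":find and replace" on the placeholder...
    filled = template.replace("--question--", question)
    # ...and then on newline characters, so the result is a single
    # line that is safe to embed in a larger request body.
    return filled.replace("\n", "")

body = fill_template(template, "How many calories are in 500g of beef?")
# The filled, single-line result should still parse as valid JSON.
data = json.loads(body)
```

This assumes the replacement text is itself JSON-safe (no unescaped quotes or backslashes); that escaping is what “:format as JSON-safe” handles in Bubble.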
If you want to save to the database from a streaming action, I’d recommend this:
Create a Conversation datatype in your database. (or call it Chat, Convo, whatever)
Have a field on the datatype that is “list of texts”; call it Messages or something.
Set up an event to trigger on “Message Generation Complete”, and save the “Data Container’s Message History” to the Conversation’s Messages field.
This is covered in more detail in the “Conversations” tutorial here: