[New] Template for the open-source Mistral AI API

This is the first Bubble template for the Mistral AI API. It connects your Bubble app to Mistral AI, an open-source-focused AI company based in Europe.

The Mistral AI template introduces a new way to build personal, Mistral-AI-driven assistants. It gives you everything you need to create powerful assistants.

- Create an Assistant or choose one from the assistants dropdown.
- Create a new thread or select an existing thread.
- Create new messages in the active thread.

You can create unlimited assistants, add unlimited threads to each assistant, and store unlimited messages in every thread. Messages consist of your prompts and the answers from Mistral AI.

To work with the template you need a Mistral AI developer account: https://docs.mistral.ai/
Create an account and click on API Keys on the left side. Create an API key and copy it. Now go to your Bubble site and install the API Connector plugin.
Expand the Mistral AI API call. At the top you’ll find a shared header. Paste your API key into the value field of the Authorization key.
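For readers who want to see what the API Connector is doing under the hood, here is a minimal sketch of the same HTTP call in Python. The endpoint and the `Bearer` header format follow Mistral's public chat-completions API; the model name `mistral-large-latest` and the placeholder key are illustrative.

```python
import json
import urllib.request

MISTRAL_API_KEY = "YOUR_API_KEY"  # paste the key you copied from the Mistral console


def build_chat_request(prompt: str, model: str = "mistral-large-latest") -> urllib.request.Request:
    """Build the same HTTP call the Bubble API Connector makes (not sent here)."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        "https://api.mistral.ai/v1/chat/completions",
        data=body,
        headers={
            # This is the shared header you fill in inside the API Connector.
            "Authorization": f"Bearer {MISTRAL_API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


req = build_chat_request("Hello!")
```

The `Authorization: Bearer <key>` header is the only secret involved, which is why the template keeps it in a single shared header.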


@mdwp.post looks good!

I’m curious: does that use their Chat API? Or do they have an Assistants structure like OpenAI’s that I’m not aware of? I can’t see anything related to Assistants in their docs.

Mistral AI doesn’t have an Assistants API. I mimic the OpenAI Assistants API with Bubble workflows and a DB structure of Assistants, Threads and Messages.
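The mimicked structure can be sketched as three nested types. The class and field names below are illustrative, not the actual Bubble data-type names in the template:

```python
from dataclasses import dataclass, field

# One assistant -> many threads -> many messages,
# mirroring the OpenAI Assistants API hierarchy.

@dataclass
class Message:
    role: str       # "user" (your prompt) or "assistant" (the model's answer)
    content: str

@dataclass
class Thread:
    title: str
    messages: list = field(default_factory=list)

@dataclass
class Assistant:
    name: str
    system_prompt: str
    threads: list = field(default_factory=list)


helper = Assistant(name="Helper", system_prompt="You are a helpful assistant.")
chat = Thread(title="First chat")
chat.messages.append(Message(role="user", content="Hi"))
helper.threads.append(chat)
```

In Bubble these would be three data types linked by fields, and the workflows create and look up the records the same way this sketch builds the objects.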


I’ve changed the model to the brand new large model.

NEW BIG CHANGES: This template now connects your Bubble app to the Groq LPU Inference Engine. It works with the LLMs from Google, Meta and Mistral.

This is what Groq says about their LIGHTNING FAST engine:
"Welcome to Groq®! We created the LPU™ Inference Engine - the first and fastest of its kind - serving the real-time AI market. Our solution for inference (not training) makes us the AI performance leader, in regards to speed and precision, in the compute center.

Unlike other providers, we aren’t brokering a cloud service. We built our own chip, compiler and software, systems, and GroqCloud™. Our first-gen GroqChip™, a Language Processing Unit™ (LPU), is a new processor category. That’s one part of our secret sauce.

Our performance enables a greater potential for AI across multiple industries. It’s about real-time AI, low-latency, low batch size solutions. Try GroqChat yourself!"

To work with the template you need a Groq developer account: https://groq.com
Create an account and click on API Keys on the left side. Create an API key and copy it. Now go to your Bubble site and install the API Connector plugin. Expand the API call. At the top you’ll find a shared header. Paste your API key into the value field of the Authorization key.
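The Groq setup works the same way as before because Groq exposes an OpenAI-compatible endpoint: only the base URL, the key and the model name change. The model IDs listed below are examples of models Groq has served (Meta, Mistral and Google families), not a guaranteed current list:

```python
import json
import urllib.request

GROQ_API_KEY = "YOUR_API_KEY"

# Example model IDs from the Meta, Mistral and Google families on Groq.
GROQ_MODELS = ["llama3-8b-8192", "mixtral-8x7b-32768", "gemma-7b-it"]


def build_groq_request(prompt: str, model: str) -> urllib.request.Request:
    """Same shape as the Mistral call; only URL, key and model differ."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        "https://api.groq.com/openai/v1/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {GROQ_API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


req = build_groq_request("Hello!", GROQ_MODELS[0])
```

Because the request shape is identical, switching the template between providers is mostly a matter of swapping the shared header value and the model dropdown options.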

I’ve created a better UI/UX.

Meta Llama 3 is out and available on the rapid Groq inference engine.
I’ve added Llama 3 as an option to this template.

Most, if not all, LLMs return text formatted as Markdown. Therefore this template now uses a wonderful free plugin that converts Markdown to BBCode.
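To illustrate the kind of conversion involved, here is a tiny sketch that handles a few common Markdown constructs; the actual plugin covers far more, and its internals are not shown here:

```python
import re

def markdown_to_bbcode(text: str) -> str:
    """Convert a small subset of Markdown to BBCode (illustrative only)."""
    text = re.sub(r"\*\*(.+?)\*\*", r"[b]\1[/b]", text)               # bold
    text = re.sub(r"\*(.+?)\*", r"[i]\1[/i]", text)                   # italic
    text = re.sub(r"`(.+?)`", r"[code]\1[/code]", text)               # inline code
    text = re.sub(r"\[(.+?)\]\((.+?)\)", r"[url=\2]\1[/url]", text)   # links
    return text
```

This matters in Bubble because text elements render BBCode natively, so converted model output displays with real bold, italics and links instead of raw asterisks.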

I’ve done a lot of responsive-design fixes. Now you or your users have an AI assistant to go.

NEW: Now you can also use the Claude 3.5 Sonnet model from Anthropic for your assistant. If you want to use Claude, you can get an API key here:

Now with the brand-new llama-3.1-8b-instant model from Meta.
Insane performance!
Look at this video: Groq AI-Assistant - 25 July 2024 | Loom

NEW: You can create longer prompts with the Speech to Text element. It uses the new real-time speech model from Groq. Because it’s a reusable Bubble element, you can also use it on other pages or in other Bubble apps for other use cases.

Now with the new and faster whisper-large-v3-turbo for speech to text.
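For reference, a sketch of the form fields the speech-to-text call carries, assuming Groq's OpenAI-compatible transcription endpoint; the reusable element records audio in the browser and posts it as a multipart upload, which is omitted here:

```python
# Assumed endpoint (Groq's OpenAI-compatible audio API).
TRANSCRIPTION_URL = "https://api.groq.com/openai/v1/audio/transcriptions"


def build_transcription_form(audio_filename: str,
                             model: str = "whisper-large-v3-turbo") -> dict:
    """Return the form fields the multipart upload would carry (sketch only)."""
    return {
        "model": model,           # the faster turbo variant of whisper-large-v3
        "file": audio_filename,   # the audio clip recorded by the element
        "response_format": "text",
    }
```

The same `Authorization: Bearer <key>` shared header from the chat call is reused for this endpoint, so no extra setup is needed in the API Connector.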