As long as the LLM is creative enough, it should be able to stay conversational, e.g. giving advice based on both its own training corpus and the retrieved data.

The following plugins support streaming with function/tool calling; you can choose any of them depending on the LLM. The demo also shows how to send data back from Bubble to the LLM.
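For reference, the streaming + tool-calling round trip these plugins handle looks roughly like this (a minimal Python sketch assuming the OpenAI Chat Completions API; the tool name and the Bubble lookup are hypothetical stand-ins):

```python
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Hypothetical tool backed by a Bubble API workflow
tools = [{
    "type": "function",
    "function": {
        "name": "get_order_status",
        "description": "Look up an order in the Bubble database",
        "parameters": {
            "type": "object",
            "properties": {"order_id": {"type": "string"}},
            "required": ["order_id"],
        },
    },
}]

messages = [{"role": "user", "content": "Where is order 42?"}]

# First pass: stream the response and accumulate any tool call fragments
stream = client.chat.completions.create(
    model="gpt-4o-mini", messages=messages, tools=tools, stream=True
)
call_id, name, args = "call_1", "", ""
for chunk in stream:
    delta = chunk.choices[0].delta
    if delta.content:
        print(delta.content, end="")  # stream text straight to the user
    for call in delta.tool_calls or []:
        call_id = call.id or call_id
        if call.function.name:
            name = call.function.name
        if call.function.arguments:
            args += call.function.arguments  # arguments arrive in pieces

if name:
    # Here the plugin would call back into Bubble; this result is a stand-in
    result = {"status": "shipped"}
    messages += [
        {"role": "assistant", "tool_calls": [{
            "id": call_id, "type": "function",
            "function": {"name": name, "arguments": args},
        }]},
        {"role": "tool", "tool_call_id": call_id, "content": json.dumps(result)},
    ]
    # Second pass: stream the final answer grounded in the Bubble data
    for chunk in client.chat.completions.create(
        model="gpt-4o-mini", messages=messages, stream=True
    ):
        if chunk.choices[0].delta.content:
            print(chunk.choices[0].delta.content, end="")
```

The key point is the two-pass loop: the first streamed response may contain a tool call instead of text, the app executes it against Bubble's data, and the result is appended as a `tool` message before streaming the final answer.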

This one is a bit different, as it lets you leverage documents stored on Azure, so the LLM answers from a documentation database:
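For context, a document-grounded call of this kind looks roughly like the following with Azure OpenAI's "On Your Data" feature over Azure AI Search (a sketch only; the endpoints, deployment, index name, and API version are illustrative placeholders):

```python
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",  # placeholder
    api_key="<azure-openai-key>",
    api_version="2024-02-01",
)

completion = client.chat.completions.create(
    model="<deployment-name>",  # your Azure OpenAI deployment
    messages=[{"role": "user", "content": "How do I configure feature X?"}],
    # data_sources attaches an Azure AI Search index so answers are
    # grounded in the indexed documents rather than the model alone
    extra_body={
        "data_sources": [{
            "type": "azure_search",
            "parameters": {
                "endpoint": "https://<your-search>.search.windows.net",
                "index_name": "docs-index",  # placeholder index
                "authentication": {"type": "api_key", "key": "<search-key>"},
            },
        }]
    },
)
print(completion.choices[0].message.content)
```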
