Is ChatGPT hallucinating a lot when it comes to Bubble?

Hello,
ChatGPT seems to have all the answers when you ask it for help building with Bubble. However, most of it seems like hallucination. For instance, I recently asked it how to make a repeating group clickable and it talked about a "style section" in the properties of the RG. I could not find this section. It's not the first time that ChatGPT seems to make up stuff.
I think the fact that ChatGPT is not connected to the internet makes it more unreliable. Anybody else facing this? Any way to work around it?

Indeed, ChatGPT knows very little about the Bubble editor (and literally nothing about the new responsive engine). Most of what it says about Bubble is just like this: convincing-sounding rubbish.

It has a lot of great uses when it comes to building apps, but asking for specific help with the Bubble editor is not one of them.


Somebody just released something that was trained on the Bubble manual. Check the showcase section of the forum.


I've also noticed that ChatGPT makes up a lot of lies, not just with Bubble but with all topics in general.

The issue is that ChatGPT has a wide breadth of knowledge but with lots of missing details. So when you ask it a question, it makes up blatant lies to cover for its missing knowledge. (Kind of like having a picture with a missing corner and trying to paint what you think the missing bit is likely to be.)

When you confront it about its lie, it immediately apologizes and makes up an even bigger lie.

I upgraded to ChatGPT Plus yesterday (GPT-4) and noticed that it's more honest and upfront about its missing knowledge compared to GPT-3.5. It's more likely to tell you where the holes are rather than hallucinate what those holes are likely to contain. (Although still far from perfect.)

In my experience, GPT-3.5 hallucinates very often when it comes to Bubble. GPT-4 fares a lot better.

I recommend phind.com since it can search the web.

I’ve had similar experiences when asking it Firebase-related questions. ChatGPT repeatedly and confidently gave me very wrong but convincing solutions that I could verify were not possible with a two-second Google trip to Stack Overflow.

ChatGPT is awesome, it’s just not great at this yet. Maybe the tool someone mentioned that was trained on the Bubble manual is better, though.


Thanks, just checked out the showcase. I think it's called Bubble Buddy. Tried a few prompts, but it's not that great either.

phind.com seems to provide detailed solutions. Will keep using it. Thanks for the suggestions.

Yeah, totally agree. Our best hope will be the AI copilot that Bubble is apparently working on.

This topic was automatically closed after 70 days. New replies are no longer allowed.