I’ve designed a chatbot with the appearance of ChatGPT, where the oldest messages go at the top of the chat and the newest ones at the bottom. Once the conversation is long enough that scrolling is necessary (after 3-4 messages), the chat autoscrolls to the top for no reason whenever the AI starts streaming, no matter where the user is. I can work around it by forcing a scroll just after the streaming starts, but for a few milliseconds you can see something strange: the view jumps to the top and then back down to the last message.
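To illustrate, the workaround I mean is roughly something like this in plain browser JavaScript/TypeScript (the container id is just a placeholder for whatever element wraps my messages):

```ts
// Rough illustration of the workaround: force the scroll back down as soon
// as streaming starts. "chat-container" is a placeholder id.
const chat = document.getElementById("chat-container");

function onStreamingStart(): void {
  // Without this, the view jumps to the top when streaming begins.
  // With it, the view still flashes to the top for a few milliseconds
  // before snapping back to the last message.
  requestAnimationFrame(() => {
    if (chat) chat.scrollTop = chat.scrollHeight;
  });
}
```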
Has anyone experienced this? How could it be solved?
Hey @angel1996, it depends on whether you want to build out the functionality yourself or not.
If you want a plugin that does this for you, fetches LLM responses and more, then your best bet is this plugin: New LLM streaming plugin!
Otherwise, should you want to build this functionality yourself, you could create a custom event that scrolls to the bottom of the repeating group every X seconds ‘while streaming = yes’ (rough sketch at the end of this post).
This functionality should be pretty straightforward to implement depending on your level of experience. Let me know if you need more help.
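Here’s a minimal sketch of that idea in plain browser TypeScript, just to show the logic; "chat-container" and isStreaming are placeholder names, not anything from Bubble or the plugin:

```ts
// Minimal sketch: while a response is streaming, keep the message list
// pinned to the bottom on a short timer ("Do every X seconds" in Bubble terms).
// "chat-container" and isStreaming are placeholder names.
const chat = document.getElementById("chat-container");
let isStreaming = false; // flip to true while the LLM response streams in

setInterval(() => {
  if (isStreaming && chat) {
    chat.scrollTop = chat.scrollHeight; // jump to the newest message
  }
}, 200); // every 0.2 s
```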
I tried your suggestion but I keep having the same problem. On the same page I created a parallel “Do every X seconds” workflow that forces the scroll while the condition is true. I’ve tried different values, from 0.1 s to 1 s, but it never fires. I also added a text element to the UI to check the custom state, and I can see it’s yes while it’s streaming, yet nothing happens; the workflow just isn’t triggering for some reason…
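If I can’t get the workflow to fire, I’m considering bypassing it and doing the scroll directly in JavaScript (e.g. via a Run JavaScript action), something along these lines; the container id is just a placeholder:

```ts
// Fallback: instead of a timer, scroll whenever new content is appended,
// so each streamed chunk pushes the view to the bottom.
// "chat-container" is a placeholder id for the element wrapping the messages.
const chat = document.getElementById("chat-container");

if (chat) {
  const observer = new MutationObserver(() => {
    chat.scrollTop = chat.scrollHeight;
  });
  observer.observe(chat, { childList: true, subtree: true, characterData: true });
}
```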