[Updated Plugin] ChatGPT with Real-Time Streaming (No more timeouts!)

@hejtmanekp - glad you’re finding the plugin useful!

I haven’t added support for the “n” parameter yet. The way I’ve been doing multiple response streams is to have multiple Data Containers on the page. But your point about cost savings re: the input prompt is well taken; I hadn’t thought about it from that angle.

I’ll do some more digging and see if there’s a way to support it.
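For anyone curious, here’s roughly what “n” does at the raw Chat Completions level: the input messages are sent (and billed) once, and the API returns n separate choices. This is just a sketch of the underlying API in a Node-style environment (OPENAI_API_KEY assumed), not plugin code:

```typescript
async function requestMultipleCompletions(prompt: string): Promise<void> {
  // Rough sketch of a raw Chat Completions call; assumes OPENAI_API_KEY is set.
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      messages: [{ role: "user", content: prompt }],
      n: 3, // three independent completions; the input prompt is only sent (and billed) once
    }),
  });

  const data = await response.json();
  // Each alternative completion comes back as its own entry in "choices".
  for (const choice of data.choices) {
    console.log(choice.index, choice.message.content);
  }
}
```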

Cheers

Hi all,

Quick note: while trying to debug the newest version with a user today, we bumped into issues that seemed linked to Safari. If you’re building/testing and seeing issues, please try a different browser and let us know what you find. I’m looking into what might be going on.

@launchable thanks

btw, is there pls a way to access the token counts returned by the API?

@hejtmanekp - there is a “Last Request Token Usage” state that will show the total tokens used for the last request (input + output). I’ll soon also be adding a more fine-grained count, and a “pre-call” count (i.e., you can get a token count for the input messages before generating).
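For reference, a non-streaming Chat Completions response carries a `usage` object with exactly that input/output breakdown, which is the kind of thing these counts map onto. A minimal sketch of that shape, plus a very rough stand-in for a pre-call estimate (the helper and its chars-per-token heuristic are my own approximation, not the model’s real tokenizer):

```typescript
// Shape of the "usage" object the Chat Completions API returns with each (non-streaming) response.
interface TokenUsage {
  prompt_tokens: number;     // tokens in the input messages
  completion_tokens: number; // tokens in the generated reply
  total_tokens: number;      // prompt_tokens + completion_tokens
}

// Very rough "pre-call" estimate, before anything is sent to the API.
// This is only a heuristic (~4 characters per token for English text),
// not the tokenizer the model actually uses.
function estimatePromptTokens(messages: { role: string; content: string }[]): number {
  const text = messages.map((m) => m.content).join(" ");
  return Math.ceil(text.length / 4);
}
```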


Any updates on the tutorial video? @launchable

@nocodejordan - yep. Didn’t finish recording yesterday, but should be up this morning.

Still getting “no message response” errors…

@georgecollier - on v5.11 or 5.12? Also what browser are you using?

5.7.1 threw the TLS connection error; switching to 5.12.5 removes the error popup, but it just doesn’t return a message.

Here’s one that just failed:

Plugin server-side action console output:

START RequestId: eadba39c-042d-4310-8858-3f8d28fcc449 Version: $LATEST
2023-05-30T13:51:59.144Z eadba39c-042d-4310-8858-3f8d28fcc449 INFO Chat completion processing has started.
END RequestId: eadba39c-042d-4310-8858-3f8d28fcc449
REPORT RequestId: eadba39c-042d-4310-8858-3f8d28fcc449 Duration: 509.24 ms Billed Duration: 510 ms Memory Size: 128 MB Max Memory Used: 76 MB

@georgecollier - is that error output from v5.12.5?


Yes it is

The error rate on message send is currently like 80%

@georgecollier - v5.7 likely still has some lingering bugs. Currently 5.11.1 is the most stable. Can you try upgrading to that and see how things look? (Or downgrading, I suppose, if you’re on 5.12.)

Also, are you using Safari?

Chrome

5.11.1


@georgecollier - just DMed you with a meeting invitation

For reference re: the issue above ^^

If anyone else is seeing this on v5.11.1, check whether you are using Ensure Connected before your send message event; if you are, temporarily remove it and see if that fixes things.

p.s. - thanks @georgecollier for the help in quickly debugging :bowing_man:

Is there a way to limit the number of tokens used in the response from ChatGPT in this plugin? I thought I saw something about token monitoring, but I didn’t see anything specific on how to limit them.

@gulbranson.nils - Yep, there’s a Max Response Tokens parameter in the Send Message actions
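That presumably maps onto the API’s `max_tokens` field, which caps only the generated reply, not the input messages. A minimal sketch of the relevant bit of a request body (model name and prompt are just placeholders):

```typescript
// Sketch: capping how long the reply can get via max_tokens.
// max_tokens limits only the generated output, not the input messages.
const body = JSON.stringify({
  model: "gpt-3.5-turbo",
  messages: [{ role: "user", content: "Summarise this thread in one sentence." }],
  max_tokens: 150, // generation stops once the reply reaches ~150 tokens
});
```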
