[Updated Plugin] ChatGPT with Real-Time Streaming (No more timeouts!)

@nathanielsmithies - Sorry to hear it’s not working as intended :frowning:

Is the error due to the API key or to Message History? And is it intermittent or constant?

Note - The next release will remove the need for the warmup: it will be triggered client-side, without setting the keys in the action, so it should always be immediate.

1 Like

@vdev & @aestela - Based on your feedback, discussions about the Lite/Pro idea (see messages above from @georgecollier, @tpolland, and others), and feedback from others who are just getting started with ChatGPT/LLMs, for whom the $20/$200 is a bit steep, I’ve decided to reset the prices to $10/mo or $80 one-time.

Moving forward, I think we’ll have two options: the current option with streaming + message history + function calling, etc. (the current feature set), and then a Pro version with built-in web search, vector memory, multiple models, etc. The Pro version will probably be branded as “ChatGPT/LLM Toolkit”.

Happy to take feedback here or in DMs about this.

For anyone who has purchased a one-time license at $200, thank you very much for the support, and please follow up with me in DMs about either refunds or additional licenses.

Thanks!

Didn’t think to ask, but is it possible to add gpt-3.5-turbo-16k in the model input, or is it just gpt-3.5-turbo for this?

1 Like

Yep, 16k is supported.
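Under the hood, the model field just gets passed along in the OpenAI Chat Completions request, so the 16k variant behaves the same apart from the larger context window. Roughly like this (a simplified sketch, not the plugin’s exact code):

```typescript
// Simplified sketch (not the plugin's exact code): the model name is simply
// forwarded to OpenAI's Chat Completions endpoint. Run in an ES module on
// Node 18+ (for top-level await and built-in fetch).
const response = await fetch("https://api.openai.com/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
  },
  body: JSON.stringify({
    model: "gpt-3.5-turbo-16k", // or "gpt-3.5-turbo"
    messages: [{ role: "user", content: "Hello!" }],
  }),
});
const data = await response.json();
console.log(data.choices[0].message.content);
```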

1 Like

Perfect. Added 4 to test and did a check, the bugger said it’s a ChatGPT-3 model haha

1 Like

lol this doesn’t always seem to work. I’ve heard from someone else who ran into the exact same problem. Apparently playing with the prompt can help with that.
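For example, an explicit system message along these lines sometimes fixes it (illustrative only, adjust to taste):

```typescript
// Illustrative only: the model doesn't reliably know its own version, so an
// explicit system message is the usual workaround when it misreports itself.
const messages = [
  {
    role: "system",
    content:
      "You are an assistant built on GPT-4. If asked which model you are, say GPT-4.",
  },
  { role: "user", content: "Which model are you?" },
];
```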

1 Like

Yeah. Topman :slight_smile:

1 Like

Just curious if anyone can confirm that what I’m doing is correct.

Added an API to convert PDF to text → works perfectly.
I then use this text, when a toggle is active, as the message for the ChatGPT - Send Message w/ Server workflow action (i.e. the current user’s PDF, if active).

It mostly seems to work, but sometimes it just gives bad responses to the text.

Has anyone figured out a better way using the plugin’s abilities?

@Timbo - Are you able to see any error messages? My first guess would be that either:

a) There are “illegal” or special characters coming back from the PDF. You could solve this by using “formatted as JSON-safe” on your text input.

b) The Send Message w/ Server action is running before PDF parsing finishes. You can solve this by wrapping the Send Message action within a Custom Event.
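To illustrate (a): here’s roughly why unescaped PDF text breaks the request body (illustrative TypeScript, not the plugin’s actual code):

```typescript
// Illustrative only: raw PDF text often contains quotes, backslashes, and
// newlines. Dropped straight into a JSON body, they produce invalid JSON;
// escaping first (roughly what "formatted as JSON-safe" does) keeps it valid.
const pdfText = 'Section 1: "Terms"\nLine two with a backslash \\';

// Broken: naive interpolation leaves the quotes and the newline unescaped.
const broken = `{"messages":[{"role":"user","content":"${pdfText}"}]}`;
// JSON.parse(broken) -> SyntaxError

// Safe: JSON.stringify escapes the content for you.
const safe = JSON.stringify({ messages: [{ role: "user", content: pdfText }] });
console.log(JSON.parse(safe).messages[0].content); // round-trips cleanly
```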

1 Like

Just remembered I didn’t add a find & replace for " quotes or add JSON-safe… That tends to throw them off a bit :smiley:

You are THE BEST! Just a question: if we were to start with the “Lite” version and then want to upgrade to Pro, will there be any discounts? Just wondering, and probably something you haven’t talked about.

2 Likes

Ah, great question. I hadn’t thought about it, but happy to offer that. That seems like a win-win. Thanks for the suggestion!

2 Likes

@launchable

Thanks for your thoughtfulness and consideration.

Once again, the idea of having Lite/Pro versions is truly the way to go. My main point is:

Customer Segmentation: Different apps have different needs. Some may only need the basic functionality offered in the “Lite” version, while others may need the advanced features offered in the “Pro” version. By offering two versions, you can better meet the needs of different customer segments.

Your plugin has made possible what we (in the no-code community) could only dream of.

Wishing you and your Plugin more success.

1 Like

Bought! :moneybag:

1 Like

Awesome! Thanks so much for the support! Let me know if I can help with anything.

Thanks, it’s not showing an error all the time; I’ve only seen it once so far.
The error message was empty, so an unknown error, I believe.

OK cool, so how will it work in the new release? It won’t block any other actions on the page near page load, I’m assuming?

Do you think it’s worth removing the warmup workflow now, or should I wait until the new release?

@nathanielsmithies - I’ll have a better answer tomorrow. Hoping to have the update ready tonight, but will let you know if not.

Important Security Update

Hi all,

I’ve just released a critical security patch: v. 5.17.7.

Please update to this version immediately. A security vulnerability was found and fixed. Based on discussions with users, I believe that only a small number of users, using the plugin configured in a particular way, would have been at risk, but it’s important that everyone update.

For those developers using the vulnerable setup, it is possible that a knowledgeable attacker could have discovered your OpenAI API key.

To ensure your keys are not exposed, you should rotate your API keys immediately. The steps to do this comprehensively are:

  1. Delete your existing API key from the OpenAI dashboard, if you can
  2. Generate a new API key
  3. Update your plugin version
  4. Replace your API key wherever you use it in your Bubble app
  5. Publish a new live version of your app

Note that if you delete any keys that were being used in the OpenAI dashboard, there is no further risk of those keys being used by a third-party. It is highly recommended that you delete your key if you can.
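If you want to double-check the rotation, you can test a key directly against OpenAI’s models endpoint. A quick sketch (run outside of Bubble, e.g. in Node 18+; the env var name is just a placeholder):

```typescript
// Quick sanity check: an active key returns HTTP 200 with the model list;
// a deleted/revoked key returns 401.
async function checkKey(apiKey: string): Promise<void> {
  const res = await fetch("https://api.openai.com/v1/models", {
    headers: { Authorization: `Bearer ${apiKey}` },
  });
  console.log(res.ok ? "Key is active (200)" : `Key rejected (${res.status})`);
}

// Test the new key, then (optionally) the old key to confirm it was revoked.
checkKey(process.env.OPENAI_API_KEY_NEW ?? "");
```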

If for any reason you are unable to delete the API key used in your app, perhaps because it is being used elsewhere, you can mitigate the risk with the following steps:

  1. Create a new API key in the OpenAI dashboard
  2. Downgrade your plugin version to the previous version (5.15.5).
  3. Clear your existing API key(s) anywhere you’ve used it/them with the plugin
  4. Upgrade your plugin version to this latest release
  5. Replace your API key with the new key
  6. Publish a new live version of your app

Note that this method will only prevent future attempts to obtain your keys; if an attacker has already obtained your previous key, and you do not delete it, they will of course still be able to use it as long as it’s active.

I have discussed this vulnerability with Bubble, and they’ve provided this guidance for risk mitigation and steps to move forward.

As a final measure, please check your API usage amounts in the OpenAI dashboard, for the months since installing the plugin, to see if you detect any anomalies.

==================================

I am very, very sorry that you’re reading this message. Since launching this plugin, this has been one of the main risks I’ve worked to protect against. I know how absolutely critical your OpenAI account is in building a GPT-powered app, and I regret deeply that I may have put that at risk for you, or potentially exposed you to financial loss or account fraud.

I know for some folks this will mark the end of their use of the plugin, which I fully understand. If you would like a refund, please get in touch. If you find higher-than-expected costs in your account and suspect it may be a result of this vulnerability, and would like to discuss remuneration, please get in touch.

My sincerest apologies for the psychological stress and loss of trust this notice may bring you, and of course for any financial cost you may have incurred as a result of my error.

If you have any other questions or concerns, please message me directly, or email me at contact@launchable.ai.

Sincerely,
Korey @ Launchable AI

6 Likes

Hi @launchable, thanks for the heads up.
I did the update + changed the API key, but now it’s not working. I have 2 chatbots and they’re not answering anymore. Should I do something else besides that?
Thanks.

Hey @joaquintorroba - try updating the plug-in to the latest version, refreshing your Bubble editor, then putting in your new keys wherever you’re using them. That should be all you need to do.

Let me know if that doesn’t solve it!

1 Like