[FREE PLUGIN] AI Proxy - ChatGPT, OpenAI, Claude, LlaMA, PaLM Real-Time Streaming

Update tutorial: AI Proxy Tutorial - Descript

1 Like

Update: AI Proxy - ChatGPT Streaming - Sample editor has new examples

1 Like

Update: Added function calling

new version is live on github: https://github.com/coalias/ai-proxy-custom/
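
If you want a feel for what a function-calling request looks like: assuming the proxy accepts OpenAI-compatible chat completion payloads, a call would look roughly like this. The URL is just a placeholder and the function definition is made up for illustration.

```typescript
// Illustrative only: assumes an OpenAI-compatible chat completion payload.
// PROXY_URL is a placeholder, not the plugin's real endpoint.
const PROXY_URL = "https://your-worker.example.workers.dev/v1/chat/completions";

const response = await fetch(PROXY_URL, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "openai/gpt-3.5-turbo",
    stream: true,
    messages: [{ role: "user", content: "What's the weather in Paris?" }],
    // OpenAI-style function definitions; the model can answer with a
    // function_call instead of plain text.
    functions: [
      {
        name: "get_weather", // hypothetical function, for illustration only
        description: "Get the current weather for a city",
        parameters: {
          type: "object",
          properties: { city: { type: "string" } },
          required: ["city"],
        },
      },
    ],
  }),
});
```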

1 Like

Not sure if the repo has been updated; the latest Git commits are from Aug 13, 2023 :thinking:

@gaimed I think your reply saying “already done” was to me, but I have the newest version of the plugin installed and the System Message and the AI Model still don’t allow for dynamic input.

Pushed the changes :slight_smile:

1 Like

Added a new update with a dynamic system message. For the model, you can pass a model name to “Custom model name” in the element.
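
Roughly speaking, those two inputs just become the system message and the model field of the request body. An illustrative sketch (standard chat-completion field names, not the plugin's internals):

```typescript
// Sketch: a dynamic system message and a custom model name mapped onto an
// OpenAI-style chat completion body. The values would come from your app.
const systemMessage = "You are a helpful support agent for Acme Inc.";
const customModelName = "openai/gpt-4"; // whatever you put in "Custom model name"

const payload = {
  model: customModelName,
  stream: true,
  messages: [
    { role: "system", content: systemMessage },
    { role: "user", content: "How do I reset my password?" },
  ],
};
```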

3 Likes

I’m really struggling to get this to work. I’m trying to follow your YouTube video instructions, but I feel like I’m missing something, because nothing happens when I start the workflow. Could you create a step-by-step guide that explains each piece of the process?

Did you check the samples: AI Proxy - ChatGPT Streaming - Sample

and the editor: ai-proxy-sample | Bubble Editor

And then the tutorial: AI Proxy Tutorial - Descript

1 Like

@gaimed yes I did. I just finished the tutorial via Descript (several times), and you’re speeding through it very quickly and missing a lot of the important details, which is easy to do when you already know how everything works.

I just revisited the editor and realized why I couldn’t see the elements: they’re reusable elements. I can see them now. Maybe that will help.

Still, a detailed step-by-step would be extremely helpful.

1 Like

I’ll see if I can find the time to do it.

Thanks for creating this plugin!
Will there be issues with the servers going down if too many apps are using this plugin? There’s another similar plugin with the same issue.
If this would be an issue, are we able to host the streaming on our own servers?

Thanks

Hi there,

I don’t know about other plugins, but this plugin uses Cloudflare Workers. I’ve had 100% uptime with https://coalias.com/ on the same infrastructure.

But you can always host it yourself using Cloudflare!
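
For anyone who wants to self-host, here is a very rough sketch of what a streaming proxy Worker can look like. This is illustrative only, not the plugin's actual source; OPENAI_API_KEY would be a Worker secret you set yourself.

```typescript
// Minimal illustrative Cloudflare Worker: forwards a chat request to OpenAI
// and streams the response back to the caller unchanged.
export default {
  async fetch(request: Request, env: { OPENAI_API_KEY: string }): Promise<Response> {
    if (request.method !== "POST") {
      return new Response("POST only", { status: 405 });
    }

    // Forward the body as-is; with stream: true in the payload, OpenAI
    // responds with server-sent events.
    const body = await request.text();

    const upstream = await fetch("https://api.openai.com/v1/chat/completions", {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${env.OPENAI_API_KEY}`,
      },
      body,
    });

    // Pass the SSE stream straight through to the client.
    return new Response(upstream.body, {
      status: upstream.status,
      headers: { "Content-Type": "text/event-stream" },
    });
  },
};
```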

Regards,

Ab

Added 4 models, now the plugin supports the following models (streaming):

openai/gpt-3.5-turbo
openai/gpt-3.5-turbo-16k
openai/gpt-4
openai/gpt-4-32k
anthropic/claude-2
anthropic/claude-instant-v1
google/palm-2-chat-bison
google/palm-2-codechat-bison
meta-llama/llama-2-13b-chat
meta-llama/llama-2-70b-chat
nousresearch/nous-hermes-llama2-13b
mancer/weaver
gryphe/mythomax-L2-13b

1 Like

We’ve processed around 100k AI requests since launch with 100% uptime.

Would love to see your reviews for this free plugin!

3 Likes

Added new models to OpenRouter, including one that’s 100% free to use (pygmalionai/mythalion-13b):

openai/gpt-3.5-turbo
openai/gpt-3.5-turbo-16k
openai/gpt-4
openai/gpt-4-32k
anthropic/claude-2
anthropic/claude-instant-v1
google/palm-2-chat-bison
google/palm-2-codechat-bison
meta-llama/llama-2-13b-chat
meta-llama/llama-2-70b-chat
nousresearch/nous-hermes-llama2-13b
mancer/weaver
gryphe/mythomax-L2-13b
jondurbin/airoboros-l2-70b-2.1
undi95/remm-slerp-l2-13b
pygmalionai/mythalion-13b
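
For those curious how these map onto actual requests: OpenRouter exposes an OpenAI-compatible chat completions endpoint, and the model field is just one of the identifiers above. An illustrative call, independent of the plugin (use your own OpenRouter API key):

```typescript
// Illustrative OpenRouter call, independent of the plugin.
const OPENROUTER_API_KEY = "sk-or-..."; // placeholder, use your own key

const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${OPENROUTER_API_KEY}`,
  },
  body: JSON.stringify({
    model: "pygmalionai/mythalion-13b", // the free model mentioned above
    stream: true,
    messages: [{ role: "user", content: "Say hello in one sentence." }],
  }),
});
```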

We processed 287k AI requests in the last 7 days with zero errors!

The uptime has been 100%: AI Proxy

I would love reviews from those who find the plugin beneficial!

3 Likes

Done! It was easy to follow and replicate the demo model you set up.

Thanks Ab!

Can anyone give me a lead on how we fetch the token count sent and the token count received? I’m probably just overlooking something obvious.
:bar_chart: Token Counter: Monitor your usage transparently.

Hi, I had to remove this. Will add it back soon!

1 Like