Hello Bubble Community,
I’m trying to use Bubble’s API Connector to call the streaming (chat completion) API of an LLM platform called Dify. I want to use the API Connector’s streaming (Server-Sent Events, SSE) functionality, but it doesn’t seem to work correctly.
Upon investigation, I found several significant differences between the SSE format returned by the Dify API and the format used by the ChatGPT (OpenAI) API, which Bubble seems to support more commonly. I suspect these differences might be the cause of the issue.
Below is a summary of the main discrepancies I identified, based on example responses from Dify and ChatGPT (referencing `dify_response.txt` and `chatgpt_response.txt`), along with specific examples:
- JSON structure in the `data:` field and text chunk retrieval:
  - Difference: Dify returns text chunks in the `answer` key of the JSON object in the `data:` field, whereas ChatGPT uses the `delta` key. The event type names also differ (`agent_message` vs. `response.output_text.delta`).
  - Dify example:

    ```
    event: proxy bTOYc1
    data: data: {"event": "agent_message", ... "answer": "Okay, I understand. ..."}
    data: data:
    ```

  - ChatGPT example:

    ```
    event: proxy bTOYc1
    data: event: response.output_text.delta
    data: data: {"type":"response.output_text.delta", ... "delta":"Certainly"}
    data: data:
    ```

  - Concern: It’s unclear whether Bubble’s API Connector can be configured to extract text from Dify’s `answer` field, or whether it specifically expects the `delta` structure used by OpenAI.
- Stream termination signal:
  - Difference: Dify signals the end of the stream with an `event: message_end` event. The ChatGPT log shows an `event: response.completed` event containing final metadata, although the standard OpenAI API often terminates with `data: [DONE]`. These methods differ.
  - Dify example:

    ```
    event: proxy bTOYc1
    data: data: {"event": "message_end", ... "metadata": {"usage": {...}}}
    data: data:
    ```

  - ChatGPT example:

    ```
    event: proxy bTOYc1
    data: event: response.completed
    data: data: {"type":"response.completed","response":{ ... usage ... }}
    data: data:
    ```

  - Concern: If Bubble specifically expects a signal like `data: [DONE]`, it might not recognize Dify’s `message_end` event, potentially causing issues with stream termination or data handling.
- Raw SSE formatting (duplicate `data:` prefix):
  - Difference: The Dify response shows multiple instances of `data: data: {JSON}`, where the `data:` prefix is duplicated. This deviates from the standard SSE format and could cause parsing issues. The ChatGPT log appears to follow the more standard `event: <name>\ndata: <JSON>\n\n` format.
  - Dify example:

    ```
    event: proxy bTOYc1
    data: data: {"event": "agent_thought", ... }     // double 'data:' prefix
    data: data:
    data: data: {"event": "agent_mes                 // double 'data:' prefix
    event: proxy bTOYc1
    data: sage", ... "answer": "..."}
    data: data:
    data: data: {"event": "message_end", ... }       // double 'data:' prefix
    data: data:
    ```

  - ChatGPT example:

    ```
    event: proxy bTOYc1
    data: event: response.output_text.delta          // standard event line
    data: data: {"type":"response.output_text.delta", ... "delta":"..."}
    data: data:
    ```

  - Concern: The `data: data:` format is highly likely to cause errors in Bubble’s SSE parser.
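To make the quirks above concrete, here is a minimal sketch (in Python) of the normalization I believe a parser would need. The field names (`answer` in an `agent_message` event for Dify, `delta` in a `response.output_text.delta` event for OpenAI) are taken from the logs above, not from Bubble’s internals:

```python
import json
from typing import Optional

def extract_chunk_text(raw_line: str) -> Optional[str]:
    """Extract the text chunk from one SSE line, tolerating the quirks above.

    Assumptions (from the quoted logs, not from any official docs):
    - Dify text lives under "answer" in an "agent_message" event;
    - OpenAI text lives under "delta" in a "response.output_text.delta" event;
    - the stream may carry a duplicated prefix ("data: data: {...}").
    """
    line = raw_line.strip()
    # Collapse any number of "data:" prefixes down to the bare payload.
    while line.startswith("data:"):
        line = line.removeprefix("data:").strip()
    if not line.startswith("{"):
        return None  # empty keep-alive line or a non-JSON field like "event: ..."
    obj = json.loads(line)
    if obj.get("event") == "agent_message":              # Dify shape
        return obj.get("answer")
    if obj.get("type") == "response.output_text.delta":  # OpenAI shape
        return obj.get("delta")
    return None
```

A strict SSE parser would instead see the first `data:` as the field name and everything after it (including the second `data:`) as payload, which is presumably where things break.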
Questions:
- Is Bubble’s API Connector streaming feature flexible enough to handle these differences in SSE format (specifically the JSON structure, the termination signal, and the `data: data:` formatting)? Or is it primarily designed assuming an OpenAI-compatible format?
- Are there specific configurations or known workarounds (e.g., using an intermediary server or specific plugins) within Bubble to correctly parse streams from APIs like Dify?
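In case it helps the discussion: the “intermediary server” workaround I have in mind would be a thin relay that rewrites Dify’s stream into the OpenAI-style format Bubble seems to expect. A rough sketch (Python; the Dify field names come from the logs above, the output shape mimics OpenAI’s chat-completions delta chunks, and `dify_to_openai_sse` is just an illustrative name):

```python
import json

def dify_to_openai_sse(dify_lines):
    """Rewrite Dify SSE lines into OpenAI-style streaming chunks.

    Sketch of the intermediary-server idea: each Dify "agent_message"
    becomes a chat-completions-style delta chunk, and "message_end"
    becomes the "data: [DONE]" terminator many OpenAI clients look for.
    """
    for raw in dify_lines:
        line = raw.strip()
        while line.startswith("data:"):   # tolerate the duplicated prefix
            line = line.removeprefix("data:").strip()
        if not line.startswith("{"):
            continue                      # skip empty/keep-alive/event lines
        obj = json.loads(line)
        if obj.get("event") == "agent_message":
            chunk = {"choices": [{"delta": {"content": obj.get("answer", "")}}]}
            yield f"data: {json.dumps(chunk)}\n\n"
        elif obj.get("event") == "message_end":
            yield "data: [DONE]\n\n"
```

In practice this generator would sit behind a small streaming HTTP endpoint that Bubble’s API Connector calls instead of Dify directly. I’d still prefer a pure-Bubble solution if one exists.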
Any advice from those who have encountered similar issues or know potential solutions would be greatly appreciated.