I chunked the text returned from pdf.co by sending multiple messages to a thread, then running the thread once all the text had been sent.
The first message was my prompt, followed by the text. I used a series of custom events that ran on a condition based on the number of characters in the text returned from pdf.co.
I used truncate and truncate from end to capture the words/characters for each chunk/message.
It's a simple workaround, and not quite the solution I was aiming for, but it works. I'm sure there are much more efficient ways to achieve the outcome.
The workflow equips the OpenAI assistant with the information about an insurance policy, so that when a user asks questions about the policy via a chat feature, the assistant can answer factually with the necessary information.
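For anyone who wants to replicate the same idea in code rather than in a visual workflow, here's a rough sketch using the OpenAI Python SDK's Assistants (beta) endpoints. The ASSISTANT_ID, CHUNK_SIZE, and ask_about_policy names are placeholders I've made up for illustration, and the character-based slicing stands in for the truncate / truncate from end operators:

```python
import time
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

ASSISTANT_ID = "asst_..."  # placeholder: an assistant you've already created
CHUNK_SIZE = 8000          # placeholder: characters per message/chunk

def ask_about_policy(policy_text: str, question: str) -> str:
    """Send the extracted PDF text in chunks, then run the assistant."""
    thread = client.beta.threads.create()

    # First message is the prompt/question, as in the workflow above.
    client.beta.threads.messages.create(
        thread_id=thread.id, role="user", content=question
    )

    # The remaining messages carry the pdf.co text, split by character count
    # (the equivalent of the truncate / truncate from end conditions).
    for start in range(0, len(policy_text), CHUNK_SIZE):
        client.beta.threads.messages.create(
            thread_id=thread.id,
            role="user",
            content=policy_text[start:start + CHUNK_SIZE],
        )

    # Only run the thread once all of the text has been sent.
    run = client.beta.threads.runs.create(
        thread_id=thread.id, assistant_id=ASSISTANT_ID
    )
    while run.status in ("queued", "in_progress"):
        time.sleep(1)
        run = client.beta.threads.runs.retrieve(
            thread_id=thread.id, run_id=run.id
        )
    if run.status != "completed":
        raise RuntimeError(f"Run ended with status: {run.status}")

    # Messages are returned newest first; the top one is the assistant's answer.
    messages = client.beta.threads.messages.list(thread_id=thread.id)
    return messages.data[0].content[0].text.value
```

The loop plays the same role as the chained custom events: each iteration sends one chunk, and the run is only created after the loop finishes.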
@robbie4 I'm facing the same feature request for a real estate assistant: I need to upload the project or property information so the assistant can answer factually.
Can you please share how you managed to set up the workflows?
Thanks!