@gregsheen23, @mitchbaylis and @guillaume.raballand offer some good ideas on how this can be achieved.

I’ll offer another method that should be more efficient.

Firstly, don’t bother with the Schedule API Workflow approach to save the values to the database. Instead, make sure the key names in your JSON are an exact match of the field names on the data type you wish to create, then take the list of texts from the JSON manipulator output and send those values through an API call to your app for a bulk create operation.

[Screenshot: “Screen Shot 2025-01-05 at 11.52.13 AM” — the join-with-arbitrary-text setup]

So take the JSON manipulator’s list of texts (which is JSON formatted) and add an operator to join with arbitrary text, where the arbitrary text is just a new line (i.e. press Return on the keyboard). In my screenshot, the find-and-replace is for my own use case, so it’s not relevant for yours.

This method allows you to create as many data entries at once as required based on the number of outputs from the JSON manipulator.

So that is how you can create the items in your database.
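To make the shape of that API call concrete, here’s a rough sketch in Node-style JavaScript. The app URL, data type name (`experience`), field names, and token are all placeholders for your own app’s values; the key point is that Bubble’s Data API bulk endpoint takes one JSON object per line as plain text, which is exactly what the join-with-new-line operator produces.

```javascript
// Sketch: turning the JSON manipulator's list of texts into a bulk-create request.
// The app URL, data type name, field names, and token below are placeholders.

// What the JSON manipulator gives you: a list of texts, each one a JSON object
// whose keys exactly match the field names on your data type.
const jsonTexts = [
  '{"name":"Alpha","score":10}',
  '{"name":"Beta","score":20}',
];

// "join with arbitrary text" where the arbitrary text is a new line:
const bulkBody = jsonTexts.join("\n");
// bulkBody is now newline-delimited JSON, one database entry per line.

// The API call itself (Bubble's bulk endpoint expects a text/plain body):
async function bulkCreate(body) {
  const res = await fetch("https://yourapp.bubbleapps.io/api/1.1/obj/experience/bulk", {
    method: "POST",
    headers: {
      "Content-Type": "text/plain",
      Authorization: "Bearer YOUR_API_TOKEN", // placeholder
    },
    body,
  });
  return res.text(); // Bubble responds with one status line per entry
}
```

In Bubble itself you’d set this up in the API Connector rather than in code, but the body it sends is the same newline-joined list.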

But I’d also like to discuss the setup for processing the text. You do not need an API call to ChatGPT to do the extraction and JSON creation. Instead, you should have a server script that does this for you. I’ve used ChatGPT to help me create server scripts for this purpose and it works well. If you do this properly, you’ll eliminate steps 2, 3, and 4 from the process_csv workflow series and replace them with a step 2 that runs the server script and a step 3 that runs an API call to your app to create the database entries. You might even get away with eliminating step 1 as well, since it might not be essential.
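As a minimal sketch of the kind of server script ChatGPT can help you write: the function below parses raw CSV text into a list of JSON texts whose keys match the data type’s field names. The column names (`title`, `company`, `years`) are made up for illustration, and the naive comma split is a simplification — in a real Toolbox Server Script step you’d read the CSV from the step’s inputs and harden the parsing.

```javascript
// Sketch of a server script that replaces the ChatGPT extraction step:
// parse raw CSV text into a list of JSON texts whose keys exactly match
// the field names on the database data type. Column names here are made up.
function csvToJsonTexts(csvText) {
  const lines = csvText.trim().split(/\r?\n/);
  const headers = lines[0].split(",").map((h) => h.trim());
  return lines.slice(1).map((line) => {
    const values = line.split(","); // naive split; no quoted-comma handling
    const entry = {};
    headers.forEach((h, i) => {
      entry[h] = (values[i] || "").trim();
    });
    return JSON.stringify(entry);
  });
}

// Example input whose header row matches the data type's field names:
const csv = "title,company,years\nEngineer,Acme,3\nDesigner,Globex,5";
const jsonTexts = csvToJsonTexts(csv);
// jsonTexts is ready to be joined with new lines and sent to the bulk endpoint.
```

The output of this one step is the same list of JSON texts the earlier ChatGPT-based steps were producing, which is why those steps become unnecessary.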

This will also, if everything is done as per my suggestions, eliminate all steps in the process_experience workflow.

If you follow my suggestion, you will have basically only two steps: step 1 is the server script to process the CSV, and step 2 is the API call to your Bubble app to bulk create the data entries. Based on what I can see in the screenshots, this reduces the entire process from 9 steps to 2, saving on average at least 3.5 WUs per run if each run only processes one data entry. If the process were to run 2 data entries, the savings would be around 5.5 WUs…

If you are already familiar enough with ChatGPT to get it to process the CSV via API, you should be capable enough to get ChatGPT to create the server script for you. If not, you will still at least benefit from sending the JSON list as an API call to your app for a bulk create, rather than attempting to send it in via Schedule API Workflow on a list.