Alternatives to Scheduling API Workflow on a List

I’ve gotten my application to work completely using the Schedule API Workflow on a List function, but the problem is that if the list is any greater than 150 items, the app crashes. I did try adding 3 units of capacity for an hour to see how the app behaved, and it could handle up to about 400 items, but that’s still nowhere near efficient enough.

Has anyone figured out a workaround for this? I contemplated finding a third-party service that will take my .CSV, parse the data, and then return it to my app via API, but I am having a hard time finding a service that can dynamically accept a CSV via URL. Do you guys know of anything like this?

I think the problem could be solved by buying something like 15 units of capacity, but that’s extremely expensive, so an alternative solution is needed during the start-up phase of my project.

It seems that the scheduling itself is what causes the crash. I set the schedule to Current Date/Time + 60 seconds to ensure the workflows didn’t start running in the background before the scheduling had finished, but scheduling takes much longer than 60 seconds, and the app ultimately crashes just from scheduling the workflows. I didn’t think scheduling workflows would be so hard on the system, but apparently there’s something at play here that I am not seeing.

I have looked into this; I appreciate the response. The problem is that, as far as I can tell, there isn’t a way to hold the parsed data and run backend workflows on it without first loading it into the database.

This is the kind of solution I need, but I can’t seem to figure out how to make it actually work.

If the user uploads the .CSV and I parse it using the plugin, then try to create a new data point with all of that parsed data just for processing, it still crashes… so it’s the same issue.

So what are you trying to do?

My assumption from your last response is that you have a CSV you want a user to upload, have your app parse that information in some way, and then use that information without adding it to your database.

In terms of uploading CSV files, you should check the forum on that subject, but in my experience you can only upload 100 rows of data using the CSV upload unless you upgrade the account. I haven’t looked into it in detail, as I only use the CSV upload for myself and am fine with 100 rows at a time.

In terms of parsing data, you could use backend workflows recursively as long as the data is already saved in the database… and as far as I know, uploading a CSV requires you to save that data in the database to begin with, so you may want to experiment with a recursive backend workflow. Over the last day I parsed information on about 5,000 records; it stopped after 3,500, and I just triggered the backend workflow again and the remaining 1,500 were added with no problem.
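
Bubble workflows aren’t written in code, but if it helps to see the shape of it, here’s the recursive pattern sketched in Python. `process_record` and the batch size are placeholders, and the recursive call stands in for Bubble’s Schedule API Workflow pointing the workflow at itself:

```python
# Sketch of the recursive backend-workflow pattern: handle a small
# slice of records per run, then reschedule yourself for the rest,
# instead of scheduling thousands of workflows up front.

def process_record(record):
    # Placeholder for whatever the workflow does per row.
    print(f"processed {record}")

def process_batch(records, batch_size=50):
    batch, rest = records[:batch_size], records[batch_size:]
    for record in batch:
        process_record(record)
    if rest:
        # In Bubble this would be "Schedule API Workflow" on this same
        # workflow at Current Date/Time + a few seconds; here it is a
        # plain recursive call.
        process_batch(rest, batch_size)

process_batch(list(range(120)))
```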

Take a look at parabola.io to parse your CSV and send it back to Bubble via API.

What I’m trying to do is create the ability to upload up to 2,000 rows at a time. When I click to run the operation, it schedules the backend workflow 2,000 times at that exact moment, and eventually the app bogs down and quits.

@gilles I had already known about Parabola, but I found out that they have a webhook function in alpha, which works PERFECTLY for this. I just send the parsed JSON data directly to the flow, and over the course of an hour it imports all the records to my app via API and backend workflows.

Are you passing the CSV file to the backend via “Schedule an API Workflow” and running the CSV upload in the backend? I think the max you can upload is 3k rows in the backend. If you try to upload the CSV in the front end, it’s been timing out around 1,500 rows.

I just uploaded 50k rows to Bubble via Parabola. I had to split the file into 10k-row chunks and run 5 separate Parabola POST APIs to Bubble, all running simultaneously. The transfer rate is around 238 rows per minute per API POST to Bubble, so I was achieving over 1,000 rows per minute. Everything was uploaded, no issues.
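
If you wanted to reproduce that split-and-post approach yourself rather than through Parabola, a rough Python sketch would look like the following. The endpoint URL and `data.csv` are made up, and it assumes the `requests` library is installed:

```python
import csv
import threading

import requests

# Hypothetical endpoint; substitute your own Bubble API Workflow URL.
BUBBLE_URL = "https://myapp.bubbleapps.io/api/1.1/wf/import_rows"
CHUNK_SIZE = 10_000  # mirrors the 10k-row chunks described above

def post_chunk(rows):
    # Each chunk goes up as one JSON array, like a Parabola export.
    resp = requests.post(BUBBLE_URL, json={"rows": rows}, timeout=300)
    resp.raise_for_status()

with open("data.csv", newline="") as f:
    rows = list(csv.DictReader(f))

chunks = [rows[i:i + CHUNK_SIZE] for i in range(0, len(rows), CHUNK_SIZE)]

# One thread per chunk, all posting simultaneously.
threads = [threading.Thread(target=post_chunk, args=(c,)) for c in chunks]
for t in threads:
    t.start()
for t in threads:
    t.join()
```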

Thanks for the Parabola webhook note, I’ll check it out.

The flow looks like this:

Upload file on Bubble > parse file on Bubble using the plugin > send modified parsed data to Parabola as an array via a Bubble API POST to the Parabola webhook URL > Parabola flow > API export to the Bubble API Workflow URL > incoming rows create new records in the database.
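
The “send to Parabola” step is a single POST with the parsed rows as a JSON array. A minimal Python sketch of that one step, with a made-up webhook URL (in Bubble itself this is configured in the API Connector rather than written as code):

```python
import requests

# Hypothetical webhook URL; copy the real one from your Parabola flow.
WEBHOOK_URL = "https://parabola.io/api/flows/hooks/abc123"

# Stand-in for the plugin's parsed CSV output.
parsed_rows = [
    {"name": "Widget A", "price": "9.99"},
    {"name": "Widget B", "price": "4.50"},
]

# One POST with the whole array as the JSON body; Parabola then
# exports each row to the Bubble API Workflow URL at its own pace.
resp = requests.post(WEBHOOK_URL, json=parsed_rows, timeout=60)
resp.raise_for_status()
```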

Edit:
@gilles I have no idea how far you can push this process, but I just tried 500 and it worked like a charm. I would assume that at some point your JSON array would become so many characters that the POST would no longer be able to support it. I don’t know that for a fact, though.
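
If you did hit a payload ceiling, one workaround would be capping each POST by serialized size and sending the array in batches. A rough sketch — the 1 MB cap is an arbitrary guess on my part, not a documented limit:

```python
import json

import requests

WEBHOOK_URL = "https://parabola.io/api/flows/hooks/abc123"  # hypothetical
MAX_BYTES = 1_000_000  # arbitrary guess at a safe payload size

def post_in_batches(rows):
    batch = []
    for row in rows:
        # Flush before the serialized batch would cross the cap, so
        # every POST body stays under MAX_BYTES.
        if batch and len(json.dumps(batch + [row])) > MAX_BYTES:
            requests.post(WEBHOOK_URL, json=batch, timeout=60).raise_for_status()
            batch = []
        batch.append(row)
    if batch:
        requests.post(WEBHOOK_URL, json=batch, timeout=60).raise_for_status()
```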
