Best practices for importing data from an external API

Hello, I'm hitting some possible limitations/roadblocks and wondering what the best practices are for using an external API to insert data into Bubble.

I'm currently using a backend workflow with "Schedule API workflow on a list": it makes the external API call, then schedules a second API workflow that uses "Create a new thing" to insert each item into the database. This worked, but it's quite slow, and it caused data integrity issues when I tried to link a field on one data type to a field on another type.

I also tried fetching the API data and again using "Schedule API workflow on a list", but instead of inserting into my final table and doing any data manipulation during the API call, the raw records go into a processing table first. Once all the records are in the processing table, a workflow deletes all records of a certain type, but for some reason it only deletes about half of them. That workflow then iterates over the new records, makes an API call for each one, inserts data into a different table, and uses the result of that step to populate items in the processing table; this also only works for about half of the items.

Should I be using the Data API to import data from an external API? Are there other approaches that run faster and let the follow-up workflows execute on the whole table of items rather than just half? I'm only working with ~80 records of test data and am already hitting these issues, so I'm starting to wonder whether Bubble is the right tool for my use case.
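For context, here is the kind of thing I was considering with the Data API: its bulk-create endpoint accepts newline-delimited JSON (one object per line, sent as `text/plain`), which would let an external script insert all the records in one request instead of one "Create a new thing" per scheduled workflow. This is only a sketch, and the app name, data type, field names, and token below are placeholders, not my real app:

```python
import json
import urllib.request

# Hypothetical records already fetched from the external API.
records = [
    {"name_text": "Alice", "score_number": 10},
    {"name_text": "Bob", "score_number": 20},
]

def to_bulk_body(items):
    # Bubble's bulk endpoint expects newline-delimited JSON:
    # one JSON object per line, NOT a JSON array.
    return "\n".join(json.dumps(item) for item in items)

def bulk_create(app, data_type, token, items):
    # POST /api/1.1/obj/<type>/bulk on the app's domain
    # ("version-test" targets the development database).
    req = urllib.request.Request(
        f"https://{app}.bubbleapps.io/version-test/api/1.1/obj/{data_type}/bulk",
        data=to_bulk_body(items).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "text/plain",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        # Bubble responds with one status line per submitted record.
        return resp.read().decode("utf-8")
```

If I went this route I'd call `bulk_create("my-app", "processing_item", API_TOKEN, records)` from a small script or a scheduled job outside Bubble. Is that a reasonable pattern, or is there a better way to do the import entirely inside Bubble?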