I’m using a self-built plugin to get data from an external API. The problem I’m running into is that I’m hitting rate limits when importing an individual’s activities.
Here’s what I currently have set up:
- The API returns a batch of activities per request, up to 200. Since I don’t know in advance how many I’ll get, I run a follow-up GET request (using a “Page 2” parameter) whenever a request comes back with the full 200.
- Since the total number of activities is unknown and I want to create a thing for each one, I kick off a Scheduled API Workflow on a List that reads the API call and creates a thing per activity, populated from the List data.
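For clarity, here’s roughly the pagination logic I’m describing, as a sketch (not the actual plugin code; `fetch_page` stands in for whatever call the API Connector makes, and the 200-per-page cap is the only assumption): fetch every page once, collect the results, then create things from that single cached list.

```python
from typing import Callable

PER_PAGE = 200  # the API's per-request cap

def fetch_all_activities(fetch_page: Callable[[int], list[dict]]) -> list[dict]:
    """Call the API once per page; a short page signals the last one."""
    activities, page = [], 1
    while True:
        batch = fetch_page(page)      # one network call per page
        activities.extend(batch)
        if len(batch) < PER_PAGE:     # fewer than 200 back => done
            break
        page += 1
    return activities                 # iterate over THIS list to create things
```

The point being: for 137 activities this should be exactly one API call (or two pages for, say, 337), with all the per-activity record creation reading from the already-fetched list rather than re-requesting anything.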
This works and creates what I expect, but it’s slamming the server I’m hitting: for 137 activities, the API call fires over 1,000 times. Presumably it’s going out and re-fetching the data for each data field, but why wouldn’t it simply read the JSON from the original GET request? Do I need the backend workflow to do something different?
Appreciate any help on this!