javascript – Calling a paginated and rate-limited REST API with a cursor value to get multiple pages of results using fetch

I’m working with a paginated REST API that returns data from a GET request with the following structure:

{
  "data": (
           { "id": "26007494656", "user_id": "23161357"}, 
           { "id": "26007492556", "user_id": "23124357"}...
          ),
  "pagination": {
        "cursor": "eyJiIjpudWxsLCJhIjp7Ik9mZnNldCI6MjB9fQ=="
    }
}

The endpoints can return a maximum of 100 elements in the data array per call and are rate limited to 800 calls per minute. However, at any time there can be 100,000–200,000 data elements available to retrieve. Furthermore, the pagination cursor value is needed on subsequent requests to mark the starting point for the next set of results, and it is updated with every response.
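To put those limits in perspective, here is some back-of-the-envelope arithmetic on the numbers above:

const maxPerCall = 100;          // elements per response
const maxCallsPerMinute = 800;   // rate limit
const maxPerMinute = maxPerCall * maxCallsPerMinute; // 80,000 elements per minute at best
const minSpacingMs = 60000 / maxCallsPerMinute;      // 75 ms between calls to stay under the limit
// so a 100,000–200,000 element backlog would take roughly 1.25–2.5 minutes even at full rate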

While it likely won’t be possible to retrieve all of the data available from this endpoint, I have approached the problem using recursion:

function getAllData (cursor, data = [], counter = 35) {
  while (counter !== 0) {
    const request = new Request(url + (cursor ? '&after=' + cursor : ''), {
      method: 'GET',
      headers: {
        'Client-ID': clientId,
        'Authorization': `Bearer ${access_token}`,
        'Content-Type': 'application/x-www-form-urlencoded; charset=UTF-8'
      }
    });
    return fetch(request).then((response) => response.json()).then((responseJson) => {
      if (counter === 1) return data;
      data.push(...responseJson.data);
      return getAllData(responseJson.pagination.cursor, data, --counter);
    }).catch(showError);
  }
}

I’m wondering whether there is a better way to work with this API, with the goal of retrieving as much data as reasonably possible, or whether this function can be optimized somehow. The counter argument specifies the number of recursive calls to make, and the catch block simply displays an error message to the user. I’m requesting the maximum of 100 objects per call, so the function currently returns an array of approximately 3,500 data elements.
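For reference, an iterative async/await version might look roughly like the sketch below. This is only a sketch, not a drop-in replacement: the getAllDataIterative name is just for illustration, it reuses the url, clientId, and access_token variables from the snippet above, and it omits error handling.

// Sketch: the same paging logic written as a loop instead of recursion.
async function getAllDataIterative (maxPages = 35) {
  const data = [];
  let cursor;
  for (let page = 0; page < maxPages; page++) {
    const response = await fetch(url + (cursor ? '&after=' + cursor : ''), {
      method: 'GET',
      headers: {
        'Client-ID': clientId,
        'Authorization': `Bearer ${access_token}`
      }
    });
    const responseJson = await response.json();
    data.push(...responseJson.data);
    cursor = responseJson.pagination && responseJson.pagination.cursor;
    if (!cursor) break; // stop early if the API signals there are no further pages
  }
  return data;
}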

With the way this function is used in my app, it could be called as many as 4–5 times in a minute, and it seems to take approximately 6–7 seconds to finish its recursive calls and return data. Currently, I’m calling this function from a client-side JavaScript file, but I’m considering moving some of this logic to an Express server.
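If I do move it server-side, I imagine the Express side looking something like the sketch below; the /all-data route name is a placeholder, and it assumes a Node version with a global fetch (or an equivalent HTTP client) so the aggregation function can run unchanged.

const express = require('express');
const app = express();

// Hypothetical endpoint: the browser makes one request here, and the server
// performs the many paginated upstream calls via the aggregation function above.
app.get('/all-data', async (req, res) => {
  try {
    const data = await getAllData();
    res.json(data);
  } catch (err) {
    res.status(502).json({ error: 'Upstream request failed' });
  }
});

app.listen(3000);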