Gracefully Handling Parallel API Requests in JavaScript

There are several approaches for handling concurrent API calls using JavaScript.

One reason you might need to do this is if you need to request data from several different services or use a single service to process different sets of data and you don’t need to wait for all of the requests to complete before you can proceed.

When Sequential Requests Are the Wrong Approach

Let’s say you have a banking application and need to make requests to several different services to return a user’s balance.

You have multiple API calls to make. You probably want to show a loading indicator when the requests begin, because you don’t want to show any zero balances before the user sees the result.

If you make each of those requests sequentially, the wait times add up. If you have 60 requests and each one takes 2 seconds, that is 2 minutes of waiting before the user gets a response.

You probably don’t want your users staring at a never-ending loading icon.

Who enjoys that?!

So, the wrong approach here would be to use the await keyword inside a for loop. Each request must finish before the next one starts, so the total time grows linearly with the number of requests.

app.get("/api/accounts", async (req, res) => {
  const balances = [];

  const accounts = getAccounts(req.query.accountId);

  for (let i = 0; i < accounts.length; i++) {
    // Each iteration waits for the previous request to finish
    const { data } = await axios.get(
      "https://somemockapi.com/accounts?id=" + accounts[i].id
    );

    balances.push(data.balance);
  }

  const balance = balances.reduce((pv, cv) => pv + cv, 0);

  return res.send({ balance: balance });
});

Use Promise.all() to Run API Calls Concurrently

Assuming no request depends on the result of a previous one, you can run them all in parallel using Promise.all().

Promise.all() resolves to an array containing the results of all the concurrently run requests, in the same order as the promises you passed in.

Keep in mind that Promise.all() will only give you the results once every request has finished; if any one of them fails, it rejects immediately instead.

app.get("/api/accounts", async (req, res) => {
  const accounts = getAccounts(req.query.accountId);

  // Start every request at once; axios.get returns a promise immediately
  const requests = accounts.map((account) =>
    axios.get("https://somemockapi.com/accounts?id=" + account.id)
  );

  try {
    const results = await Promise.all(requests);

    const balance = results
      .map(({ data }) => data.balance)
      .reduce((pv, cv) => pv + cv, 0);

    return res.send({ balance: balance });
  } catch (err) {
    return res.status(500).json({ error: String(err) });
  }
});

Returning to the previous example, if each 2 second call is made in parallel, the total time for a response is only 2 seconds. Phew!

Throttling Parallel Requests for Rate Limiting

If you are constrained by an API rate limit, then firing every request at once with Promise.all() will not solve all of your problems.

Instead, consider batching requests manually or using an external library to help.
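Manual batching can be as simple as slicing the work into fixed-size chunks and awaiting each chunk in turn. Here is a minimal sketch; `inBatches` and `fetchBalance` are hypothetical names, not part of any library:

```javascript
// Process items in fixed-size chunks, awaiting each chunk before
// starting the next, so only `batchSize` requests run at once.
async function inBatches(items, batchSize, fetchBalance) {
  const results = [];

  for (let i = 0; i < items.length; i += batchSize) {
    const batch = items.slice(i, i + batchSize);

    // Only this chunk's requests are in flight at any one time
    const batchResults = await Promise.all(batch.map(fetchBalance));
    results.push(...batchResults);
  }

  return results;
}
```

The trade-off is that each batch waits on its slowest request before the next batch starts, which is why a concurrency limiter is often the better tool.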

Consider the following example using the [p-limit](https://www.npmjs.com/package/p-limit) library:

import pLimit from 'p-limit';

// Allow at most 6 requests in flight at a time
const limit = pLimit(6);

const requests = accounts.map((account) =>
  limit(() => axios.get("https://somemockapi.com/accounts?id=" + account.id))
);

const result = await Promise.all(requests);

// ...

If you need finer control over batching, look into the [p-queue](https://www.npmjs.com/package/p-queue) library, which has built-in support for rate-limiting intervals.

Handling Failure in Asynchronous Calls

One thing to be aware of is that Promise.all() rejects as soon as any one of the asynchronous calls fails: you lose the results of the calls that succeeded, and the rejection reason alone may not tell you which request failed.
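If you still want the balances that did come back when one service fails, Promise.allSettled() waits for every promise to settle and reports each outcome instead of rejecting on the first failure. A minimal sketch, where `sumAvailableBalances` is a hypothetical helper and the promises stand in for real axios calls:

```javascript
// Promise.allSettled() never rejects: each outcome is either
// { status: "fulfilled", value } or { status: "rejected", reason }.
async function sumAvailableBalances(requests) {
  const outcomes = await Promise.allSettled(requests);

  const failed = outcomes.filter((o) => o.status === "rejected").length;

  const balance = outcomes
    .filter((o) => o.status === "fulfilled")
    .map((o) => o.value.balance)
    .reduce((pv, cv) => pv + cv, 0);

  return { balance, failed };
}
```

With this shape you can show the user a partial total alongside a warning about the requests that failed.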

Therefore, it is important to have a strategy in place for retrying the call if it fails.
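A retry strategy can be as small as a wrapper that re-invokes the request a few times with a growing delay. A minimal sketch, where `withRetry`, the retry count, and the delay are all illustrative assumptions rather than part of any library:

```javascript
// Retry a promise-returning function, waiting a little longer
// after each failure before trying again.
async function withRetry(makeRequest, retries = 3, delayMs = 500) {
  for (let attempt = 1; attempt <= retries; attempt++) {
    try {
      return await makeRequest();
    } catch (err) {
      // Out of attempts: surface the last error to the caller
      if (attempt === retries) throw err;

      await new Promise((resolve) => setTimeout(resolve, delayMs * attempt));
    }
  }
}

// Wrapping each request this way keeps one transient failure
// from sinking the whole Promise.all(), e.g.:
// const requests = accounts.map((account) =>
//   withRetry(() => axios.get("https://somemockapi.com/accounts?id=" + account.id))
// );
```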

Hopefully, this was a simple explanation of how you can use a little code to improve your application's speed and performance.

I recently used this approach when working on an application that uses the OpenAI API since there is a limit on how large an input can be. I hope the provided examples are enough to adapt to your needs. Let me know if you have any other unique solutions for handling parallel/concurrent requests in JavaScript.
