[Code Appreciation] Simple and elegant JavaScript snippets: succeed as soon as any parallel request succeeds, and run asynchronous requests sequentially

Reprinted: https://segmentfault.com/a/1190000023037214

The first successful Promise

Get the first "successful" result from a set of Promises, while keeping both the speed of concurrent execution and fault tolerance.

Promise.race does not meet the requirement, because if any Promise rejects first, the combined Promise rejects immediately;
Promise.all doesn't suffice either, because it waits for every Promise and requires all of them to resolve successfully.
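
A minimal sketch of the Promise.race problem (the setTimeout stubs are just for illustration): race settles with whichever Promise settles first, even when that is a rejection:

const fail = new Promise((_, rej) => setTimeout(() => rej(new Error('fast failure')), 100))
const succeed = new Promise((res) => setTimeout(() => res('slow success'), 500))

Promise.race([fail, succeed])
  .then((val) => console.log(val))
  .catch((err) => console.log(err.message)) // logs "fast failure", even though a success was coming

The trick below inverts each Promise so that Promise.all delivers exactly the behavior we want: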
function firstSuccess(promises){
  return Promise.all(promises.map(p => {
    // If a request fails, count that as a resolution so it will keep
    // waiting for other possible successes. If a request succeeds,
    // treat it as a rejection so Promise.all immediately bails out.
    return p.then(
      val => Promise.reject(val),
      err => Promise.resolve(err)
    );
  })).then(
    // If '.all' resolved, we've just got an array of errors.
    errors => Promise.reject(errors),
    // If '.all' rejected, we've got the result we wanted.
    val => Promise.resolve(val)
  );
}

Invert each Promise: a resolved Promise becomes rejected, a rejected Promise becomes resolved, and the inverted Promises are combined with Promise.all.
Now, as soon as one original Promise resolves, Promise.all rejects immediately with that value, giving us an early exit. Ingenious!
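
A usage sketch (the mirror URLs here are made up for illustration): fetch the same resource from several mirrors and take whichever answers successfully first:

const mirrors = [
  'https://mirror-a.example.com/data',
  'https://mirror-b.example.com/data',
  'https://mirror-c.example.com/data',
]

firstSuccess(
  mirrors.map((url) =>
    fetch(url).then((r) => {
      if (!r.ok) throw new Error(`HTTP ${r.status}`)
      return r.json()
    })
  )
)
  .then((data) => console.log('first successful mirror:', data))
  .catch((errors) => console.log('all mirrors failed:', errors))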

Scenarios where this method is suitable:

  • There are several ways to get the result, any one of them will do, and it doesn't matter if some of them fail
  • To get the result faster, take multiple paths concurrently instead of trying them one after another in a waterfall

Referenced from https://stackoverflow.com/a/3...

Update 28 Aug 2020: It is now natively implemented in newer browsers: Promise.any
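
With native support, the helper above essentially collapses to Promise.any, which resolves with the first fulfilled Promise and rejects with an AggregateError if every input rejects:

Promise.any(promises)
  .then((val) => console.log('first success:', val))
  .catch((aggregateError) => console.log('all failed:', aggregateError.errors))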

Asynchronous reduce

Sometimes business logic requires us to process multiple items serially rather than concurrently as above; that is, to reduce an array to a single value through a waterfall of asynchronous operations. Here array.reduce can be put to clever use:

(async () => {
  const data = [1, 2, 3]
  const result = await data.reduce(async (accumP, current, index) => {
    // Each step has to wait for the previous one to complete
    const accum = await accumP
    const next = await apiCall(accum, current)
    return next
  }, 0)
  console.log(result) // 6

  async function apiCall(a, b) {
    return new Promise((res) => {
      setTimeout(() => res(a + b), 300)
    })
  }
})()

Compare this with the more common array.map + Promise.all approach:

(async () => {
  const data = [1, 2, 3]
  const result = await Promise.all(
    data.map(async (current, index) => {
      // Processing is concurrent
      return apiCall(current)
    })
  )
  console.log(result)

  async function apiCall(a) {
    return new Promise((res) => {
      setTimeout(() => {
        res(a * 2)
      }, 300)
    })
  }
})()

  • What the two approaches share: each array item is processed, and the failure of any single item fails the whole operation
  • The reduce approach is a waterfall (serial) while the map approach is concurrent: with three 300 ms calls, the reduce version takes roughly 900 ms and the map version roughly 300 ms (see the timing sketch below)
  • In the reduce approach, when processing the nth item you can use the accumulated result of the previous items
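
To make the timing claim concrete, here is a minimal measurement sketch (same 300 ms delays as the examples above); with three items, the waterfall version takes roughly 900 ms and the concurrent one roughly 300 ms:

(async () => {
  const data = [1, 2, 3]
  const delay = (ms) => new Promise((res) => setTimeout(res, ms))

  // Waterfall: each step waits for the previous one, ~3 x 300 ms total
  let start = Date.now()
  await data.reduce(async (accumP, current) => {
    const accum = await accumP
    await delay(300)
    return accum + current
  }, 0)
  console.log('reduce:', Date.now() - start, 'ms') // ~900

  // Concurrent: all steps run at once, ~300 ms total
  start = Date.now()
  await Promise.all(data.map(async (current) => {
    await delay(300)
    return current * 2
  }))
  console.log('map:', Date.now() - start, 'ms') // ~300
})()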

Referenced from https://stackoverflow.com/a/4...

Posted by kerplunk on Thu, 19 May 2022 06:37:34 +0300