Javascript - Using Async - Await With A Foreach Loop - Stack Overflow
Asked 6 years, 2 months ago · Modified 1 month ago · Viewed 1.5m times
Are there any issues with using async / await in a forEach loop? I'm trying to loop
through an array of files and await on the contents of each file.
2549
import fs from 'fs-promise'
printFiles()
This code does work, but could something go wrong with this? I had someone tell me
that you're not supposed to use async / await in a higher-order function like this, so I
just wanted to ask if there was any issue with this.
edited Mar 12, 2021 at 12:19 · asked Jun 1, 2016 at 18:55
33 Answers (sorted by highest score)
4627
Sure the code does work, but I'm pretty sure it doesn't do what you expect it to do. It just fires off multiple asynchronous calls, but the printFiles function does return immediately after that.
Reading in sequence
If you want to read the files in sequence, you indeed cannot use forEach. Just use a modern for … of loop instead, in which await will work as expected:
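A runnable sketch of the sequential version; since getFilePaths and the real files aren't shown, file reads are mocked here with timed promises:

```javascript
// Mock of fs.readFile: resolves with the file's name after file.ms milliseconds.
const delay = (ms, value) => new Promise(resolve => setTimeout(() => resolve(value), ms));
const readFileMock = (file) => delay(file.ms, file.name);

async function printFilesInSequence(files) {
  const order = [];
  for (const file of files) {
    // await pauses the loop: the next read starts only after this one finishes.
    const contents = await readFileMock(file);
    console.log(contents);
    order.push(contents);
  }
  return order;
}

// Even though 'a' is much slower, it is read and printed first.
printFilesInSequence([{ name: 'a', ms: 50 }, { name: 'b', ms: 5 }]);
```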
Reading in parallel
If you want to read the files in parallel, you again cannot use forEach. Each of the async callback calls does return a promise, but you're throwing them away instead of awaiting them. Just use map instead, and then you can await the array of promises that you get with Promise.all :
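A matching sketch of the parallel version, with the same mocked timed reads (completionOrder records when each read finishes, just to show that the reads overlap):

```javascript
const delay = (ms, value) => new Promise(resolve => setTimeout(() => resolve(value), ms));
const readFileMock = (file) => delay(file.ms, file.name);

async function printFilesInParallel(files) {
  const completionOrder = [];
  // map starts every read immediately; Promise.all waits for all of them.
  await Promise.all(files.map(async (file) => {
    const contents = await readFileMock(file);
    console.log(contents);
    completionOrder.push(contents);
  }));
  return completionOrder;
}

// The fast file 'b' finishes (and prints) before the slow file 'a'.
printFilesInParallel([{ name: 'a', ms: 50 }, { name: 'b', ms: 5 }]);
```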
78 Could you please explain why does for ... of ... work?
– Demonbane
Aug 15, 2016 at 18:04
197 ok i know why... Using Babel will transform async / await to a generator function, and using forEach means that each iteration has an individual generator function, which has nothing to do with the others. So they will be executed independently and have no context of next() with the others. Actually, a simple for() loop also works, because the iterations are in one single generator function.
– Demonbane
Aug 15, 2016 at 19:21
42 @Demonbane: In short, because it was designed to work :-) await suspends the current
function evaluation, including all control structures. Yes, it is quite similar to generators in
that regard (which is why they are used to polyfill async/await).
– Bergi
Aug 15, 2016 at 23:28
5 @arve0 Not really, an async function is quite different from a Promise executor callback,
but yes the map callback returns a promise in both cases.
– Bergi
Mar 29, 2017 at 16:25
5 @Taurus If you don't intend to await them, then for…of would work equally to forEach .
No, I really mean that paragraph to emphasise that there is no place for .forEach in
modern JS code.
– Bergi
Mar 20, 2018 at 13:24
With ES2018, you are able to greatly simplify all of the above answers to:
Simplified:
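The simplified form referred to here is presumably the for await...of pattern. A runnable sketch with mocked reads (the real answer reads files with fs.readFile):

```javascript
const delay = (ms, value) => new Promise(resolve => setTimeout(() => resolve(value), ms));
const readFileMock = (file) => delay(file.ms, file.name);

async function printFiles(files) {
  const printed = [];
  // map starts all reads at once; for await...of then consumes the
  // resulting promises in array order, awaiting each one.
  for await (const contents of files.map(file => readFileMock(file))) {
    console.log(contents);
    printed.push(contents);
  }
  return printed;
}

// Reads run in parallel, but output order still follows the array.
printFiles([{ name: 'a', ms: 50 }, { name: 'b', ms: 5 }]);
```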
2018-09-10: This answer has been getting a lot of attention recently, please see Axel
Rauschmayer's blog post for further information about asynchronous iteration.
edited Jun 22 at 17:42 · answered Jun 15, 2018 at 11:17 – Steffan, Cisco
9 I don't think this answer addresses the initial question. for-await-of with a synchronous iterable (an array in our case) doesn't cover the case of iterating concurrently over an array using asynchronous operations in each iteration. If I'm not mistaken, using for-await-of with a synchronous iterable over non-promise values is the same as using a plain for-of .
– Antonio Val
Jan 9, 2019 at 10:30
2 How do we delegate the files array to fs.readFile here? Is it taken from the iterable?
– Vadim Shvetsov
Jan 17, 2019 at 13:34
1 Using this solution, each iteration awaits the previous one, and in case the operation makes some long calculation or reads a long file, it blocks the execution of the next, as opposed to mapping all the functions to promises and waiting for them to complete.
– Rafi Henig
Sep 11, 2019 at 1:07
3 This answer has the same issue as the OP: It accesses all files in parallel. The serialized printing
of results merely hides it.
– jib
Feb 18, 2021 at 13:52
edited Mar 3, 2019 at 3:38 · answered Mar 26, 2018 at 19:48 – Timothy Zorn
2 This works perfectly, thank you so much. Could you explain what is happening here with
Promise.resolve() and await promise; ?
– parrker9
Mar 28, 2018 at 20:48
2 This is pretty cool. Am I right in thinking the files will be read in order and not all at once?
– GollyJer
Jun 9, 2018 at 0:24
2 @Shay, You mean sequential, not synchronous. This is still asynchronous - if other things are
scheduled, they will run in between the iterations here.
– Timothy Zorn
May 30, 2019 at 16:51
4 If you need the async processes to finish as quickly as possible and you don't care about them
being completed sequentially, try one of the provided solutions with a good amount of upvotes
which uses Promise.all . Example: Promise.all(files.map(async (file) => { /*
code */ }));
– Timothy Zorn
Jan 31, 2020 at 16:03
The p-iteration module on npm implements the Array iteration methods so they can be
used in a very straightforward way with async/await.
52
An example with your case:
edited Oct 17, 2019 at 5:59 · answered Jul 10, 2017 at 8:15 – Antonio Val
Here are some forEachAsync prototypes. Note you'll need to await them:
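The prototypes themselves didn't survive extraction; they presumably look like this minimal sketch, with a serial variant and a parallel variant:

```javascript
// Serial: awaits each callback before starting the next.
Array.prototype.forEachAsync = async function (fn) {
  for (const item of this) {
    await fn(item);
  }
};

// Parallel: starts every callback at once, then awaits them all.
Array.prototype.forEachAsyncParallel = async function (fn) {
  await Promise.all(this.map(fn));
};
```

Both return a promise, which is why the answer stresses that you'll need to await them, e.g. `await items.forEachAsync(async item => { ... })`.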
Note while you may include this in your own code, you should not include this in
libraries you distribute to others (to avoid polluting their globals).
edited Apr 26, 2019 at 9:47 · answered Mar 22, 2018 at 15:11 – mikemaccana, Matt
@Matt, isn't it a problem to await fn in case it wasn't asynchronous? what if the given input
was a synchronous function? stackoverflow.com/a/53113299/18387350
– Normal
Jun 22 at 4:43
43
Background: I was in a similar situation last night. I used an async function as the forEach argument. The result was unpredictable. When I tested my code 3 times, it ran without issues 2 times and failed 1 time (something weird).
Finally I got my head around it & did some scratch-pad testing.
main();
main();
If you are a little old-school like me, you could simply use the classic for loop; that works too :)
main();
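The classic-for-loop version shown in the answer's screenshot presumably follows this shape; here is a self-contained sketch with a mocked async call standing in for the real one:

```javascript
// Mocked async call standing in for the screenshot's example.
const getPromise = (value, ms) =>
  new Promise(resolve => setTimeout(() => resolve(value), ms));

async function classicForLoop() {
  const results = [];
  const items = ['a', 'b', 'c'];
  // await behaves as expected inside a classic for loop:
  for (let i = 0; i < items.length; i++) {
    results.push(await getPromise(items[i], 10));
  }
  return results;
}

classicForLoop().then(results => console.log(results));
```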
7 If anyone is wondering what vscode theme that is – it's GitHub's official light theme. And if anyone hurt their eyes with such a bright snapshot, my apologies 😅
– krupesh Anadkat
May 19, 2021 at 4:48
I suggest using the phrase 'Before/After Loop' would make it less confusing when it's not a 'For
Each Loop'.
– close
Mar 2 at 11:49
The brother is out here just writing code using Githubs official like an absolute heathen. I'm not
even mad. To each their own. Nonetheless, I would cache the length to speed that for loop
up and prevent recalculations between every iteration.
– User_coder
Apr 20 at 0:03
@Bergi has already given the answer on how to handle this particular case properly. I'll not duplicate it here.
26
I'd like to address the difference between using forEach and a for loop when it comes to async and await .
So, basically the callback returns a promise since it's declared with async . Inside forEach , the callback is just called in a normal way; if the callback itself returns a promise, the JavaScript engine will not wait for it to be resolved or rejected. Instead, it puts the promise in a job queue and continues executing the loop.
Basically, when your async callback gets the chance to be executed, the JS engine will pause until fs.readFile(file, 'utf8') is resolved or rejected, and will resume execution of the async function after fulfillment. So the contents variable stores the actual result from fs.readFile , not a promise , and console.log(contents) logs out the file content, not a Promise .
When we write a generic for...of loop, we gain more control than with forEach . Let's refactor printFiles .
When evaluating the for loop, we have await promise inside the async function, so execution will pause until the await promise is settled. So, you can think of it as the files being read one by one in a determined order.
Execute sequentially
Sometimes, we really need the async functions to be executed in sequential order. For example, I have a few new records stored in an array to be saved to a database, and I want them saved in sequential order, which means the first record in the array should be saved first, then the second, until the last one is saved.
Here is an example:
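The example itself is elided above; a sketch of the idea, with a hypothetical saveToDatabase standing in for the real database call:

```javascript
// Hypothetical async save: records the order in which saves complete.
const saved = [];
const saveToDatabase = (record) =>
  new Promise(resolve => setTimeout(() => { saved.push(record); resolve(record); }, 10));

async function saveAllInOrder(records) {
  for (const record of records) {
    // Record N+1 is not sent until record N has been saved.
    await saveToDatabase(record);
  }
}

saveAllInOrder(['first', 'second', 'third'])
  .then(() => console.log(saved));
```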
This solution is also memory-optimized so you can run it on 10,000's of data items and
requests. Some of the other solutions here will crash the server on large data sets.
24
In TypeScript:
How to use?
edited Oct 12, 2021 at 11:25 · answered Apr 16, 2020 at 17:18 – mudamudamuda, Oliver Dixon
I think it will be helpful if you can complete this example :) in the how to use section. For my
case: await asyncForEach(configuration.groupNames, async (groupName) => { await
AddUsersToGroup(configuration, groupName); })
– Ido Bleicher
Oct 14, 2021 at 8:10
edited Feb 1 at 19:34 · answered Feb 1 at 19:28 – Yilmaz
16
In addition to @Bergi’s answer, I’d like to offer a third alternative. It's very similar to @Bergi’s 2nd example, but instead of awaiting each readFile individually, you create an array of promises, all of which you await at the end.
contents.forEach(console.log);
}
Note that the function passed to .map() does not need to be async , since
fs.readFile returns a Promise object anyway. Therefore promises is an array of
Promise objects, which can be sent to Promise.all() .
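Spelled out as a runnable sketch (reads are mocked with timed promises, since getFilePaths isn't shown):

```javascript
const delay = (ms, value) => new Promise(resolve => setTimeout(() => resolve(value), ms));
const readFileMock = (file) => delay(file.ms, file.name);

async function printFiles(files) {
  // Start all reads at once; the callback is not async because
  // readFileMock already returns a promise.
  const promises = files.map(file => readFileMock(file));
  // Promise.all preserves input order, regardless of completion order.
  const contents = await Promise.all(promises);
  contents.forEach(c => console.log(c));
  return contents;
}

printFiles([{ name: 'a', ms: 50 }, { name: 'b', ms: 5 }]);
```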
In @Bergi’s answer, the console may log file contents in the order they’re read. For
example if a really small file finishes reading before a really large file, it will be logged
first, even if the small file comes after the large file in the files array. However, in my
method above, you are guaranteed the console will log the files in the same order as
the provided array.
edited Oct 11, 2019 at 0:57 · answered Feb 23, 2018 at 0:47 – chharvey
A simple drop-in solution for replacing a forEach() await loop that is not working is
replacing forEach with map and adding Promise.all( to the beginning.
13
For example:
to
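Both snippets are elided above; the transformation being described is along these lines (processFile is a hypothetical async operation):

```javascript
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));
const processed = [];
const processFile = async (file) => { await delay(10); processed.push(file); };

// Before: forEach discards the promises, so this resolves immediately,
// before any file has been processed.
async function brokenVersion(files) {
  files.forEach(async (file) => {
    await processFile(file);
  });
}

// After: map collects the promises and Promise.all awaits them all.
async function fixedVersion(files) {
  await Promise.all(files.map(async (file) => {
    await processFile(file);
  }));
}
```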
edited Feb 10, 2021 at 19:00 · answered Jan 27, 2021 at 21:57 – yeah22
Not quite the same. Promise.all will run all the promises concurrently. A for loop is meant to be
sequential.
– srmark
Oct 7, 2021 at 11:06
It's pretty painless to pop a couple of methods into a file that will handle asynchronous data in a serialized order and give a more conventional flavour to your code. For example:
11
module.exports = function () {
var self = this;
...
/* your server setup here */
...
var MyAsync = require('./myAsync');
var Cat = require('./models/Cat');
var Doje = require('./models/Doje');
var example = async () => {
  var myAsync = new MyAsync();
  var doje = await Doje.findOne({ name: 'Doje', noises: [] }).save();
  var cleanParams = [];

  // REDUCE EXAMPLE
  var friendsOfNyanCat = await myAsync.reduce(cat.friends, async (catArray, friendId) => {
    var friend = await Friend.findById(friendId);
    if (friend.name !== 'Long cat') {
      catArray.push(friend.name);
    }
  }, []);

  // Assuming Long Cat was a friend of Nyan Cat...
  assert(friendsOfNyanCat.length === (cat.friends.length - 1));
}
edited Sep 26, 2017 at 9:07 · answered Sep 22, 2017 at 23:03 – Jay Edwards
return fileContents;
}));
Note:
fs.readFile (from require('fs') ) requires a callback function as its 3rd argument, otherwise it throws an error:
answered May 26, 2019 at 22:08 – master_dodo
9
It is not good to call an asynchronous method from a loop. This is because each loop iteration is delayed until the entire asynchronous operation completes. That is not very performant. It also averts the parallelization advantages of async / await .
A better solution would be to create all promises at once, then get access to the results
using Promise.all() . Otherwise, each successive operation will not start until the
previous one has completed.
10 It is also not good to open thousands of files at once to read them concurrently. One always
has to do an assessment whether a sequential, parallel, or mixed approach is better. Sequential
loops are not fundamentally bad, await actually makes them possible in the first place. Also
they do not "avert the benefits" of asynchronous execution, as you can still run multiple such
loops at once (e.g. two concurrent calls to printFiles ).
– Bergi
Apr 2, 2021 at 15:46
One important caveat: the await + for..of method and the forEach + async way actually have different effects.
8
Having await inside a real for loop will make sure all async calls are executed one by one. The forEach + async way fires off all promises at the same time, which is faster but sometimes overwhelming (if you do some DB queries or visit some web services with volume restrictions and do not want to fire 100,000 calls at a time).
You can also use reduce + promise (less elegant) if you do not use async/await and
want to make sure files are read one after another.
Or you can create a forEachAsync to help, but it basically uses the same for loop underneath.
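The reduce + promise approach mentioned here can be sketched like this (reads mocked with timed promises):

```javascript
const delay = (ms, value) => new Promise(resolve => setTimeout(() => resolve(value), ms));
const readFileMock = (file) => delay(file.ms, file.name);

function printFilesInSequence(files) {
  // Each .then chains off the previous read, so files are read one after another.
  return files.reduce(
    (chain, file) => chain.then(() => readFileMock(file)).then(contents => console.log(contents)),
    Promise.resolve()
  );
}

printFilesInSequence([{ name: 'a', ms: 30 }, { name: 'b', ms: 5 }]);
```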
You can use Array.prototype.reduce in a way that uses an async function. I've shown an
example in my answer: stackoverflow.com/a/49499491/2537258
– Timothy Zorn
Mar 26, 2018 at 19:54
7
Both the solutions above work; however, Antonio's does the job with less code. Here is how it helped me resolve data from my database, from several different child refs, then pushing it all into an array and resolving it in a promise after all is done:
Promise.all(PacksList.map((pack)=>{
return fireBaseRef.child(pack.folderPath).once('value',(snap)=>{
snap.forEach( childSnap => {
const file = childSnap.val()
file.id = childSnap.key;
allItems.push( file )
})
})
})).then(()=>store.dispatch( actions.allMockupItems(allItems)))
answered Aug 26, 2017 at 10:47 – Hooman Askari
7 The parallel-reading syntax in the original answer is sometimes confusing and difficult to read; maybe we can write it in a different way:
// assumes: const fileReadPromises = []; and an async helper
// readAndLogFile(file) that reads the file and logs its contents
files.forEach(file => {
fileReadPromises.push(readAndLogFile(file));
});
await Promise.all(fileReadPromises);
}
For sequential operation, not just for...of : a normal for loop will also work.
const printFiles = async () => {
  const files = await getFilePaths();
  const printContents = await readFilesQueue(files)
  return printContents
}

printFiles()
PS
Therefore, the code can simply be designed as that: three separate functions that are "pure"** and introduce no side effects, process the entire list, and can easily be modified to handle failed cases.
readFiles(files)
Node supports top-level await (this doesn't need a plugin, and can be enabled via harmony flags); it's cool but doesn't solve one problem (strategically I work only on LTS versions). How do we get the files?
Using composition. Given the code, it gives me the sensation that this is inside a module, so it should have a function to do it. If not, you should use an IIFE to wrap the role code into an async function, creating a simple module that does it all for you; or you can go the right way: composition.
// more complex version with IIFE to a single module
(async (files) => readFiles(await files()))(getFilesPath)
Note that the name of the variable changes due to semantics. You pass a functor (a function that can be invoked by another function) and receive a pointer in memory that contains the initial block of logic of the application.
But what if it's not a module and you need to export the logic?
* by side effect means any collateral effect of the application that can change the state/behaviour or introduce bugs in the application, like IO.
** by "pure", it's in quotes since the functions are not pure; the code can be converged to a pure version when there's no console output, only data manipulations.
Aside from this, to be pure, you'll need to work with monads that handle the side effect, that are error-prone, and treat that error separately from the application.
edited May 3, 2020 at 14:46 · answered Dec 21, 2019 at 1:11 – lukaswilkeer
Today I came across multiple solutions for this: running async/await functions in a forEach loop. By building a wrapper around it, we can make this happen.
6
A more detailed explanation of how it works internally for the native forEach , why it is not able to make an async function call, and other details on the various methods are provided in the link here.
The multiple ways in which it can be done are as follows:
Array.prototype.forEachAsync.js
if(!Array.prototype.forEachAsync) {
Array.prototype.forEachAsync = function (fn){
return new Promise((resolve,reject)=>{
this.forEach(async(item,index,array)=>{
await fn(item,index,array);
if(index === array.length-1){
resolve('done');
}
})
});
};
}
Usage :
require('./Array.prototype.forEachAsync');
let count = 0;
someAPICall = () => {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve("done") // or reject('error')
}, 100);
})
}
hello(['', '', '', '']); // hello([]) empty array is also be handled by default
Method 3 :
Using Promise.all
await someAPICall();
count++;
}
for(let i=0;i<items.length;i++){
await someAPICall();
count++;
}
edited Nov 27, 2019 at 5:42 · answered Nov 24, 2019 at 20:31 – PranavKAndro
Your methods 1 and 2 are simply incorrect implementations where Promise.all should have
been used - they do not take any of the many edge cases into account.
– Bergi
Nov 24, 2019 at 21:48
@Bergi: Thanks for the valid comments. Would you please explain why methods 1 and 2 are incorrect? They also serve the purpose, and work very well. This is to say that all these methods are possible; based on the situation, one can decide on choosing one. I have a running example for the same.
– PranavKAndro
Nov 25, 2019 at 14:15
It fails on empty arrays, it doesn't have any error handling, and probably more problems. Don't
reinvent the wheel. Just use Promise.all .
– Bergi
Nov 25, 2019 at 15:25
In certain conditions where it's not possible, it will be helpful. Also, error handling is done by the forEach API by default, so no issues; it's taken care of!
– PranavKAndro
Nov 26, 2019 at 5:57
No, there are no conditions where Promise.all is not possible but async / await is. And
no, forEach absolutely doesn't handle any promise errors.
– Bergi
Nov 26, 2019 at 13:22
Here is a way to read and print each file in series using Array.prototype.forEach
const promises = []
files.forEach((file) => {
promises.push(
fs.readFile(file, 'utf8').then((contents) => {
console.log(contents)
})
)
})
await Promise.all(promises)
}
answered May 20, 2020 at 20:57 – richytong
1 The first scenario is ideal for loops that need to be run in series and where you can't use for...of .
– Mark Odey
May 29, 2020 at 19:03
Currently the Array.prototype.forEach method doesn't support async operations, but we can create our own poly-fill to meet our needs.
5
// Example of asyncForEach Array poly-fill for NodeJs
// file: asyncForEach.js
// Define asynForEach function
async function asyncForEach(iteratorFunction){
let indexer = 0
for(let data of this){
await iteratorFunction(data, indexer)
indexer++
}
}
// Append it as an Array prototype property
Array.prototype.asyncForEach = asyncForEach
module.exports = {Array}
And that's it! You now have an async forEach method available on any arrays that are defined after these two operations.
let questions = [
`What's your name`
,`What's your favorite programming language`
,`What's your favorite async function`
]
let responses = {}
We could do the same for some of the other array functions like map...
async function asyncMap(iteratorFunction){
let newMap = []
let indexer = 0
for(let data of this){
newMap[indexer] = await iteratorFunction(data, indexer, this)
indexer++
}
return newMap
}
Array.prototype.asyncMap = asyncMap
... and so on :)
edited Apr 26, 2019 at 9:22 · answered Mar 12, 2019 at 23:31 – mikemaccana, Beau
To see how that can go wrong, print console.log at the end of the method.
Arbitrary order.
Poor performance.
These are not always wrong but frequently are in standard use cases.
Generally, using forEach will result in all but the last. It'll call each function without awaiting it, meaning it tells all of the functions to start and then finishes without waiting for the functions to finish.
printFiles()
This is an example in native JS that will preserve order, prevent the function from
returning prematurely and in theory retain optimal performance.
This will:
With this solution the first file will be shown as soon as it is available without having to
wait for the others to be available first.
It will also be loading all files at the same time rather than having to wait for the first to
finish before the second file read can be started.
The only drawback of this and the original version is that if multiple reads are started at once, then it's more difficult to handle errors, on account of having more errors that can happen at a time.
With versions that read a file at a time, they will stop on a failure without wasting time trying to read any more files. Even with an elaborate cancellation system, it can be hard to avoid failing on the first file while having already read most of the other files as well.
Performance is not always predictable. While many systems will be faster with parallel
file reads some will prefer sequential. Some are dynamic and may shift under load,
optimisations that offer latency do not always yield good throughput under heavy
contention.
There is also no error handling in that example. If something requires them to either all
be successfully shown or not at all it won't do that.
In-depth experimentation is recommended, with console.log at each stage and fake file-read solutions (random delay instead). Although many solutions appear to do the same in simple cases, all have subtle differences that take some extra scrutiny to squeeze out.
Use this mock to help tell the difference between solutions:
(async () => {
  const start = +new Date();
  const mock = () => {
    return {
      fs: {readFile: file => new Promise((resolve, reject) => {
        // Instead of this just make three files and try each timing arrangement.
        // IE, all same, [100, 200, 300], [300, 200, 100], [100, 300, 200], etc.
        const time = Math.round(100 + Math.random() * 4900);
        console.log(`Read of ${file} started at ${new Date() - start} and will take ${time}ms.`)
        setTimeout(() => {
          // Bonus material here if random reject instead.
          console.log(`Read of ${file} finished, resolving promise at ${new Date() - start}.`);
          resolve(file);
        }, time);
      })},
      console: {log: file => console.log(`Console Log of ${file} finished at ${new Date() - start}.`)},
      getFilePaths: () => ['A', 'B', 'C', 'D', 'E']
    };
  };
edited Oct 14, 2019 at 19:16 · answered Oct 14, 2019 at 18:35 – jgmjgm
4
async function printFiles() {
  const files = await getFiles();
  List(files).traverse( Task.of, f => readFile( f, 'utf-8'))
    .fork( console.error, console.log)
}
// 90% of encodings are utf-8, making that use case super easy is prudent
// handy-library.js
export const readFile = f =>
future(fs.readFile)( f, 'utf-8' )
PS - I didn't try this code on the console, might have some typos... "straight freestyle,
off the top of the dome!" as the 90s kids would say. :-p
edited Apr 3, 2018 at 22:51 · answered Feb 28, 2018 at 4:41 – Babakness
answered Feb 11, 2021 at 1:45 – ChenZeTong
The OP's original question
3
Are there any issues with using async/await in a forEach loop? ...
1. Printing out of order -- a quote from chharvey's answer above:
For example if a really small file finishes reading before a really large file, it will be logged first, even if the small file comes after the large file in the files array.
2. Possibly opening too many files at once -- a comment by Bergi under another answer
So let's address these issues showing actual code that is brief and concise, and does
not use third party libraries. Something easily cut, paste, and modifiable.
Reading in parallel (all at once), printing in serial (as early as possible per
file).
The easiest improvement is to perform full parallelism as in @Bergi's answer, but with a small change so that each file is printed as soon as possible while preserving order.
Above, two separate branches are run concurrently.
branch 1: Reading in parallel, all at once,
branch 2: Reading in serial to force order, but waiting no longer than necessary
A "concurrency limit" means that no more than N files will ever be read at the same time.
Like a store that only allows in so many customers at a time (at least during COVID).
promise transitions from the first to the second state when boot() is called.
The difference now is that never more than concurLimit promises are allowed to run concurrently.
set : The promises are in a random-access container so that they can be easily removed once fulfilled. This container is used only in branch 1.
bootableProms : These are the same promises as initially in set , but it is an array, not a set, and the array is never changed. It is used only in branch 2.
Running with a mock fs.readFile that takes times as follows (filename vs. time in ms).
const timeTable = {
  "1": 600,
  "2": 500,
  "3": 400,
  "4": 300,
  "5": 200,
  "6": 100,
};
Test run times such as these are seen, showing the concurrency limit is working --
[1]0--0.601
[2]0--0.502
[3]0.503--0.904
[4]0.608--0.908
[5]0.905--1.105
[6]0.905--1.005
edited Jan 17 at 20:47 · answered Jan 17 at 20:42 – Craig Hicks
Think about how forEach works. I can't find the source, but I presume it works
something like this:
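A minimal, non-spec-compliant model of forEach makes the point: it calls the callback and ignores the return value, so a returned promise is simply dropped.

```javascript
// Simplified model of Array.prototype.forEach: note the callback's return
// value (a promise, if the callback is async) is never used or awaited.
function forEachSketch(array, callback) {
  for (let i = 0; i < array.length; i++) {
    callback(array[i], i, array);
  }
}
```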
Now think about what happens when you do something like this:
No, it won't. Confusingly, that's not how await works. From the docs:
An await splits execution flow, allowing the caller of the async function to
resume execution. After the await defers the continuation of the async
function, execution of subsequent statements ensues. If this await is the last
expression executed by its function execution continues by returning to the
function's caller a pending Promise for completion of the await's function and
resuming execution of that caller.
So if you have the following, the numbers won't be logged before "b" :
main();
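The example is elided above; it presumably resembles this sketch, where 'b' is logged before any number because logNumbers is never awaited:

```javascript
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));
const output = [];
const log = x => { output.push(x); console.log(x); };

async function logNumbers() {
  for (const n of [1, 2, 3]) {
    await delay(10);
    log(n);
  }
}

function run() {
  log('a');
  logNumbers(); // not awaited: run() carries straight on
  log('b');     // logged before any number
}

run();
```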
Circling back to forEach , forEach is like main and logFile is like logNumbers . main
won't stop just because logNumbers does some await ing, and forEach won't stop just
because logFile does some await ing.
answered Dec 23, 2020 at 19:51 – Adam Zerner
Similar to Antonio Val's p-iteration , an alternative npm module is async-af :
function printFiles() {
  // since AsyncAF accepts promises or non-promises, there's no need to await here
  const files = getFilePaths();
  AsyncAF(files).forEach(async file => {
    console.log(await fs.readFile(file, 'utf8'));
  });
}

printFiles();
Alternatively, async-af has a static method (log/logAF) that logs the results of
promises:
function printFiles() {
const files = getFilePaths();
AsyncAF(files).forEach(file => {
AsyncAF.log(fs.readFile(file, 'utf8'));
});
}
printFiles();
However, the main advantage of the library is that you can chain asynchronous
methods to do something like:
printFiles();
async-af
answered Jun 21, 2018 at 16:55 – Scott Rudiger
If you'd like to iterate over all elements non-concurrently (e.g. when your mapping
function has side effects or running mapper over all array elements at once would be
too resource costly):
Option A: Promises
Option B: async/await
answered Nov 19, 2020 at 10:45 – Wojciech Maj
Your option a involves the Promise constructor antipattern.
– Bergi
Nov 19, 2020 at 15:02
0
If you can't use async/await (IE11, old packer, etc.) then you can try this recursive function. I used fetch as my asynchronous call, but you could use any function that returns a promise.
fetchOneAtATime(urlsToGet);
function fetchOneAtATime(urls) {
if (urls.length === 0) {
return;
}
fetch(urls[0]).finally(() => fetchOneAtATime(urls.slice(1)));
}
edited Oct 27, 2021 at 12:36 · answered Oct 27, 2021 at 8:45 – Matt Janssen
1 Better check urls.length before calling .shift() the first time, and better use urls[0]
and urls.slice(1) instead of emptying the array that is being passed to the function.
– Bergi
Oct 27, 2021 at 8:49
1 Why use finally instead of then ? This will ignore errors, unlike async / await
– Bergi
Oct 27, 2021 at 8:50
This would be if you want to do every fetch, regardless of the success of preceding calls. Good
idea on the empty check and not mutating the array! ✔
– Matt Janssen
Oct 27, 2021 at 12:37
0
This does not use async/await as the OP requested, and it only works in the back-end with NodeJS. Although it still may be helpful for some people, because the example given by the OP is to read file contents, and normally you do file reading in the backend.
const fs = require("fs")
const async = require("async")
const obj = {dev: "/dev.json", test: "/test.json", prod: "/prod.json"}
const configs = {}
edited Jan 10 at 11:18 · answered Jan 9 at 22:11 – João Pimentel Ferreira
OP never requested not to use async / await . They state "I'm trying to loop through an array
of files and await on the contents of each file."
– Bergi
Jan 10 at 0:12
@Bergi I explicitly said the OP didn't request exactly that and it just works with NodeJS.
Although it still may be helpful for some people, because the example given by OP is to read file
contents, and normally you do file reading in the backend.
– João Pimentel Ferreira
Jan 10 at 11:17
Oh, I misinterpreted that phrase as "does (not use async/await) as the OP requested" instead of
"does not (use async/await as the OP requested)"
– Bergi
Jan 10 at 11:23