javascript - Promise.all with too many fetches at a time freezes server
I bundle 115 fetch requests into a Promise.all in order to load mandatory resources. The problem is that they all fire at once and - depending on the server I test it on - either freeze the script entirely or return a 500 status code.
I implemented a time delay between these requests so that they don't all fire at once, the smallest workable delay being 50 ms - which adds up to 5.75 seconds of loading time.
- should I create a new API endpoint which bundles these requests? (I'd rather not, to be honest; caching each request separately after it has loaded is a huge bonus)
- is there a way to use one connection for all these requests so that it doesn't look like many individual requests to the server?
- is there a way to make the server wait instead of handling all these requests at once?
I am also curious to know how the browser handles this problem, since a website can easily want to load more than 100 different resources at a time. I'd love for the browser to handle my many requests in a 'waterfall' manner similar to what's shown in Chrome's developer tools.
Note that I do NOT want to wait for each fetch request to completely finish before starting another; I just want the requests not to be sent all at the same time.
I must mention that I am using a PHP API on an Apache server with a MySQL database.
const offset = 100
let delay = 0
const requests = ['apiRequest0','apiRequest1','apiRequest2','apiRequest3'...]
const data = {}

await (async () => {
    const promises = []
    requests.forEach(function (item) {
        promises.push(
            (async () => {
                // Stagger this request by the delay accumulated so far.
                await new Promise(function (resolve) { setTimeout(resolve, delay) })
                data[item] = await fetch(`/api/${item}`).then(response => response.json())
            })()
        )
        delay += offset
    })
    return Promise.all(promises)
})()
- I tried setting the keepalive option on the fetch requests - no change
- I tried raising PHP's execution time and memory limits - no change
- I tried adding a delay, which works, but I'd love a faster and more reliable solution instead of guessing what the server can cope with
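A common alternative to fixed delays is a small concurrency pool: keep at most N requests in flight and start the next one the moment any finishes, so slow responses never stall unrelated requests. This is a minimal sketch, not the asker's code - `runWithConcurrency` and the limit of 6 (roughly a browser's per-host HTTP/1.1 connection cap) are names and numbers I chose for illustration.

```javascript
// Run task functions with at most `limit` in flight at once.
// Each task is a zero-argument function returning a promise.
async function runWithConcurrency(tasks, limit) {
  const results = new Array(tasks.length);
  let next = 0;

  // Each worker repeatedly pulls the next unstarted task until none remain.
  async function worker() {
    while (next < tasks.length) {
      const i = next++; // safe: JS is single-threaded between awaits
      results[i] = await tasks[i]();
    }
  }

  // Spawn at most `limit` workers and wait for all of them to drain the queue.
  const workers = Array.from({ length: Math.min(limit, tasks.length) }, worker);
  await Promise.all(workers);
  return results;
}

// Hypothetical usage with the question's endpoints:
// const tasks = requests.map(item => () => fetch(`/api/${item}`).then(r => r.json()));
// const results = await runWithConcurrency(tasks, 6);
```

Unlike a fixed stagger, this adapts automatically: a fast server drains the queue quickly, a slow one never sees more than `limit` simultaneous requests.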
Answer
Solution:
Following the consensus, I merged my 115 requests down to 5 new API endpoints, which solved my problem.
Thanks a lot!