Description
Let's say you have a request handler that goes off and grabs data from somewhere like a DB or upstream API.
At the moment, if a single client hits that handler, the cache middleware waits for the response to be built, then caches it and returns it.
If two clients request the same resource at once, you end up running two upstream requests and waiting for both to respond, because they both call next(). This essentially doubles the upstream requests until the cache is filled by the last request to complete, after which future requests are served from the cache.
It would be great to add a request queuing system using an event emitter. The first request checks for a cached response and finds none, so before calling next() it adds an entry to the cache with a status of "pending", which it then updates to "ready" when the request is fulfilled and the cached data can be stored. (Detect non-2xx/3xx response codes and delete the pending entry if the request fails.)
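A minimal sketch of what that first-request path might look like, assuming an Express-style middleware, an in-memory Map as the cache, and a shared EventEmitter (cacheKeyFor, cache, emitter and handleCacheMiss are illustrative names, not existing APIs in this project):

```typescript
import { EventEmitter } from "events";
import type { Request, Response, NextFunction } from "express";

type CacheEntry =
  | { status: "pending" }
  | { status: "ready"; statusCode: number; body: unknown };

const cache = new Map<string, CacheEntry>();
const emitter = new EventEmitter();
emitter.setMaxListeners(0); // many requests may end up waiting on one key

// Hypothetical key derivation; the real middleware presumably has its own.
const cacheKeyFor = (req: Request) => `${req.method}:${req.originalUrl}`;

// Runs when the cache lookup found nothing for this request.
function handleCacheMiss(req: Request, res: Response, next: NextFunction) {
  const cacheKey = cacheKeyFor(req);

  // Claim the key so concurrent requests see a "pending" entry.
  cache.set(cacheKey, { status: "pending" });

  // Wrap res.json so we can capture whatever the downstream handler sends.
  const originalJson = res.json.bind(res);
  res.json = (body: unknown) => {
    if (res.statusCode >= 200 && res.statusCode < 400) {
      // 2xx/3xx: store the response and mark the entry ready.
      cache.set(cacheKey, { status: "ready", statusCode: res.statusCode, body });
    } else {
      // Anything else: drop the pending entry so a failure is never cached.
      cache.delete(cacheKey);
    }
    // Wake every request waiting on this cacheKey.
    emitter.emit(cacheKey);
    return originalJson(body);
  };

  next();
}
```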
In the meantime, any further requests to the endpoint also check for a cached response and find one in a "pending" state. Instead of calling next() to hand the request to the next handler, they hook into an event emitter and listen for an update, using the cacheKey as the event name.
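Correspondingly, a request that finds a pending entry could park itself on the emitter instead of calling next(). This is only a sketch under the same assumptions as above (the CacheEntry shape and names are illustrative); events.once resolves once the cacheKey event is emitted:

```typescript
import { once, EventEmitter } from "events";
import type { Response, NextFunction } from "express";

// Same illustrative entry shape as in the previous sketch.
type CacheEntry =
  | { status: "pending" }
  | { status: "ready"; statusCode: number; body: unknown };

// Runs when the cache lookup found an entry with status "pending".
async function waitForCachedResponse(
  cacheKey: string,
  cache: Map<string, CacheEntry>,
  emitter: EventEmitter,
  res: Response,
  next: NextFunction
): Promise<void> {
  // Don't call next(); wait for the original request to announce this key.
  await once(emitter, cacheKey);

  const entry = cache.get(cacheKey);
  if (entry && entry.status === "ready") {
    // The original request succeeded; serve its cached response.
    res.status(entry.statusCode).json(entry.body);
  } else {
    // The original request failed and removed the entry; handle it ourselves.
    next();
  }
}
```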
The original request that called next() eventually completes and fires an event signaling that the cache for that cacheKey is now ready; all the waiting requests then read the cached data and return it.
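If the pending entry is deleted on failure, the waiting requests would presumably still be woken by the event, find no "ready" entry, and fall back to calling next() themselves (as in the sketch above), so a failed upstream call is neither cached nor leaves queued requests hanging. One practical detail: since many requests can queue on the same cacheKey, the default EventEmitter listener limit may need to be raised via setMaxListeners.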