Project: Throttled Fetch Service
Background
I created a Redis cache helper with an optional refetching feature. Inspired by Next.js's Incremental Static Regeneration, it serves cached data immediately, then refetches from the source in the background.
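The refetch pattern can be sketched as follows. This is a minimal illustration, not the helper's actual API: the function and option names are hypothetical, and a plain Map stands in for Redis.

```javascript
// Illustrative sketch of serve-cached-then-refetch. A Map stands in for Redis.
const cache = new Map();

async function cachedFetch(key, fetcher) {
  const hit = cache.get(key);
  if (hit !== undefined) {
    // Serve the cached value immediately, refresh in the background.
    fetcher()
      .then((fresh) => cache.set(key, fresh))
      .catch(() => {}); // background failures keep the stale value
    return hit;
  }
  // Cache miss: fetch synchronously and populate the cache.
  const fresh = await fetcher();
  cache.set(key, fresh);
  return fresh;
}
```

The second request for a key returns instantly from cache while the refetch runs out of band, which is what makes the redundant-fetch problem below visible at fleet scale.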
While this keeps data up-to-date, it also means that across a fleet of app instances, the same data is fetched on every request. An orchestration layer is needed to reduce redundant requests across instances.
Solution
The throttled fetch service is a NodeJS service that exposes a simple API to fetch data from a source, with a configurable cache duration. It is deployed as a single instance, and all app instances are configured to use it as a proxy for fetching data. Using the same cache helper mentioned above, clients can leverage the Throttled Fetch service with only a few lines of configuration. Once fetching is complete, the service persists the data to Redis directly, so clients can hand off the request and move on immediately.
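Client-side setup might look something like the fragment below. Every name here is hypothetical (the source doesn't specify the helper's configuration surface); it only illustrates the "few lines of configuration" idea of pointing background refetches at the service instead of the origin.

```javascript
// Hypothetical configuration -- option and function names are illustrative.
const cache = createCacheHelper({
  redis: { url: process.env.REDIS_URL },
  refetch: {
    // Route background refetches through the throttled fetch service
    // rather than hitting the data source from every app instance.
    proxy: 'http://throttled-fetch.internal/fetch',
    cacheDuration: 60, // seconds
  },
});
```

Because the service writes results to Redis itself, the client never needs to await or handle the refetch response.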
Under the hood, the service maintains a map of functions wrapped with lodash.throttle. When a request comes into the API endpoint, the service computes a hash of the function name and serialized arguments and uses it as the map key. This way, calls to the same function with the same arguments are throttled together, while calls with different arguments are throttled separately.