Strategy | Description | Example Use Case |
---|---|---|
Cache-aside (Lazy Loading) | App checks cache first. If a miss, it loads from DB and inserts into cache. | Read-heavy systems like user profile lookups |
Write-through | Writes go to cache and DB at the same time. | Product catalog where consistency is important |
Write-behind (Write-back) | Writes go to cache, and are written to DB asynchronously. | Metrics/log data ingestion |
Read-through | Cache itself knows how to fetch from DB on a miss. | Abstracted cache layers (e.g., using a cache proxy) |
Refresh-ahead | Cache preemptively refreshes soon-to-expire keys. | Time-sensitive data like exchange rates |
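
The write-oriented strategies in the table differ mainly in when the database sees the write. Below is a minimal sketch of that difference, using in-memory dicts as stand-ins for a real cache (e.g. Memcached) and a database, and a hypothetical background worker for the write-behind flush; the names are illustrative, not from a specific library.

```python
import queue
import threading

cache = {}                   # stand-in for a cache client such as Memcached
database = {}                # stand-in for the system of record
write_queue = queue.Queue()  # pending database writes for the write-behind path

def save_write_through(key, value):
    """Write-through: the cache and the database are updated synchronously."""
    cache[key] = value
    database[key] = value    # the caller waits for both writes

def save_write_behind(key, value):
    """Write-behind: only the cache is updated synchronously; the DB write is queued."""
    cache[key] = value
    write_queue.put((key, value))  # flushed asynchronously by the worker below

def flush_worker():
    """Background worker that drains queued writes to the database."""
    while True:
        key, value = write_queue.get()
        database[key] = value
        write_queue.task_done()

threading.Thread(target=flush_worker, daemon=True).start()
```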
Memcached is generally used in this cache-aside manner. Subsequent reads of data added to the cache are fast. Cache-aside is also referred to as lazy loading: only requested data is cached, which avoids filling the cache with data that isn't requested.
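
A minimal cache-aside read path in Python, assuming a dict as a stand-in for the cache and a hypothetical `query_user_from_db` helper in place of a real database query:

```python
import json

cache = {}  # stand-in for a cache client such as Memcached

def query_user_from_db(user_id):
    """Hypothetical database lookup standing in for a real SQL query."""
    return {"id": user_id, "name": "example"}

def get_user(user_id):
    """Cache-aside: check the cache first; on a miss, load from the DB and populate the cache."""
    key = f"user.{user_id}"
    cached = cache.get(key)             # trip 1: cache lookup
    if cached is not None:
        return json.loads(cached)       # cache hit: fast path
    user = query_user_from_db(user_id)  # trip 2: read from the database
    if user is not None:
        cache[key] = json.dumps(user)   # trip 3: populate the cache
    return user
```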
Disadvantage(s): cache-aside

* Each cache miss results in three trips (cache lookup, database read, cache write), which can cause a noticeable delay.
* Data can become stale if it is updated in the database. This issue is mitigated by setting a time-to-live (TTL), which forces the cache entry to expire so it is reloaded on the next read, or by using write-through (a TTL example is sketched after this list).
* When a node fails, it is replaced by a new, empty node, increasing latency.
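
As a rough sketch of the TTL mitigation mentioned above, using the pymemcache client; the host, port, key, and 300-second TTL are illustrative assumptions:

```python
from pymemcache.client.base import Client

client = Client(("localhost", 11211))  # assumes a local Memcached instance

# Store the entry with a TTL so a stale value expires on its own and is
# reloaded from the database on the next cache miss.
client.set("user.42", '{"id": 42, "name": "example"}', expire=300)
```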