No More LRU: Simple Scalable Caching with Only FIFO Queues
- Caching is used in almost every component of today's storage systems to speed up data access and reduce data movement. The most important component of a cache is the eviction algorithm that decides which objects to keep in the very limited cache space. While Least-Recently-Used (LRU) is the most common eviction algorithm, it suffers from many problems.
- First, LRU is not scalable. LRU maintains objects in last-access order, which is typically implemented with a doubly-linked list: every cache hit moves the requested object to the head of the list, and these list updates must be serialized (usually under a lock), so throughput stops scaling with the number of cores. A minimal sketch of this structure follows below.
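- To make the scalability problem concrete, here is a minimal LRU sketch (not code from the paper, just the textbook hash-map-plus-doubly-linked-list layout) showing that `get()`, the hot path of a cache hit, must take a lock and splice the list, so concurrent readers serialize on that single mutation:

```python
import threading


class _Node:
    """Doubly-linked-list node holding one cached object."""
    __slots__ = ("key", "value", "prev", "next")

    def __init__(self, key, value):
        self.key, self.value = key, value
        self.prev = self.next = None


class LRUCache:
    """Hash map + doubly-linked list; head = most recently used."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.table = {}                  # key -> node
        self.head = self.tail = None
        # Every hit mutates the list, so even reads must be serialized.
        self.lock = threading.Lock()

    def _unlink(self, node):
        if node.prev: node.prev.next = node.next
        else:         self.head = node.next
        if node.next: node.next.prev = node.prev
        else:         self.tail = node.prev
        node.prev = node.next = None

    def _push_front(self, node):
        node.next = self.head
        if self.head: self.head.prev = node
        self.head = node
        if self.tail is None: self.tail = node

    def get(self, key):
        with self.lock:                  # a cache hit still needs the lock
            node = self.table.get(key)
            if node is None:
                return None              # miss
            self._unlink(node)           # record the access by moving the
            self._push_front(node)       # object to the head of the list
            return node.value

    def put(self, key, value):
        with self.lock:
            node = self.table.get(key)
            if node:                     # update existing entry
                node.value = value
                self._unlink(node)
            else:
                if len(self.table) >= self.capacity:
                    victim = self.tail   # least recently used sits at the tail
                    self._unlink(victim)
                    del self.table[victim.key]
                node = _Node(key, value)
                self.table[key] = node
            self._push_front(node)
```

- The point of the sketch is the `with self.lock` in `get()`: unlike a FIFO queue, where a hit can be recorded without touching the queue, every LRU hit rewrites shared pointers, which is exactly what makes it hard to scale.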