LRU Cache (Least Recently Used Cache)

Category

Uncategorized / Data Structures

Description

An LRU (Least Recently Used) Cache is a data structure that stores a limited number of items and automatically evicts the least recently used item when the cache reaches its capacity. It's one of the most commonly used cache replacement policies in computer systems.

How It Works

The LRU Cache maintains items in order of their usage:

  1. Most Recently Used (MRU): Items at the front of the cache
  2. Least Recently Used (LRU): Items at the back of the cache

Operations

GET Operation

  • Retrieves the value associated with a key
  • If the key exists, it's moved to the front (marked as most recently used)
  • Returns -1 if the key doesn't exist
  • Time Complexity: O(1)

PUT Operation

  • Inserts a new key-value pair or updates an existing one
  • The item is placed at the front (most recently used position)
  • If the cache is at capacity, the least recently used item is evicted
  • Time Complexity: O(1)

Implementation Details

This implementation uses two data structures:

  1. Circular Doubly-Linked List: Maintains the order of items based on usage

    • Head points to the most recently used item
    • Tail (head.prev) points to the least recently used item
    • Allows O(1) insertion and deletion
  2. Hash Map: Provides O(1) lookup of nodes by key

    • Maps keys to their corresponding nodes in the linked list
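The two structures above can be combined as sketched below. This is an illustrative Python sketch (not the repository's actual code): a sentinel-based circular doubly-linked list tracks usage order, and a plain dict plays the hash-map role. It assumes capacity is at least 1.

```python
class Node:
    """A node in the circular doubly-linked list."""
    __slots__ = ("key", "value", "prev", "next")

    def __init__(self, key=None, value=None):
        self.key = key
        self.value = value
        self.prev = self
        self.next = self


class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity   # assumed >= 1
        self.map = {}              # key -> Node, gives O(1) lookup
        self.head = Node()         # sentinel: head.next is MRU, head.prev is LRU

    def _unlink(self, node):
        """Remove a node from the list in O(1)."""
        node.prev.next = node.next
        node.next.prev = node.prev

    def _link_front(self, node):
        """Insert a node right after the sentinel (MRU position) in O(1)."""
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        if key not in self.map:
            return -1
        node = self.map[key]
        self._unlink(node)         # move to front: now most recently used
        self._link_front(node)
        return node.value

    def put(self, key, value):
        if key in self.map:        # update existing entry and refresh recency
            node = self.map[key]
            node.value = value
            self._unlink(node)
            self._link_front(node)
            return
        if len(self.map) == self.capacity:
            lru = self.head.prev   # tail of the circular list = LRU item
            self._unlink(lru)
            del self.map[lru.key]
        node = Node(key, value)
        self.map[key] = node
        self._link_front(node)
```

Because every list operation touches only a constant number of pointers and every lookup goes through the dict, both `get` and `put` run in O(1).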

Algorithm Complexity

Operation | Time Complexity | Space Complexity
----------|-----------------|------------------
GET       | O(1)            | O(capacity)
PUT       | O(1)            | O(capacity)

Real-World Applications

LRU Cache is widely used in:

  • Operating Systems: Page replacement in virtual memory
  • Web Browsers: Browser cache management
  • Databases: Query result caching
  • CDNs: Content delivery optimization
  • Web Servers: Page caching
  • APIs: Response caching to reduce load

Example Walkthrough

Cache Capacity: 3 (caches are shown front → back, i.e., most → least recently used)

1. PUT(1, 100)  →  [1:100]
2. PUT(2, 200)  →  [2:200, 1:100]
3. PUT(3, 300)  →  [3:300, 2:200, 1:100]
4. GET(1)       →  [1:100, 3:300, 2:200]  (1 moved to front)
5. PUT(4, 400)  →  [4:400, 1:100, 3:300]  (2 evicted - LRU)
6. GET(2)       →  -1 (not found - was evicted)
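The steps above can be checked with a compact sketch built on Python's collections.OrderedDict (an assumption for brevity; at the interface level it behaves the same as the linked-list design described earlier):

```python
from collections import OrderedDict


class LRU:
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()   # rightmost entry = most recently used

    def get(self, key):
        if key not in self.data:
            return -1
        self.data.move_to_end(key)  # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        elif len(self.data) == self.capacity:
            self.data.popitem(last=False)  # evict the LRU entry
        self.data[key] = value


cache = LRU(3)
cache.put(1, 100)
cache.put(2, 200)
cache.put(3, 300)
cache.get(1)        # step 4: key 1 becomes most recently used
cache.put(4, 400)   # step 5: key 2 (the LRU item) is evicted
print(cache.get(2)) # step 6: prints -1
```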

Key Observations

  1. Recency Matters: Both GET and PUT operations update an item's recency
  2. Automatic Eviction: When capacity is reached, the LRU item is automatically removed
  3. Efficient: All operations run in constant time O(1)
  4. Space-Time Tradeoff: Uses extra space (HashMap) to achieve O(1) time complexity

Advantages

  • Fast Operations: O(1) time for both get and put
  • Automatic Management: No manual cleanup needed
  • Predictable Behavior: Clear eviction policy
  • Memory Efficient: Fixed memory footprint

Disadvantages

  • Fixed Capacity: Cannot grow beyond initial capacity
  • No Expiration: Items don't expire based on time (only on usage)
  • Memory Overhead: Requires additional space for HashMap and pointers

Variations

  • LFU (Least Frequently Used): Evicts items used least often
  • FIFO (First In First Out): Evicts oldest items regardless of usage
  • LRU-K: Considers K most recent accesses instead of just the last one
  • 2Q: Uses two queues to better handle different access patterns
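To make the FIFO/LRU distinction concrete, here is a hedged FIFO sketch (again using OrderedDict, an assumption): under FIFO a GET does not refresh an entry, so in the walkthrough above step 5 would evict key 1 (the oldest insertion) rather than key 2.

```python
from collections import OrderedDict


class FIFOCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()   # tracks insertion order only

    def get(self, key):
        return self.data.get(key, -1)   # no reordering on access

    def put(self, key, value):
        if key not in self.data and len(self.data) == self.capacity:
            self.data.popitem(last=False)  # evict the oldest insertion
        self.data[key] = value


f = FIFOCache(3)
f.put(1, 100)
f.put(2, 200)
f.put(3, 300)
f.get(1)          # does not protect key 1 under FIFO
f.put(4, 400)     # evicts key 1, the oldest insertion
print(f.get(1))   # prints -1; an LRU cache would still hold key 1
```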

Interview Tips

LRU Cache is a popular interview question because it tests:

  • Understanding of data structures (HashMap, Linked List)
  • Ability to combine multiple data structures
  • Time/space complexity analysis
  • Real-world system design knowledge

Author

Akshay Juluri