At Acme Analytics, repeated lookups for the same records are slowing down application performance. Implement an LRU (Least Recently Used) cache to optimize repeated data retrieval by keeping only the most recently accessed items in memory.
Design a class LRUCache that supports these operations in O(1) average time:
- get(key) — return the value for key if it exists, otherwise return -1.
- put(key, value) — insert or update the key-value pair. If the cache exceeds its capacity, evict the least recently used key.

Input: capacity, followed by a sequence of operations get(key) and put(key, value). For each get(key), return the stored value or -1 if the key is not present.

Example 1
capacity = 2, operations = [put(1,1), put(2,2), get(1), put(3,3), get(2)]
Output: [1, -1]
Explanation: get(1) makes key 1 most recently used. Adding key 3 evicts key 2.

Example 2
capacity = 1, operations = [put(2,1), get(2), put(3,2), get(2), get(3)]
Output: [1, -1, 2]
Explanation: get(2) returns 1; inserting key 3 evicts key 2, so the later get(2) returns -1 and get(3) returns 2.

Constraints
- 1 <= capacity <= 10^5
- 0 <= key, value <= 10^9
- 2 * 10^5 total operations
- get and put should run in O(1) average time
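One way to meet the O(1) average-time requirement is to pair a hash map with a structure that tracks access order. A minimal sketch (not the only valid approach) using Python's collections.OrderedDict, which provides O(1) move_to_end and popitem:

```python
from collections import OrderedDict

class LRUCache:
    """LRU cache sketch: OrderedDict keeps keys in least-to-most
    recently used order, giving O(1) average get and put."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = OrderedDict()  # front = least recently used

    def get(self, key: int) -> int:
        if key not in self.cache:
            return -1
        self.cache.move_to_end(key)  # mark as most recently used
        return self.cache[key]

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            self.cache.move_to_end(key)  # refresh recency on update
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict least recently used
```

Replaying Example 1 (capacity = 2): after put(1,1) and put(2,2), get(1) returns 1 and refreshes key 1, so put(3,3) evicts key 2 and get(2) returns -1. An equivalent interview-style variant replaces OrderedDict with a hand-rolled doubly linked list plus dict, which is what OrderedDict does internally.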