An O(1) LRU cache. Short, sweet, and fast.
class LRU_Cache:

    def __init__(self, user_function, maxsize=1024):
        # Link layout: [PREV, NEXT, KEY, RESULT]
        self.root = root = [None, None, None, None]
        self.user_function = user_function
        self.cache = cache = {}

        # Pre-build a circular doubly-linked list of maxsize empty links,
        # each keyed in the dict by a unique placeholder object.
        last = root
        for i in range(maxsize):
            key = object()
            cache[key] = last[1] = last = [last, root, key, None]
        root[0] = last

    def __call__(self, *key):
        cache = self.cache
        root = self.root
        link = cache.get(key)
        if link is not None:
            # Cache hit: unlink the entry and relink it just before
            # root (the most-recently-used position).
            link_prev, link_next, _, result = link
            link_prev[1] = link_next
            link_next[0] = link_prev
            last = root[0]
            last[1] = root[0] = link
            link[0] = last
            link[1] = root
            return result
        # Cache miss: store the new result in the current root, then
        # rotate root forward to the oldest link, evicting its contents.
        result = self.user_function(*key)
        root[2] = key
        root[3] = result
        oldroot = root
        root = self.root = root[1]
        root[2], oldkey = None, root[2]
        root[3], oldvalue = None, root[3]
        del cache[oldkey]
        cache[key] = oldroot
        return result

if __name__ == '__main__':
    p = LRU_Cache(ord, maxsize=3)
    for c in 'abcdecaeaa':
        print(c, p(c))
Simplified and optimized from the version that was added to the standard library in Python 3.2.
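For comparison, the standard-library version lives in functools as lru_cache. A short sketch of the same workload run through it, showing the features this recipe deliberately drops (hit/miss statistics and the rest):

```python
from functools import lru_cache

# Standard-library counterpart of the recipe above: same LRU semantics,
# plus the bookkeeping this recipe omits (statistics, cache_clear()).
@lru_cache(maxsize=3)
def cached_ord(c):
    return ord(c)

for c in 'abcdecaeaa':
    cached_ord(c)

info = cached_ord.cache_info()
print(info)   # 4 hits, 6 misses, 3 entries currently cached
```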
The main optimization is to simplify the functionality (no keyword arguments, no tracking of the hit/miss ratio, and no clear() method). The next major optimization was to inline the relevant code from Python's OrderedDict implementation. Another optimization is to use only one dictionary (the OrderedDict version uses two). That dict points at links stored in a list organized as [PREV, NEXT, KEY, VALUE].
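The link trick can be shown on its own. Here is a minimal sketch (names like insert_mru and move_to_mru are illustrative, not from the recipe) of a circular ring of plain [PREV, NEXT, KEY, VALUE] lists, where promoting an entry to most-recently-used is two constant-time splices:

```python
# Minimal sketch of the [PREV, NEXT, KEY, VALUE] link technique:
# a circular doubly-linked list built from plain Python lists.
PREV, NEXT, KEY, VALUE = 0, 1, 2, 3

root = [None, None, None, None]   # sentinel; root[PREV] is the MRU slot
root[PREV] = root[NEXT] = root

def insert_mru(key, value):
    """Link a new node just before root (most-recently-used): O(1)."""
    last = root[PREV]
    node = [last, root, key, value]
    last[NEXT] = root[PREV] = node
    return node

def move_to_mru(node):
    """Unlink node, then relink it just before root: O(1)."""
    node[PREV][NEXT] = node[NEXT]
    node[NEXT][PREV] = node[PREV]
    last = root[PREV]
    node[PREV], node[NEXT] = last, root
    last[NEXT] = root[PREV] = node

a = insert_mru('a', 1)
b = insert_mru('b', 2)
c = insert_mru('c', 3)
move_to_mru(a)          # 'a' becomes most recent

# Walk the ring from oldest to newest.
order = []
node = root[NEXT]
while node is not root:
    order.append(node[KEY])
    node = node[NEXT]
print(order)            # ['b', 'c', 'a']
```

Because every splice touches only a fixed number of list slots, hits and evictions in the cache cost O(1) regardless of maxsize.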
Using only basic list and dict operations, this dirt-simple Python code runs well on PyPy, Jython, and any version of CPython from 2.2 onward (the sentinel keys rely on the object() builtin, which first appeared in 2.2). It also works great with Psyco.