I want to activate or deactivate a "cache" inside some class methods during execution.
I found a way to activate it with something like this:

```python
(...)
setattr(self, "_greedy_function", my_cache_decorator(self._cache)(getattr(self, "_greedy_function")))
(...)
```

where `self._cache` is my own cache object that stores the results of `self._greedy_function`.

It works fine, but what if I now want to deactivate the cache and "un-decorate" `_greedy_function`?

I see a possible solution, storing a reference to `_greedy_function` before decorating it, but maybe there is a way to retrieve it from the decorated function, which would be nicer.

As requested, here are the decorator and the cache object I use to cache the results of my class functions:
```python
import logging
from collections import OrderedDict, namedtuple
from functools import wraps

logging.basicConfig(
    level=logging.WARNING,
    format='%(asctime)s %(name)s %(levelname)s %(message)s'
)
logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)

CacheInfo = namedtuple("CacheInfo", "hits misses maxsize currsize")

def lru_cache(cache):
    """
    A replacement for functools.lru_cache() built on a custom LRU class.
    It can cache class methods.
    """
    def decorator(func):
        logger.debug("assigning cache %r to function %s" % (cache, func.__name__))
        @wraps(func)
        def wrapped_func(*args, **kwargs):
            try:
                ret = cache[args]
                logger.debug("cached value returned for function %s" % func.__name__)
                return ret
            except KeyError:
                ret = func(*args, **kwargs)
                logger.debug("cache updated for function %s" % func.__name__)
                cache[args] = ret
                return ret
        return wrapped_func
    return decorator
```
```python
class LRU(OrderedDict):
    """
    Custom implementation of an LRU cache, built on top of an OrderedDict.
    """
    __slots__ = "_hits", "_misses", "_maxsize"

    def __new__(cls, maxsize=128):
        # maxsize=None disables caching entirely by producing no cache object
        if maxsize is None:
            return None
        return super().__new__(cls, maxsize=maxsize)

    def __init__(self, maxsize=128, *args, **kwargs):
        self.maxsize = maxsize
        self._hits = 0
        self._misses = 0
        super().__init__(*args, **kwargs)

    def __getitem__(self, key):
        try:
            value = super().__getitem__(key)
        except KeyError:
            self._misses += 1
            raise
        else:
            self.move_to_end(key)
            self._hits += 1
            return value

    def __setitem__(self, key, value):
        super().__setitem__(key, value)
        if len(self) > self._maxsize:
            # Evict the least recently used entry. Keys are the args tuples
            # built by the decorator, so this unpacking assumes single-argument calls.
            oldest, = next(iter(self))
            del self[oldest]

    def __delitem__(self, key):
        try:
            super().__delitem__((key,))
        except KeyError:
            pass

    def __repr__(self):
        return "<%s object at %s: %s>" % (self.__class__.__name__, hex(id(self)), self.cache_info())

    def cache_info(self):
        return CacheInfo(self._hits, self._misses, self._maxsize, len(self))

    def clear(self):
        super().clear()
        self._hits, self._misses = 0, 0

    @property
    def maxsize(self):
        return self._maxsize

    @maxsize.setter
    def maxsize(self, maxsize):
        if not isinstance(maxsize, int):
            raise TypeError
        elif maxsize < 2:
            raise ValueError
        elif maxsize & (maxsize - 1) != 0:
            logger.warning("LRU feature performs best when maxsize is a power-of-two, maybe.")
        while maxsize < len(self):
            oldest, = next(iter(self))
            del self[oldest]
        self._maxsize = maxsize
```
EDIT: I have updated the code with the `__wrapped__` attribute suggested in the comments, and it works great! The whole thing is here: https://gist.github.com/fbparis/b3ddd5673b603b42c880974b23db7cda (see the kik.set_cache() method...)
Answer 0 (score: 4)
You are making things too complicated. The decorator can simply be removed with `del self._greedy_function`. There is no need for the `__wrapped__` attribute.

Here is a minimal implementation of the `set_cache` and `unset_cache` methods:
```python
import time

class LRU(OrderedDict):
    def __init__(self, maxsize=128, *args, **kwargs):
        # ...
        self._cache = dict()
        super().__init__(*args, **kwargs)

    def _greedy_function(self):
        time.sleep(1)
        return time.time()

    def set_cache(self):
        self._greedy_function = lru_cache(self._cache)(getattr(self, "_greedy_function"))

    def unset_cache(self):
        del self._greedy_function
```
Using your decorator `lru_cache`, here are the results:

```python
o = LRU()
o.set_cache()
print('First call', o._greedy_function())
print('Second call', o._greedy_function())  # Here it prints out the cached value
o.unset_cache()
print('Third call', o._greedy_function())   # The cache is not used
```

Output:

```
First call 1552966668.735025
Second call 1552966668.735025
Third call 1552966669.7354007
```
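The reason `del self._greedy_function` works is worth spelling out: `set_cache` binds the decorated function as an *instance* attribute, which shadows the method of the same name defined on the class; deleting the instance attribute simply uncovers the class method again. A minimal illustration of that attribute-lookup behavior (the `Demo` class here is hypothetical, not from the answer):

```python
class Demo:
    def greet(self):
        return "original"

d = Demo()

# Shadow the class method with an instance attribute, as set_cache does:
d.greet = lambda: "wrapped"
assert d.greet() == "wrapped"

# Deleting the instance attribute uncovers the class method again:
del d.greet
assert d.greet() == "original"
```

This is also why `unset_cache` raises `AttributeError` if called before `set_cache`: there is no instance attribute to delete yet.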
Answer 1 (score: 2)
Modern versions of `functools.wraps` install the original function on the wrapper they create as the attribute `__wrapped__`. (You could instead search the wrapper's `__closure__` for the nested function typically used for this purpose, but other types can be used as well.) Any well-behaved wrapper can be expected to follow this convention.
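To illustrate the convention, here is a sketch with a hypothetical `my_cache_decorator` (in the spirit of the question's code, not its exact decorator). Since Python 3.2, `functools.wraps` sets `wrapper.__wrapped__` to the original callable, so it can be recovered without keeping a separate reference:

```python
import functools

def my_cache_decorator(cache):
    """Hypothetical caching decorator, for illustration only."""
    def decorator(func):
        @functools.wraps(func)  # also sets wrapper.__wrapped__ = func
        def wrapper(*args):
            if args not in cache:
                cache[args] = func(*args)
            return cache[args]
        return wrapper
    return decorator

def greedy_function(x):
    return x * x

decorated = my_cache_decorator({})(greedy_function)
assert decorated(3) == 9
# The original, "un-decorated" function is available on the wrapper:
assert decorated.__wrapped__ is greedy_function
```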
An alternative is to have a permanent wrapper that is controlled by a flag, so that it can be enabled and disabled without removing and restoring it. This has the advantage that the wrapper can keep its state (here, the cached values). The flag can be a separate variable (e.g., another attribute on the object bearing the wrapped function, if any), or it can be an attribute on the wrapper itself.
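A sketch of that flag-based approach, with the flag stored as an attribute on the wrapper itself (all names here are illustrative, not from the answer):

```python
import functools

def toggleable_cache(func):
    """Wrap func once; caching is switched on and off via a flag on the wrapper."""
    cache = {}

    @functools.wraps(func)
    def wrapper(*args):
        if not wrapper.cache_enabled:
            return func(*args)          # flag off: always call through
        if args not in cache:
            cache[args] = func(*args)   # flag on: fill the cache on a miss
        return cache[args]

    wrapper.cache_enabled = False       # start with caching disabled
    return wrapper

# A probe that counts how many times it is actually executed:
calls = {"n": 0}

@toggleable_cache
def probe(x):
    calls["n"] += 1
    return calls["n"]

probe.cache_enabled = True
a = probe(1)                 # computed and cached
b = probe(1)                 # served from the cache, probe does not run again
assert a == b

probe.cache_enabled = False
c = probe(1)                 # cache bypassed, probe runs again
assert c == a + 1
```

Note that the cache dictionary survives while the flag is off, so re-enabling the flag serves previously cached entries again.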