tapis_cli.hashcache package¶
Extends Python's hashing support to enable memoization and serialization of functions with complex parameters and/or return values.
tapis_cli.hashcache.lru_cache(maxsize=128, typed=False)¶
Least-recently-used cache decorator.
If maxsize is set to None, the LRU features are disabled and the cache can grow without bound.
If typed is True, arguments of different types will be cached separately. For example, f(3.0) and f(3) will be treated as distinct calls with distinct results.
Arguments to the cached function must be hashable.
View the cache statistics named tuple (hits, misses, maxsize, currsize) with f.cache_info(). Clear the cache and statistics with f.cache_clear(). Access the underlying function with f.__wrapped__.
See: http://en.wikipedia.org/wiki/Cache_replacement_policies#Least_recently_used_(LRU)
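The decorator described above follows the same interface as the standard library's functools.lru_cache, so the behavior can be sketched with it (the fib function here is just an illustration, not part of tapis_cli):

```python
from functools import lru_cache  # same interface as tapis_cli.hashcache.lru_cache

@lru_cache(maxsize=128)
def fib(n):
    """Naive Fibonacci; memoization makes the recursion linear-time."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(30))           # 832040
print(fib.cache_info())  # CacheInfo(hits=..., misses=..., maxsize=128, currsize=...)
fib.cache_clear()        # reset cache and statistics
```

After the call, fib.cache_info() reports one cache entry per distinct argument, and cache_clear() resets both the stored values and the hit/miss counters.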
Submodules¶
tapis_cli.hashcache.jsoncache module¶
A memoizing cache built on the Python json module.
tapis_cli.hashcache.jsoncache.mcache(cache)¶
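The documentation gives only the signature mcache(cache). A plausible reading is a decorator factory that memoizes into the supplied cache object, keying entries on the JSON serialization of the arguments; the dict-backed store and key scheme below are assumptions for illustration, not the actual tapis_cli implementation:

```python
import functools
import json

def mcache(cache):
    """Hypothetical sketch of a JSON-keyed memoizer.

    Memoizes the wrapped function in the given mapping, keying on the
    JSON serialization of its arguments. This is an assumed design, not
    the verified tapis_cli.hashcache.jsoncache behavior.
    """
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # JSON keys allow unhashable arguments (e.g. dicts), which
            # a plain functools.lru_cache cannot accept
            key = json.dumps([args, kwargs], sort_keys=True, default=str)
            if key not in cache:
                cache[key] = func(*args, **kwargs)
            return cache[key]
        return wrapper
    return decorator

store = {}

@mcache(store)
def total(params):
    # stands in for an expensive call taking a complex parameter
    return sum(params.values())

print(total({"a": 1, "b": 2}))  # 3
print(len(store))               # 1 cached entry
```

JSON keying is what lets this style of cache handle the "complex parameters" mentioned in the package description, at the cost of requiring arguments to be JSON-serializable.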
tapis_cli.hashcache.lru_py27 module¶
Backport of Python 3.3 lru_cache to Python 2.7
tapis_cli.hashcache.lru_py27.lru_cache(maxsize=100, typed=False)¶
Least-recently-used cache decorator.
If maxsize is set to None, the LRU features are disabled and the cache can grow without bound.
If typed is True, arguments of different types will be cached separately. For example, f(3.0) and f(3) will be treated as distinct calls with distinct results.
Arguments to the cached function must be hashable.
View the cache statistics named tuple (hits, misses, maxsize, currsize) with f.cache_info(). Clear the cache and statistics with f.cache_clear(). Access the underlying function with f.__wrapped__.
See: http://en.wikipedia.org/wiki/Cache_algorithms#Least_Recently_Used