Lazy tree evaluation cache library.
The library implements a cache of dependent lazy computations for pure, time-consuming computational tasks, such as symbolic transformations and geometric or numerical algorithms.
The goal of the library is to save the result of a computation once it has been performed and to load it again when needed, saving computing resources. The hashkey of a computed object is built from the input data that parameterizes it, which makes it possible to track changes in the arguments of a lazy algorithm and redo the necessary computations when conditions change. If a lazy object is used as an argument or as a generating function, its hashkey serves as its hash. This allows a dependent computation tree to be built: if the input data of an object changes, its hashkey changes, along with the hashkeys of all objects computed from it, and that subtree is re-evaluated.
Since the library saves every computed object in the cache, including intermediate ones, it can pick up changes in the computation tree at any step. Previously computed data are reused whenever they still apply to the new computation tree. This makes it unnecessary to stage heavy preliminary computations in separate files: they are loaded transparently, and results for slightly different input parameters can be compared without recomputing everything.
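The hashkey propagation described above can be sketched in a few lines. This is a hypothetical illustration of the idea, not the library's actual algorithm: the choice of sha256 and the repr-based encoding of arguments are assumptions.

```python
import hashlib

def hashkey(funcname, *args):
    """Build a hex digest from a function name and its arguments.

    A lazy argument contributes its own hashkey, so a change anywhere
    upstream changes every downstream key.
    """
    h = hashlib.sha256()
    h.update(funcname.encode())
    for a in args:
        h.update(repr(a).encode())
    return h.hexdigest()

# A two-step "tree": step2 depends on step1 through step1's key.
k1 = hashkey("step1", 1, 2)
k2 = hashkey("step2", k1)

# Changing an input of step1 changes k1, and therefore k2 as well,
# so the whole dependent subtree would be recomputed.
k1_changed = hashkey("step1", 1, 3)
k2_changed = hashkey("step2", k1_changed)
assert k1 != k1_changed and k2 != k2_changed
```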
python3 -m pip install evalcache
import evalcache

lazy = evalcache.Lazy(cache = evalcache.DirCache(".evalcache"))

@lazy
def func(a, b, c):
    return do_something(a, b, c)

lazyresult = func(1, 2, 3)
result = lazyresult.unlazy()  # alternative: result = evalcache.unlazy(lazyresult)
This example shows the basic classes and objects. You start by instantiating "evalcache.Lazy", which takes a "cache" parameter: a dict-like object that stores and loads the evaluation results. The "Lazy" instance "lazy" can be used as a decorator to create lazy objects, so the decorated function "func" is a LazyObject. "func" can in turn generate other lazy objects, such as "lazyresult", through its callable interface. To obtain the evaluation result, use the "unlazy" method.
We can visualize cache operations:
lazy = evalcache.Lazy(cache = cache, diag = True)
In this mode, when you call unlazy, you will see console output:
endp - get an endpoint object.
fget - get a variable from the local object store.
load - load a previously stored value from the cache.
save - evaluation executed and value stored.
eval - evaluation executed without storing.
You can choose an algorithm from hashlib or provide a user-defined hashlib-like algorithm.
lazy = evalcache.Lazy(cache = cache, algo = hashlib.sha512)
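A user-defined algorithm would mimic the hashlib interface. The sketch below assumes evalcache only relies on a constructor plus update() and hexdigest(); that interface is an assumption on my part, not a documented contract, and the CRC32-based class is purely illustrative.

```python
import zlib

class Crc32Algo:
    """Hypothetical hashlib-like algorithm built on zlib.crc32.

    Exposes update() and hexdigest() like the hashlib constructors do.
    """

    def __init__(self, data=b""):
        self.value = zlib.crc32(data)

    def update(self, data):
        # crc32 accepts a running value, so updates can be chained.
        self.value = zlib.crc32(data, self.value)

    def hexdigest(self):
        return format(self.value, "08x")

# Would be passed as: evalcache.Lazy(cache = cache, algo = Crc32Algo)
a = Crc32Algo()
a.update(b"hello")
```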
DirCache is a dict-like object that uses pickle to store values in files named after their keys. It is a very simple cache and can be replaced with a more advanced option if needed.
lazy = evalcache.Lazy(cache = evalcache.DirCache(".evalcache"))
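Because the cache is just a dict-like object, you can supply your own. The following minimal sketch works in the spirit of DirCache, assuming the cache only needs to support __contains__, __getitem__ and __setitem__ (an assumption about the required interface, not the library's documented contract).

```python
import os
import pickle
import tempfile

class SimpleDirCache:
    """Hypothetical dict-like cache: each key names a file,
    each value is pickled into that file."""

    def __init__(self, dirpath):
        self.dirpath = dirpath
        os.makedirs(dirpath, exist_ok=True)

    def _path(self, key):
        return os.path.join(self.dirpath, key)

    def __contains__(self, key):
        return os.path.exists(self._path(key))

    def __getitem__(self, key):
        with open(self._path(key), "rb") as f:
            return pickle.load(f)

    def __setitem__(self, key, value):
        with open(self._path(key), "wb") as f:
            pickle.dump(value, f)

# Would be passed as: evalcache.Lazy(cache = SimpleDirCache(".evalcache"))
cache = SimpleDirCache(tempfile.mkdtemp())
cache["abc123"] = [1, 2, 3]
```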