
True to its name, `funccache` implements function caching: it caches the return value of a callable object or of all the methods defined in a class.


funccache

English | 中文

funccache, as its name suggests, is a framework developed by the GQYLPY team that implements function caching. It can cache the return values of specific functions or all methods defined within a class.

What if you have a function in your program that gets called multiple times and always returns the same value? To improve code efficiency, you might call the function once, store the return value in a variable, and then use that variable instead of repeatedly calling the function. Sound familiar? That's great! But now, we're here to introduce a more elegant solution: using the funccache module to directly cache function return values.

funccache can be used in two ways: as a metaclass, which caches the return values of all methods defined in the classes that use it, or as a decorator, which caches the return values of the decorated functions.

pip3 install funccache

Caching Return Values of Class Methods

import funccache

class Alpha(metaclass=funccache):
    ...

In this case, all methods and property attributes defined in the Alpha class will have their return values cached in the __cache_pool__ attribute after being called once by an instance of the class. Subsequent calls, as long as the parameters remain the same, will directly retrieve values from __cache_pool__ without re-executing the related code, significantly reducing program overhead and improving code readability.
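A minimal sketch of what that looks like in practice (the method name and the artificial delay below are illustrative, not part of the library):

import time
import funccache

class Alpha(metaclass=funccache):

    def slow_square(self, x):
        time.sleep(1)  # simulate an expensive computation
        return x * x

a = Alpha()
a.slow_square(3)  # runs the method, stores the result in __cache_pool__
a.slow_square(3)  # same arguments: returned directly from __cache_pool__
a.slow_square(4)  # different arguments: the method body runs again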

By default, this caching functionality only applies to individual instances, and each instance has its own __cache_pool__ attribute. However, if you want all instances of Alpha to share the same cache, you can enable the __shared_instance_cache__ attribute:

class Alpha(metaclass=funccache):
    __shared_instance_cache__ = True

Setting the class attribute __shared_instance_cache__ = True creates the __cache_pool__ attribute in the Alpha class itself, rather than in each individual instance of Alpha.
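As a rough sketch of the difference (the method name is illustrative), two instances then serve each other's cached results:

class Alpha(metaclass=funccache):
    __shared_instance_cache__ = True

    def load_settings(self):
        ...  # expensive work

a, b = Alpha(), Alpha()
a.load_settings()  # executes once and caches at the class level
b.load_settings()  # served from the shared __cache_pool__ on Alpha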

The cache never expires by default, but you can set a time-to-live (TTL) for the cache using the class attribute __ttl__:

class Alpha(metaclass=funccache):
    __ttl__ = 60
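A small sketch of how this plays out; the unit of __ttl__ is assumed here to be seconds, which the text above does not state:

import time
import funccache

class Alpha(metaclass=funccache):
    __ttl__ = 60  # assumed to be seconds

    def now(self):
        return time.time()

a = Alpha()
first = a.now()
assert a.now() == first  # within the TTL, served from __cache_pool__
# Once the TTL elapses, the cached value is discarded and now() runs again.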

If you want a specific method or property to not be cached, you can add it to the __not_cache__ list:

class Alpha(metaclass=funccache):
    __not_cache__ = [method_obj_or_method_name, ...]
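For example, to keep one method uncached while the rest stay cached (the method names below are hypothetical), list either the method object or its name:

class Alpha(metaclass=funccache):

    def cached_method(self):
        ...

    def fresh_every_time(self):
        ...

    __not_cache__ = [fresh_every_time]      # by method object
    # __not_cache__ = ['fresh_every_time']  # or by method name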

Additionally, subclasses of Alpha will also inherit the caching functionality.
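For instance (a minimal sketch), a subclass needs no extra declaration:

class Beta(Alpha):
    def beta_method(self):
        ...

b = Beta()
b.beta_method()  # cached just like methods defined on Alpha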

Caching Return Values of Functions

import funccache

@funccache
def alpha():
    ...

In this case, the return value of the alpha function will be cached after it is called once. Subsequent calls, as long as the parameters remain the same, will directly retrieve the value from the cache without re-executing the alpha function.
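A minimal sketch of the decorator in use (the function body is illustrative):

import funccache

@funccache
def alpha(x):
    print('computing', x)
    return x * 2

alpha(10)  # prints "computing 10" and caches the result
alpha(10)  # returns 20 from the cache; nothing is printed
alpha(11)  # different arguments, so the function body runs again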

The cache never expires by default, but if you want the cache to expire after a certain period, you can use funccache.ttl:

@funccache.ttl(60)
def alpha():
    ...

You can even cache based on the number of calls using funccache.count:

@funccache.count(3)
def alpha():
    ...

When used to decorate a class, funccache can also provide singleton-like behavior: as long as the instantiation parameters are the same, the same instance is returned:

@funccache
class Alpha:
    ...
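A rough sketch of that behavior: with identical constructor arguments the cached instance is reused, while different arguments still produce a new one.

@funccache
class Alpha:
    def __init__(self, name):
        self.name = name

a1 = Alpha('x')
a2 = Alpha('x')
a3 = Alpha('y')

assert a1 is a2      # same parameters: cached instance reused
assert a1 is not a3  # different parameters: new instance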
