A simple Python cache which supports item-level expiration.
The cache uses a dictionary to store key-value pairs. Each value is stored as a tuple of the value and its expiration time, where the expiration time is computed by adding the TTL to the current time.
This cache can be used in various scenarios where caching is required, such as improving performance in data-intensive applications or reducing API calls.
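As a rough sketch of that storage scheme (an illustration only, not the code in this repo; the actual class, method names, and clock may differ, and `time.monotonic()` is assumed here):

```python
import time

class SimpleTTLDict:
    """Illustrative sketch only -- not the class shipped in this repo."""

    def __init__(self, ttl=10.0):
        self.ttl = ttl
        self._store = {}  # key -> (value, expires_at)

    def set(self, key, value, ttl=None):
        # The expiration time is the current time plus the TTL, per item.
        expires_at = time.monotonic() + (ttl if ttl is not None else self.ttl)
        self._store[key] = (value, expires_at)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            # The item has outlived its TTL: evict it and treat the lookup as a miss.
            del self._store[key]
            return default
        return value
```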
Clone this repository and import the cache class into your project.
```bash
git clone https://github.com/llorrac1/python-ttl-cache.git
```

Import with either

```python
from ttl_cache.cache import ttl_cache
```

or

```python
from ttl_cache import TTLCache
```

Importing `ttl_cache` from `ttl_cache.cache` lets you use it as a decorator with no additional setup.
```python
from ttl_cache.cache import ttl_cache

mycache = ttl_cache(maxsize=5, ttl=10)

@mycache
def fib(n):
    if n < 2:
        return n
    return fib(n-1) + fib(n-2)

if __name__ == '__main__':
    for i in range(10):
        print(fib(i))
    print(mycache.get_cache_info())
    print(mycache.clear())
    print(mycache.get_cache_info())
```

I created this project to address the need for a simple Python cache that supports item-level expiration. By storing key-value pairs in a dictionary and calculating each item's expiration time from its TTL (time to live), the cache provides a convenient way to manage and retrieve cached data efficiently. It can be used wherever caching is required, such as improving performance in data-intensive applications or reducing API calls. Although you could use the built-in functools.lru_cache or the third-party cachetools library, neither provides item-level TTL out of the box, so this project offers a simple and lightweight alternative.
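For comparison, a common workaround with functools.lru_cache is to fold a coarse time bucket into the cache key, which expires whole buckets of entries at once rather than individual items; that is exactly the gap described above. The names below (get_ttl_hash, fetch_resource) are illustrative and not part of this repo or of functools:

```python
import functools
import time

def get_ttl_hash(seconds=10):
    """Return the same value within any `seconds`-long window."""
    return int(time.time() // seconds)

@functools.lru_cache(maxsize=128)
def fetch_resource(url, ttl_hash=0):
    # `ttl_hash` is never used in the body; it only changes the cache key so that
    # results cached in an older time window stop being returned.
    return f"response for {url}"

# Cached for roughly 10 seconds, then recomputed -- all entries roll over together.
print(fetch_resource("https://example.com/data", ttl_hash=get_ttl_hash(10)))
```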