
collections-cache is a Python package for managing data collections across multiple SQLite databases. It allows efficient storage, retrieval, and updating of key-value pairs, supporting various data types serialized with pickle. The package uses parallel processing for fast access and manipulation of large collections.

Project description

collections-cache 🚀

collections-cache is a fast and scalable key–value caching solution built on top of SQLite. It allows you to store, update, and retrieve data using unique keys, and it supports complex Python data types (thanks to pickle). Designed to harness the power of multiple CPU cores, the library shards data across multiple SQLite databases, enabling impressive performance scaling.


Features ✨

  • Multiple SQLite Databases: Distributes your data across several databases to optimize I/O and take advantage of multi-core systems (the idea is sketched just after this list).
  • Key–Value Store: Simple and intuitive interface for storing and retrieving data.
  • Supports Complex Data Types: Serialize and store lists, dictionaries, objects, and more using pickle.
  • Parallel Processing: Uses Python’s multiprocessing and concurrent.futures modules to perform operations in parallel.
  • Efficient Data Retrieval: Caches all keys in memory for super-fast lookups.
  • Cross-Platform: Runs on Linux, macOS, and Windows.
  • Performance Scaling: Benchmarks show near-linear scaling with the number of real CPU cores.
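
To make the sharding idea concrete, here is a minimal, hypothetical sketch of hash-based shard selection. It is not the library's actual implementation; the shard count, directory layout, and hash function are assumptions for illustration only.

import hashlib
import os

# Hypothetical sketch: map each key to one SQLite database file by hashing
# the key. The shard count and file layout here are assumptions; the real
# library decides these internally (e.g. based on available CPU cores).
NUM_SHARDS = 4
SHARD_DIR = "STORE"

def shard_for(key: str) -> str:
    """Deterministically map a key to one of the shard database files."""
    digest = hashlib.sha256(key.encode("utf-8")).hexdigest()
    shard_id = int(digest, 16) % NUM_SHARDS
    return os.path.join(SHARD_DIR, f"shard_{shard_id}.db")

# Different keys land on different shards, so reads and writes can run in
# parallel across databases and CPU cores.
print(shard_for("products"))
print(shard_for("users"))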

Installation 📦

Use Poetry to install and manage dependencies:

  1. Clone the repository:

    git clone https://github.com/Luiz-Trindade/collections_cache.git
    cd collections_cache
    
  2. Install the package with Poetry:

    poetry install
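
The package is also published on PyPI (see the files listed at the bottom of this page), so it should be installable with pip as well:

    pip install collections-cache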
    

Usage ⚙️

Simply import and start using the main class, Collection_Cache, to interact with your collection:

Basic Example

from collections_cache import Collection_Cache

# Create a new collection named "STORE"
cache = Collection_Cache("STORE")

# Set a key-value pair
cache.set_key("products", ["apple", "orange", "onion"])

# Retrieve the value by key
products = cache.get_key("products")
print(products)  # Output: ['apple', 'orange', 'onion']

Bulk Insertion Example

For faster insertions, accumulate your data and use set_multi_keys:

from collections_cache import Collection_Cache
from random import uniform, randint
from time import time

cache = Collection_Cache("web_cache")
insertions = 100_000
data = {}

# Generate random key-value pairs
for i in range(insertions):
    key = str(uniform(0.0, 100.0))
    value = "some text :)" * randint(1, 100)
    data[key] = value

# Bulk insert all pairs in parallel and measure how long it takes
start = time()
cache.set_multi_keys(data)
elapsed = time() - start

print(f"Inserted {len(cache.keys())} keys in {elapsed:.2f} seconds!")

Performance Benchmark 📊

After optimizing SQLite settings (including setting synchronous = OFF), the library shows a significant performance improvement: insertion throughput increased dramatically, allowing much faster writes and better scalability.

Benchmark Results

For 100,000 insertions:

  • Previous performance: ~797 insertions per second.
  • Optimized performance: ~6,657 insertions per second after disabling SQLite's synchronous writes (PRAGMA synchronous = OFF), reducing the total insertion time from about 125 seconds to 15.02 seconds.

Performance Scaling

With the optimized configuration, the library scales nearly linearly with the number of CPU cores. For example:

  • 4 cores: ~6,657 insertions per second.
  • 8 cores: ~13,300 insertions per second.
  • 16 cores: ~26,600 insertions per second.
  • 32 cores: ~53,200 insertions per second.
  • 128 cores: ~212,000 insertions per second (extrapolated, not measured).

Note: Actual performance may vary depending on system architecture, disk I/O, and specific workload, but benchmarks indicate a substantial increase in insertion rate as the number of CPU cores increases.


API Overview 📚

  • set_key(key, value): Stores a key–value pair. Updates the value if the key already exists.
  • set_multi_keys(key_and_value): (Experimental) Inserts multiple key–value pairs in parallel.
  • get_key(key): Retrieves the value associated with a given key.
  • delete_key(key): Removes a key and its corresponding value.
  • keys(): Returns a list of all stored keys.
  • export_to_json(): (Future feature) Exports your collection to a JSON file.
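
The snippet below walks through these calls based only on the descriptions above (export_to_json is omitted since it is not yet available); the results noted in the comments are what the descriptions imply, not verified output:

from collections_cache import Collection_Cache

cache = Collection_Cache("STORE")

# set_key stores a value; calling it again with the same key updates it
cache.set_key("config", {"theme": "dark"})
cache.set_key("config", {"theme": "light"})
print(cache.get_key("config"))   # expected: {'theme': 'light'}

# keys() lists every stored key
print(cache.keys())              # expected to include 'config'

# delete_key removes the key and its value
cache.delete_key("config")
print("config" in cache.keys())  # expected: False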

Development & Contributing 👩‍💻👨‍💻

To contribute or run tests:

  1. Install the project, including development dependencies (recent Poetry versions install the dev group by default):

    poetry install
    
  2. Run tests using:

    poetry run pytest
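
A minimal example test might look like the sketch below; the file name and assertions are hypothetical and not part of the project's existing test suite:

# tests/test_collection_cache_example.py -- hypothetical example test
from collections_cache import Collection_Cache

def test_set_and_get_key():
    cache = Collection_Cache("test_store")
    cache.set_key("answer", 42)
    assert cache.get_key("answer") == 42
    assert "answer" in cache.keys()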
    

Feel free to submit issues, pull requests, or feature suggestions. Your contributions help make collections-cache even better!


License 📄

This project is licensed under the MIT License. See the LICENSE file for details.


Acknowledgements 🙌

  • Inspired by the need for efficient, multi-core caching with SQLite.
  • Created by Luiz Trindade.
  • Thanks to all contributors and users who provide feedback to keep improving the library!

Give collections-cache a try and let it power your high-performance caching needs! 🚀

Download files

Download the file for your platform.

Source Distribution

collections_cache-0.3.5.20250420.tar.gz (5.4 kB)

Built Distribution

collections_cache-0.3.5.20250420-py3-none-any.whl (6.5 kB)

File details

Details for the file collections_cache-0.3.5.20250420.tar.gz.

File metadata

  • Download URL: collections_cache-0.3.5.20250420.tar.gz
  • Upload date:
  • Size: 5.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.1.1 CPython/3.12.3 Linux/6.11.0-24-generic

File hashes

Hashes for collections_cache-0.3.5.20250420.tar.gz:

  • SHA256: a16f6162731ef91776dac9fb769aa2a2998db2d3b2d84c41bb31fdf676f7ae01
  • MD5: d349cc42e1872eb13505f295fc3928aa
  • BLAKE2b-256: 47f3b5129dbd503f176e847bc36209069347c556d3eeb8d05840c37759f6c8d9


File details

Details for the file collections_cache-0.3.5.20250420-py3-none-any.whl.

File hashes

Hashes for collections_cache-0.3.5.20250420-py3-none-any.whl:

  • SHA256: 4e3791546f7e597dd59873c12ff48961a85f6b618cbf5def7eb5b8dedd0ae756
  • MD5: 2e6cf7c6fbfcf578c21b4598ce095a9e
  • BLAKE2b-256: acc41de844689cc29cdbc26c1c9fac95cf1266bf958a3a93e4f067af938cfa8b

