
Library to check equality between two complex/nested objects

Project description

coola


Overview

coola is a Python library that provides simple functions to check, in a single line, whether two complex/nested objects are equal or not. coola was initially designed to work with PyTorch Tensors and NumPy ndarrays, but it can be extended to support other data structures.

Motivation

Let's imagine you have the following dictionaries, each containing both a PyTorch Tensor and a NumPy ndarray, and you want to check whether the two dictionaries are equal. By default, Python does not provide an easy way to do this: the default equality operator == raises an error because the element-wise comparison of tensors or arrays cannot be reduced to a single boolean. The coola library was developed to fill this gap. coola provides a function objects_are_equal that indicates whether two complex/nested objects are equal or not.

>>> import numpy
>>> import torch
>>> from coola import objects_are_equal
>>> data1 = {"torch": torch.ones(2, 3), "numpy": numpy.zeros((2, 3))}
>>> data2 = {"torch": torch.zeros(2, 3), "numpy": numpy.ones((2, 3))}
>>> objects_are_equal(data1, data2)
False
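
By contrast, comparing these dictionaries directly with == fails. The snippet below is a minimal illustration; the exact exception type depends on which value is compared first (PyTorch tensors raise a RuntimeError, NumPy arrays a ValueError):

>>> data1 == data2
Traceback (most recent call last):
    ...
RuntimeError: Boolean value of Tensor with more than one element is ambiguous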

coola also provides a function objects_are_allclose that indicates whether two complex/nested objects are equal within a tolerance or not.

>>> import numpy
>>> import torch
>>> from coola import objects_are_allclose
>>> data1 = {"torch": torch.ones(2, 3), "numpy": numpy.zeros((2, 3))}
>>> data2 = {"torch": torch.zeros(2, 3), "numpy": numpy.ones((2, 3))}
>>> objects_are_allclose(data1, data2, atol=1e-6)
False
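
Continuing the example above, values that differ only by a tiny amount are considered equal within the given tolerance (a minimal sketch, assuming the default relative tolerance is used on top of atol):

>>> objects_are_allclose(
...     {"torch": torch.ones(2, 3), "numpy": numpy.zeros((2, 3))},
...     {"torch": torch.ones(2, 3) + 1e-8, "numpy": numpy.zeros((2, 3)) - 1e-8},
...     atol=1e-6,
... )
True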

coola supports built-in Python types (e.g. list, tuple, dict, set) as well as the data structures of the optional dependencies listed below, such as JAX, NumPy, pandas, polars, PyArrow, PyTorch, and xarray.

Please check the quickstart page to learn more about how to use coola.
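
As an illustration, nested containers that mix the supported types can be compared directly with the same objects_are_equal function shown above (a minimal sketch):

>>> import numpy
>>> import torch
>>> from coola import objects_are_equal
>>> objects_are_equal(
...     [{"x": torch.arange(3), "y": numpy.arange(3)}, "abc"],
...     [{"x": torch.arange(3), "y": numpy.arange(3)}, "abc"],
... )
True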

Documentation

  • latest (stable): documentation from the latest stable release.
  • main (unstable): documentation associated with the main branch of the repo. This documentation may contain work-in-progress, outdated, or missing parts.

Installation

We highly recommend installing coola in a virtual environment. coola can be installed from PyPI using the following command:

pip install coola

To keep the package as slim as possible, only the minimal packages required to use coola are installed by default. To include all the optional dependencies, you can use the following command:

pip install coola[all]

Please check the get started page to see how to install only specific optional dependencies, or for alternative ways to install the library. The table below lists each coola version and the dependency versions it was tested against.

| coola | jax*         | numpy*      | pandas*    | polars*       | pyarrow*     | torch*      | xarray*  | python      |
|-------|--------------|-------------|------------|---------------|--------------|-------------|----------|-------------|
| main  | >=0.4.1,<1.0 | >=1.21,<3.0 | >=1.3,<3.0 | >=0.18.3,<2.0 | >=10.0,<18.0 | >=1.11,<3.0 | >=2023.1 | >=3.9,<3.14 |
| 0.8.5 | >=0.4.1,<1.0 | >=1.21,<3.0 | >=1.3,<3.0 | >=0.18.3,<2.0 | >=10.0,<18.0 | >=1.11,<3.0 | >=2023.1 | >=3.9,<3.14 |
| 0.8.4 | >=0.4.1,<1.0 | >=1.21,<3.0 | >=1.3,<3.0 | >=0.18.3,<2.0 | >=10.0,<18.0 | >=1.11,<3.0 | >=2023.1 | >=3.9,<3.14 |
| 0.8.3 | >=0.4.1,<1.0 | >=1.21,<3.0 | >=1.3,<3.0 | >=0.18.3,<2.0 | >=10.0,<18.0 | >=1.11,<3.0 | >=2023.1 | >=3.9,<3.13 |
| 0.8.2 | >=0.4.1,<1.0 | >=1.21,<3.0 | >=1.3,<3.0 | >=0.18.3,<2.0 | >=10.0,<18.0 | >=1.11,<3.0 | >=2023.1 | >=3.9,<3.13 |
| 0.8.1 | >=0.4.1,<1.0 | >=1.21,<3.0 | >=1.3,<3.0 | >=0.18.3,<2.0 | >=10.0,<18.0 | >=1.11,<3.0 | >=2023.1 | >=3.9,<3.13 |
| 0.8.0 | >=0.4.1,<1.0 | >=1.21,<3.0 | >=1.3,<3.0 | >=0.18.3,<2.0 | >=10.0,<18.0 | >=1.11,<3.0 | >=2023.1 | >=3.9,<3.13 |
| 0.7.4 | >=0.4.1,<1.0 | >=1.21,<3.0 | >=1.3,<3.0 | >=0.18.3,<2.0 | >=10.0,<18.0 | >=1.11,<3.0 | >=2023.1 | >=3.9,<3.13 |
| 0.7.3 | >=0.4.1,<1.0 | >=1.21,<3.0 | >=1.3,<3.0 | >=0.18.3,<2.0 |              | >=1.11,<3.0 | >=2023.1 | >=3.9,<3.13 |
| 0.7.2 | >=0.4.1,<1.0 | >=1.21,<3.0 | >=1.3,<3.0 | >=0.18.3,<2.0 |              | >=1.11,<3.0 | >=2023.1 | >=3.9,<3.13 |
| 0.7.1 | >=0.4.1,<1.0 | >=1.21,<3.0 | >=1.3,<3.0 | >=0.18.3,<1.0 |              | >=1.10,<3.0 | >=2023.1 | >=3.9,<3.13 |
| 0.7.0 | >=0.4.1,<1.0 | >=1.21,<2.0 | >=1.3,<3.0 | >=0.18.3,<1.0 |              | >=1.10,<3.0 | >=2023.1 | >=3.9,<3.13 |

* indicates an optional dependency

Older versions

| coola  | jax*         | numpy*       | pandas*    | polars*        | torch*      | xarray*           | python      |
|--------|--------------|--------------|------------|----------------|-------------|-------------------|-------------|
| 0.6.2  | >=0.4.1,<1.0 | >=1.21,<2.0  | >=1.3,<3.0 | >=0.18.3,<1.0  | >=1.10,<3.0 | >=2023.1          | >=3.9,<3.13 |
| 0.6.1  | >=0.4.1,<1.0 | >=1.21,<2.0  | >=1.3,<3.0 | >=0.18.3,<1.0  | >=1.10,<3.0 | >=2023.1          | >=3.9,<3.13 |
| 0.6.0  | >=0.4.1,<1.0 | >=1.21,<2.0  | >=1.3,<3.0 | >=0.18.3,<1.0  | >=1.10,<3.0 | >=2023.1          | >=3.9,<3.13 |
| 0.5.0  | >=0.4.1,<0.5 | >=1.21,<1.27 | >=1.3,<2.2 | >=0.18.3,<1.0  | >=1.10,<2.2 | >=2023.1,<2023.13 | >=3.9,<3.13 |
| 0.4.0  | >=0.4.1,<0.5 | >=1.21,<1.27 | >=1.3,<2.2 | >=0.18.3,<1.0  | >=1.10,<2.2 | >=2023.1,<2023.13 | >=3.9,<3.13 |
| 0.3.1  | >=0.4.1,<0.5 | >=1.21,<1.27 | >=1.3,<2.2 | >=0.18.3,<1.0  | >=1.10,<2.2 | >=2023.1,<2023.13 | >=3.9,<3.13 |
| 0.3.0  | >=0.4.1,<0.5 | >=1.21,<1.27 | >=1.3,<2.2 | >=0.18.3,<1.0  | >=1.10,<2.2 | >=2023.1,<2023.13 | >=3.9,<3.13 |
| 0.2.2  | >=0.4.1,<0.5 | >=1.21,<1.27 | >=1.3,<2.2 | >=0.18.3,<1.0  | >=1.10,<2.2 | >=2023.1,<2023.13 | >=3.9,<3.13 |
| 0.2.1  | >=0.4.1,<0.5 | >=1.21,<1.27 | >=1.3,<2.2 | >=0.18.3,<1.0  | >=1.10,<2.2 | >=2023.1,<2023.13 | >=3.9,<3.13 |
| 0.2.0  | >=0.4.1,<0.5 | >=1.21,<1.27 | >=1.3,<2.2 | >=0.18.3,<1.0  | >=1.10,<2.2 | >=2023.1,<2023.13 | >=3.9,<3.13 |
| 0.1.2  | >=0.4.1,<0.5 | >=1.21,<1.27 | >=1.3,<2.2 | >=0.18.3,<0.21 | >=1.10,<2.2 | >=2023.1,<2023.13 | >=3.9,<3.13 |
| 0.1.1  | >=0.4.1,<0.5 | >=1.21,<1.27 | >=1.3,<2.2 | >=0.18.3,<0.20 | >=1.10,<2.2 | >=2023.1,<2023.13 | >=3.9,<3.13 |
| 0.1.0  | >=0.4.1,<0.5 | >=1.21,<1.27 | >=1.3,<2.2 | >=0.18.3,<0.20 | >=1.10,<2.2 | >=2023.1,<2023.13 | >=3.9,<3.12 |
| 0.0.26 | >=0.4.1,<0.5 | >=1.21,<1.27 | >=1.3,<2.2 | >=0.18.3,<0.20 | >=1.10,<2.2 | >=2023.1,<2023.13 | >=3.9,<3.12 |
| 0.0.25 | >=0.4.1,<0.5 | >=1.21,<1.27 | >=1.3,<2.2 | >=0.18.3,<0.20 | >=1.10,<2.2 | >=2023.4,<2023.11 | >=3.9,<3.12 |
| 0.0.24 | >=0.3,<0.5   | >=1.21,<1.27 | >=1.3,<2.2 | >=0.18.3,<0.20 | >=1.10,<2.2 | >=2023.3,<2023.9  | >=3.9,<3.12 |
| 0.0.23 | >=0.3,<0.5   | >=1.21,<1.27 | >=1.3,<2.2 | >=0.18.3,<0.20 | >=1.10,<2.1 | >=2023.3,<2023.9  | >=3.9,<3.12 |
| 0.0.22 | >=0.3,<0.5   | >=1.20,<1.26 | >=1.3,<2.1 | >=0.18.3,<0.19 | >=1.10,<2.1 | >=2023.3,<2023.9  | >=3.9,<3.12 |
| 0.0.21 | >=0.3,<0.5   | >=1.20,<1.26 | >=1.3,<2.1 | >=0.18.3,<0.19 | >=1.10,<2.1 | >=2023.3,<2023.8  | >=3.9,<3.12 |
| 0.0.20 | >=0.3,<0.5   | >=1.20,<1.26 | >=1.3,<2.1 | >=0.18.3,<0.19 | >=1.10,<2.1 | >=2023.3,<2023.8  | >=3.9       |

Contributing

Please check the instructions in CONTRIBUTING.md.

Suggestions and Communication

Everyone is welcome to contribute to the community. If you have any questions or suggestions, you can open a GitHub issue. We will reply to you as soon as possible. Thank you very much.

API stability

:warning: While coola is under active development, no API is guaranteed to be stable from one release to the next. In fact, the API will very likely change multiple times before a stable 1.0.0 release. In practice, this means that upgrading coola to a new version may break any code that was written against the old version of coola.

License

coola is licensed under the BSD 3-Clause "New" or "Revised" license, available in the LICENSE file.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

coola-0.8.5.tar.gz (46.9 kB)

Uploaded Source

Built Distribution

coola-0.8.5-py3-none-any.whl (83.0 kB)

Uploaded Python 3

File details

Details for the file coola-0.8.5.tar.gz.

File metadata

  • Download URL: coola-0.8.5.tar.gz
  • Upload date:
  • Size: 46.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.4 CPython/3.12.7 Linux/6.5.0-1025-azure

File hashes

Hashes for coola-0.8.5.tar.gz
Algorithm Hash digest
SHA256 63dfb9fa1f8fc2c6ed7de817d660352c16db7bd84dda1d4f9e98aa535c3199d6
MD5 c08b84290b8bb472795d95f36f348535
BLAKE2b-256 fea276a247862fdbe0bcafc50119a15eac3dd04862c655a05128634c32ba400f

See more details on using hashes here.

File details

Details for the file coola-0.8.5-py3-none-any.whl.

File metadata

  • Download URL: coola-0.8.5-py3-none-any.whl
  • Upload date:
  • Size: 83.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.4 CPython/3.12.7 Linux/6.5.0-1025-azure

File hashes

Hashes for coola-0.8.5-py3-none-any.whl
Algorithm Hash digest
SHA256 d4f218fbb052f5352977e7a6d7ba805de1d41e388ad5236c617dfa0d4144ca99
MD5 539a43636a265d5a072ebb58d10b34bc
BLAKE2b-256 4961603cc874391c7f6c7f60dc5c6f1d4f1daee1c9d65673c87473cdf48b172f

See more details on using hashes here.
