
Optimized PyTree Utilities.

Reason this release was yanked: old version

Project description

OpTree


Optimized PyTree Utilities.


Table of Contents

  • Installation
  • PyTrees
  • Benchmark
  • Changelog
  • License

Installation

Install from PyPI (PyPI / Status):

pip3 install --upgrade optree

Install from conda-forge (conda-forge):

conda install -c conda-forge optree

Install the latest version from GitHub:

pip3 install git+https://github.com/metaopt/optree.git#egg=optree

Or, clone this repo and install manually:

git clone --depth=1 https://github.com/metaopt/optree.git
cd optree
pip3 install .

Compiling from source requires Python 3.7+, a compiler (gcc / clang / icc / cl.exe) that supports C++20, and a CMake installation.
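
After installation, a quick sanity check (a minimal sketch; the printed version string depends on the release you installed) confirms that the package imports correctly:

python3 -c 'import optree; print(optree.__version__)'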


PyTrees

A PyTree is a recursive structure that can be an arbitrarily nested Python container (e.g., tuple, list, dict, OrderedDict, NamedTuple, etc.) or an opaque Python object. The key concepts of tree operations are tree flattening and its inverse (tree unflattening). Additional tree operations can be performed based on these two basic functions (e.g., tree_map = tree_unflatten ∘ map ∘ tree_flatten).
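
As a sketch of that composition (illustrative only, not the actual implementation of optree.tree_map), a naive tree map can be written directly in terms of tree_flatten and tree_unflatten:

import optree

def naive_tree_map(func, tree):
    # Flatten into (leaves, treespec), transform the leaves,
    # then rebuild a tree with the same structure from the new leaves.
    leaves, treespec = optree.tree_flatten(tree)
    return optree.tree_unflatten(treespec, [func(leaf) for leaf in leaves])

>>> naive_tree_map(lambda x: x + 1, {'a': 1, 'b': (2, [3, 4])})
{'a': 2, 'b': (3, [4, 5])}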

Tree flattening is traversing the entire tree in a left-to-right depth-first manner and returning the leaves of the tree in a deterministic order.

>>> tree = {'b': (2, [3, 4]), 'a': 1, 'c': 5, 'd': 6}
>>> optree.tree_flatten(tree)
([1, 2, 3, 4, 5, 6], PyTreeSpec({'a': *, 'b': (*, [*, *]), 'c': *, 'd': *}))
>>> optree.tree_flatten(1)
([1], PyTreeSpec(*))
>>> optree.tree_flatten(None)
([], PyTreeSpec(None))

This usually implies that equal pytrees produce equal lists of leaves and equal tree structures. See also section Key Ordering for Dictionaries.

>>> {'a': [1, 2], 'b': [3]} == {'b': [3], 'a': [1, 2]}
True
>>> optree.tree_leaves({'a': [1, 2], 'b': [3]}) == optree.tree_leaves({'b': [3], 'a': [1, 2]})
True
>>> optree.tree_structure({'a': [1, 2], 'b': [3]}) == optree.tree_structure({'b': [3], 'a': [1, 2]})
True

Tree Nodes and Leaves

A tree is a collection of non-leaf nodes and leaf nodes, where the leaf nodes have no children to flatten. optree.tree_flatten(...) will flatten the tree and return a list of leaf nodes, while the non-leaf nodes are stored in the tree specification.
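
For example, the leaf values go into the returned list while the container structure is recorded in the PyTreeSpec, and the two can be recombined with optree.tree_unflatten:

>>> leaves, treespec = optree.tree_flatten({'a': (1, 2), 'b': [3]})
>>> leaves
[1, 2, 3]
>>> treespec
PyTreeSpec({'a': (*, *), 'b': [*]})
>>> optree.tree_unflatten(treespec, leaves)
{'a': (1, 2), 'b': [3]}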

Built-in PyTree Node Types

OpTree supports the following Python container types in the registry out of the box: tuple, list, dict, collections.namedtuple and its subclasses, collections.OrderedDict, collections.defaultdict, and collections.deque.

These types are considered non-leaf nodes in the tree. Python objects whose type is not registered are treated as leaf nodes. The registration lookup uses the is operator to determine whether the type matches, so subclasses need to be registered explicitly; otherwise, an object of a subclass type will be considered a leaf. The NoneType is a special case discussed in section None is Non-leaf Node vs. None is Leaf.
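
For instance (a minimal sketch with a hypothetical MyMapping subclass), an unregistered subclass of dict is treated as a single leaf rather than being flattened:

>>> class MyMapping(dict):
...     pass
...
>>> optree.tree_leaves({'a': 1, 'b': 2})     # `dict` is a registered non-leaf node type
[1, 2]
>>> optree.tree_leaves(MyMapping(a=1, b=2))  # unregistered subclass is a leaf
[{'a': 1, 'b': 2}]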

Registering a Container-like Custom Type as Non-leaf Nodes

A container-like Python type can be registered in the type registry with a pair of functions that specify:

  • flatten_func(container) -> (children, metadata, entries): convert an instance of the container type to a (children, metadata, entries) triple, where children is an iterable of subtrees and entries is an iterable of path entries of the container (e.g., indices or keys).
  • unflatten_func(metadata, children) -> container: convert such a pair back to an instance of the container type.

The metadata is any auxiliary data, apart from the children, that is needed to reconstruct the container, e.g., the keys of a dictionary (the children are the values).

The entries can be omitted (the flatten function returns only a pair) or be None (optional to implement). In that case, range(len(children)) (i.e., flat indices) is used as the path entries of the current node. The function signature can be either flatten_func(container) -> (children, metadata) or flatten_func(container) -> (children, metadata, None).

The following examples show how to register custom types and utilize them for tree_flatten and tree_map. Please refer to section Notes about the PyTree Type Registry for more information.

# Register a Python type with lambda functions
optree.register_pytree_node(
    set,
    # (set) -> (children, metadata, None)
    lambda s: (sorted(s), None, None),
    # (metadata, children) -> (set)
    lambda _, children: set(children),
    namespace='set',
)

# Register a Python type into a namespace
import torch

optree.register_pytree_node(
    torch.Tensor,
    # (tensor) -> (children, metadata)
    flatten_func=lambda tensor: (
        (tensor.cpu().numpy(),),
        dict(dtype=tensor.dtype, device=tensor.device, requires_grad=tensor.requires_grad),
    ),
    # (metadata, children) -> tensor
    unflatten_func=lambda metadata, children: torch.tensor(children[0], **metadata),
    namespace='torch2numpy',
)
>>> tree = {'weight': torch.ones(size=(1, 2)).cuda(), 'bias': torch.zeros(size=(2,))}
>>> tree
{'weight': tensor([[1., 1.]], device='cuda:0'), 'bias': tensor([0., 0.])}

# Flatten without specifying the namespace
>>> optree.tree_flatten(tree)  # `torch.Tensor`s are leaf nodes
([tensor([0., 0.]), tensor([[1., 1.]], device='cuda:0')], PyTreeSpec({'bias': *, 'weight': *}))

# Flatten with the namespace
>>> leaves, treespec = optree.tree_flatten(tree, namespace='torch2numpy')
>>> leaves, treespec
(
    [array([0., 0.], dtype=float32), array([[1., 1.]], dtype=float32)],
    PyTreeSpec(
        {
            'bias': CustomTreeNode(Tensor[{'dtype': torch.float32, 'device': device(type='cpu'), 'requires_grad': False}], [*]),
            'weight': CustomTreeNode(Tensor[{'dtype': torch.float32, 'device': device(type='cuda', index=0), 'requires_grad': False}], [*])
        },
        namespace='torch2numpy'
    )
)

# `entries` are not defined and use `range(len(children))`
>>> optree.tree_paths(tree, namespace='torch2numpy')
[('bias', 0), ('weight', 0)]

# Unflatten back to a copy of the original object
>>> optree.tree_unflatten(treespec, leaves)
{'bias': tensor([0., 0.]), 'weight': tensor([[1., 1.]], device='cuda:0')}

Users can also extend the pytree registry by decorating the custom class and defining an instance method tree_flatten and a class method tree_unflatten.

from collections import UserDict

@optree.register_pytree_node_class(namespace='mydict')
class MyDict(UserDict):
    def tree_flatten(self):  # -> (children, metadata, entries)
        reversed_keys = sorted(self.keys(), reverse=True)
        return (
            [self[key] for key in reversed_keys],  # children
            reversed_keys,  # metadata
            reversed_keys,  # entries
        )

    @classmethod
    def tree_unflatten(cls, metadata, children):
        return cls(zip(metadata, children))
>>> tree = MyDict(b=4, a=(2, 3), c=MyDict({'d': 5, 'f': 6}))

# Flatten without specifying the namespace
>>> optree.tree_flatten_with_path(tree)  # `MyDict`s are leaf nodes
(
    [()],
    [MyDict(b=4, a=(2, 3), c=MyDict({'d': 5, 'f': 6}))],
    PyTreeSpec(*)
)

# Flatten with the namespace
>>> optree.tree_flatten_with_path(tree, namespace='mydict')
(
    [('c', 'f'), ('c', 'd'), ('b',), ('a', 0), ('a', 1)],
    [6, 5, 4, 2, 3],
    PyTreeSpec(
        CustomTreeNode(MyDict[['c', 'b', 'a']], [CustomTreeNode(MyDict[['f', 'd']], [*, *]), *, (*, *)]),
        namespace='mydict'
    )
)

Notes about the PyTree Type Registry

There are several key attributes of the pytree type registry:

  1. The type registry is per-interpreter. This means registering a custom type in the registry affects all modules that use OpTree in that interpreter.

    - !!! WARNING !!!
      For safety reasons, a `namespace` must be specified while registering a custom type. It is
      used to isolate the behavior of flattening and unflattening a pytree node type. This is to
      prevent accidental collisions between different libraries that may register the same type.
    
  2. The elements in the type registry are immutable. Users can neither register the same type twice in the same namespace (i.e., update the type registry) nor remove a type from the type registry. To update the behavior of an already registered type, simply register it again with another namespace (see the sketch after this list).

  3. Users cannot modify the behavior of already registered built-in types listed in Built-in PyTree Node Types, such as key order sorting for dict and collections.defaultdict.

  4. Inherited subclasses are not implicitly registered. The registration lookup uses type(obj) is registered_type rather than isinstance(obj, registered_type). Users need to register the subclasses explicitly. To register all subclasses automatically, this is easy to implement with a metaclass or __init_subclass__, for example:

    from collections import UserDict
    
    @optree.register_pytree_node_class(namespace='mydict')
    class MyDict(UserDict):
        def __init_subclass__(cls):  # define this in the base class
            super().__init_subclass__()
            # Register a subclass to namespace 'mydict'
            optree.register_pytree_node_class(cls, namespace='mydict')
    
        def tree_flatten(self):  # -> (children, metadata, entries)
            reversed_keys = sorted(self.keys(), reverse=True)
            return (
                [self[key] for key in reversed_keys],  # children
                reversed_keys,  # metadata
                reversed_keys,  # entries
            )
    
        @classmethod
        def tree_unflatten(cls, metadata, children):
            return cls(zip(metadata, children))
    
    # Subclasses will be automatically registered in namespace 'mydict'
    class MyAnotherDict(MyDict):
        pass
    
    >>> tree = MyDict(b=4, a=(2, 3), c=MyAnotherDict({'d': 5, 'f': 6}))
    >>> optree.tree_flatten_with_path(tree, namespace='mydict')
    (
        [('c', 'f'), ('c', 'd'), ('b',), ('a', 0), ('a', 1)],
        [6, 5, 4, 2, 3],
        PyTreeSpec(
            CustomTreeNode(MyDict[['c', 'b', 'a']], [CustomTreeNode(MyAnotherDict[['f', 'd']], [*, *]), *, (*, *)]),
            namespace='mydict'
        )
    )
    
  5. Be careful of potential infinite recursion in the custom flatten function. The children returned by the custom flatten function are considered subtrees and will be flattened further, recursively. The children can have the same type as the current node, so users must design their termination condition carefully.

    import numpy as np
    import torch
    
    optree.register_pytree_node(
        np.ndarray,
        # Children are nested lists of Python objects
        lambda array: (np.atleast_1d(array).tolist(), array.ndim == 0),
        lambda scalar, rows: np.asarray(rows) if not scalar else np.asarray(rows[0]),
        namespace='numpy1',
    )
    
    optree.register_pytree_node(
        np.ndarray,
        # Children are Python objects
        lambda array: (
            list(array.ravel()),  # list(1DArray[T]) -> List[T]
            dict(shape=array.shape, dtype=array.dtype)
        ),
        lambda metadata, children: np.asarray(children, dtype=metadata['dtype']).reshape(metadata['shape']),
        namespace='numpy2',
    )
    
    optree.register_pytree_node(
        np.ndarray,
        # Returns a list of `np.ndarray`s without termination condition
        lambda array: ([array.ravel()], array.dtype),
        lambda shape, children: children[0].reshape(shape),
        namespace='numpy3',
    )
    
    optree.register_pytree_node(
        torch.Tensor,
        # Children are nested lists of Python objects
        lambda tensor: (torch.atleast_1d(tensor).tolist(), tensor.ndim == 0),
        lambda scalar, rows: torch.tensor(rows) if not scalar else torch.tensor(rows[0]),
        namespace='torch1',
    )
    
    optree.register_pytree_node(
        torch.Tensor,
        # Returns a list of `torch.Tensor`s without termination condition
        lambda tensor: (
            list(tensor.view(-1)),  # list(1DTensor[T]) -> List[0DTensor[T]] (STILL TENSORS!)
            tensor.shape
        ),
        lambda shape, children: torch.stack(children).reshape(shape),
        namespace='torch2',
    )
    
    >>> optree.tree_flatten(np.arange(9).reshape(3, 3), namespace='numpy1')
    (
        [0, 1, 2, 3, 4, 5, 6, 7, 8],
        PyTreeSpec(
            CustomTreeNode(ndarray[False], [[*, *, *], [*, *, *], [*, *, *]]),
            namespace='numpy1'
        )
    )
    # Implicitly casts `float`s to `np.float64`
    >>> optree.tree_map(lambda x: x + 1.5, np.arange(9).reshape(3, 3), namespace='numpy1')
    array([[1.5, 2.5, 3.5],
           [4.5, 5.5, 6.5],
           [7.5, 8.5, 9.5]])
    
    >>> optree.tree_flatten(np.arange(9).reshape(3, 3), namespace='numpy2')
    (
        [0, 1, 2, 3, 4, 5, 6, 7, 8],
        PyTreeSpec(
            CustomTreeNode(ndarray[{'shape': (3, 3), 'dtype': dtype('int64')}], [*, *, *, *, *, *, *, *, *]),
            namespace='numpy2'
        )
    )
    # Explicitly casts `float`s to `np.int64`
    >>> optree.tree_map(lambda x: x + 1.5, np.arange(9).reshape(3, 3), namespace='numpy2')
    array([[1, 2, 3],
           [4, 5, 6],
           [7, 8, 9]])
    
    # Children are also `np.ndarray`s, recurse without termination condition.
    >>> optree.tree_flatten(np.arange(9).reshape(3, 3), namespace='numpy3')
    Traceback (most recent call last):
        ...
    RecursionError: Maximum recursion depth exceeded during flattening the tree.
    
    >>> optree.tree_flatten(torch.arange(9).reshape(3, 3), namespace='torch1')
    (
        [0, 1, 2, 3, 4, 5, 6, 7, 8],
        PyTreeSpec(
            CustomTreeNode(Tensor[False], [[*, *, *], [*, *, *], [*, *, *]]),
            namespace='torch1'
        )
    )
    # Implicitly casts `float`s to `torch.float32`
    >>> optree.tree_map(lambda x: x + 1.5, torch.arange(9).reshape(3, 3), namespace='torch1')
    tensor([[1.5000, 2.5000, 3.5000],
            [4.5000, 5.5000, 6.5000],
            [7.5000, 8.5000, 9.5000]])
    
    # Children are also `torch.Tensor`s, recurse without termination condition.
    >>> optree.tree_flatten(torch.arange(9).reshape(3, 3), namespace='torch2')
    Traceback (most recent call last):
        ...
    RecursionError: Maximum recursion depth exceeded during flattening the tree.
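
As noted in item 2 above, registered entries cannot be modified in place, but the same type can be registered again under a different namespace to change its behavior. A minimal sketch (hypothetical namespaces 'set-asc' and 'set-desc'):

import optree

# Two registrations of the same type in different namespaces coexist;
# the namespace passed at flatten time selects which behavior is used.
optree.register_pytree_node(
    set,
    lambda s: (sorted(s), None),                 # children in ascending order
    lambda _, children: set(children),
    namespace='set-asc',
)
optree.register_pytree_node(
    set,
    lambda s: (sorted(s, reverse=True), None),   # children in descending order
    lambda _, children: set(children),
    namespace='set-desc',
)

>>> optree.tree_leaves({3, 1, 2}, namespace='set-asc')
[1, 2, 3]
>>> optree.tree_leaves({3, 1, 2}, namespace='set-desc')
[3, 2, 1]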
    

None is Non-leaf Node vs. None is Leaf

The None object is a special object in the Python language. It serves some of the same purposes as null (a pointer that does not point to anything) in other programming languages, denoting that a variable is empty or marking default parameters. However, the None object is a singleton object rather than a null pointer. It may also serve as a sentinel value. In addition, if a function returns without an explicit return value, or the return statement is omitted, the function implicitly returns the None object.

By default, the None object is considered a non-leaf node in the tree with arity 0, i.e., a non-leaf node that has no children. This is similar to the behavior of an empty tuple. While flattening a tree, it will remain in the tree structure definition rather than appear in the leaves list.

>>> tree = {'b': (2, [3, 4]), 'a': 1, 'c': None, 'd': 5}
>>> optree.tree_flatten(tree)
([1, 2, 3, 4, 5], PyTreeSpec({'a': *, 'b': (*, [*, *]), 'c': None, 'd': *}))
>>> optree.tree_flatten(tree, none_is_leaf=True)
([1, 2, 3, 4, None, 5], PyTreeSpec({'a': *, 'b': (*, [*, *]), 'c': *, 'd': *}, NoneIsLeaf))
>>> optree.tree_flatten(1)
([1], PyTreeSpec(*))
>>> optree.tree_flatten(None)
([], PyTreeSpec(None))
>>> optree.tree_flatten(None, none_is_leaf=True)
([None], PyTreeSpec(*, NoneIsLeaf))

OpTree provides a keyword argument none_is_leaf to determine whether to treat the None object as a leaf, like other opaque objects. If none_is_leaf=True, the None object is placed in the leaves list. Otherwise, the None object remains in the tree specification (structure).

>>> import torch

>>> linear = torch.nn.Linear(in_features=3, out_features=2, bias=False)
>>> linear._parameters  # a container has None
OrderedDict([
    ('weight', Parameter containing:
               tensor([[-0.6677,  0.5209,  0.3295],
                       [-0.4876, -0.3142,  0.1785]], requires_grad=True)),
    ('bias', None)
])

>>> optree.tree_map(torch.zeros_like, linear._parameters)
OrderedDict([
    ('weight', tensor([[0., 0., 0.],
                       [0., 0., 0.]])),
    ('bias', None)
])

>>> optree.tree_map(torch.zeros_like, linear._parameters, none_is_leaf=True)
Traceback (most recent call last):
    ...
TypeError: zeros_like(): argument 'input' (position 1) must be Tensor, not NoneType

>>> optree.tree_map(lambda t: torch.zeros_like(t) if t is not None else 0, linear._parameters, none_is_leaf=True)
OrderedDict([
    ('weight', tensor([[0., 0., 0.],
                       [0., 0., 0.]])),
    ('bias', 0)
])

Key Ordering for Dictionaries

The built-in Python dictionary (i.e., builtins.dict) is a mapping that holds keys and values; the leaves of a dictionary are its values. Although the built-in dictionary has been insertion-ordered since Python 3.6 (PEP 468), the dictionary equality operator (==) does not check key ordering. To ensure that "equal dicts" imply "equal ordering of leaves", the values of a dictionary are sorted by its keys during flattening. This behavior also applies to collections.defaultdict.

>>> optree.tree_flatten({'a': [1, 2], 'b': [3]})
([1, 2, 3], PyTreeSpec({'a': [*, *], 'b': [*]}))
>>> optree.tree_flatten({'b': [3], 'a': [1, 2]})
([1, 2, 3], PyTreeSpec({'a': [*, *], 'b': [*]}))

If users want to keep the values in the insertion order during pytree traversal, they should use collections.OrderedDict, which takes the order of keys into consideration:

>>> from collections import OrderedDict
>>> OrderedDict([('a', [1, 2]), ('b', [3])]) == OrderedDict([('b', [3]), ('a', [1, 2])])
False
>>> optree.tree_flatten(OrderedDict([('a', [1, 2]), ('b', [3])]))
([1, 2, 3], PyTreeSpec(OrderedDict([('a', [*, *]), ('b', [*])])))
>>> optree.tree_flatten(OrderedDict([('b', [3]), ('a', [1, 2])]))
([3, 1, 2], PyTreeSpec(OrderedDict([('b', [*]), ('a', [*, *])])))

Since OpTree v0.9.0, the key order of the reconstructed output dictionaries from tree_unflatten is guaranteed to be consistent with the key order of the input dictionaries in tree_flatten.

>>> leaves, treespec = optree.tree_flatten({'b': [3], 'a': [1, 2]})
>>> leaves, treespec
([1, 2, 3], PyTreeSpec({'a': [*, *], 'b': [*]}))
>>> optree.tree_unflatten(treespec, leaves)
{'b': [3], 'a': [1, 2]}
>>> optree.tree_map(lambda x: x, {'b': [3], 'a': [1, 2]})
{'b': [3], 'a': [1, 2]}
>>> optree.tree_map(lambda x: x + 1, {'b': [3], 'a': [1, 2]})
{'b': [4], 'a': [2, 3]}

This property is also preserved during serialization/deserialization.

>>> import pickle
>>> leaves, treespec = optree.tree_flatten({'b': [3], 'a': [1, 2]})
>>> leaves, treespec
([1, 2, 3], PyTreeSpec({'a': [*, *], 'b': [*]}))
>>> restored_treespec = pickle.loads(pickle.dumps(treespec))
>>> optree.tree_unflatten(treespec, leaves)
{'b': [3], 'a': [1, 2]}
>>> optree.tree_unflatten(restored_treespec, leaves)
{'b': [3], 'a': [1, 2]}

Note that there is no requirement that the dict keys be comparable (sortable); a dictionary can contain keys of multiple types. The keys are sorted in ascending order with key=lambda k: k first if possible; otherwise, sorting falls back to key=lambda k: (f'{k.__class__.__module__}.{k.__class__.__qualname__}', k). This handles most cases.

>>> sorted({1: 2, 1.5: 1}.keys())
[1, 1.5]
>>> sorted({'a': 3, 1: 2, 1.5: 1}.keys())
Traceback (most recent call last):
    ...
TypeError: '<' not supported between instances of 'int' and 'str'
>>> sorted({'a': 3, 1: 2, 1.5: 1}.keys(), key=lambda k: (f'{k.__class__.__module__}.{k.__class__.__qualname__}', k))
[1.5, 1, 'a']

Benchmark

We benchmark the performance of:

  • tree flatten
  • tree unflatten
  • tree copy (i.e., unflatten(flatten(...)))
  • tree map

compared with the following libraries: JAX XLA (jax.tree_util), PyTorch (torch.utils._pytree), and DM-Tree (see the sketch below the summary table):

Average Time Cost (↓) OpTree (v0.9.0) JAX XLA (v0.4.6) PyTorch (v2.0.0) DM-Tree (v0.1.8)
Tree Flatten x1.00 2.33 22.05 1.12
Tree UnFlatten x1.00 2.69 4.28 16.23
Tree Flatten with Path x1.00 16.16 Not Supported 27.59
Tree Copy x1.00 2.56 9.97 11.02
Tree Map x1.00 2.56 9.58 10.62
Tree Map (nargs) x1.00 2.89 Not Supported 31.33
Tree Map with Path x1.00 7.23 Not Supported 19.66
Tree Map with Path (nargs) x1.00 6.56 Not Supported 29.61
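
The four basic operations above map onto OpTree calls as follows (a minimal sketch on a toy pytree; the actual benchmark is driven by benchmark.py):

import optree

tree = {'layer1': {'weight': 1.0, 'bias': 0.0}, 'layer2': {'weight': 2.0, 'bias': 0.5}}

leaves, treespec = optree.tree_flatten(tree)        # tree flatten
rebuilt = optree.tree_unflatten(treespec, leaves)   # tree unflatten
copied_leaves, copied_treespec = optree.tree_flatten(tree)
copied = optree.tree_unflatten(copied_treespec, copied_leaves)  # tree copy: unflatten(flatten(...))
mapped = optree.tree_map(lambda x: x * 2, tree)     # tree map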

All results are reported on a workstation with an AMD Ryzen 9 5950X CPU @ 4.45GHz in an isolated virtual environment with Python 3.10.9. Run with the following commands:

conda create --name optree-benchmark anaconda::python=3.10 --yes --no-default-packages
conda activate optree-benchmark
python3 -m pip install --editable '.[benchmark]' --extra-index-url https://download.pytorch.org/whl/cpu
python3 benchmark.py --number=10000 --repeat=5

The test inputs are nested containers (i.e., pytrees) extracted from torch.nn.Module objects. They are:

import torch.nn as nn

tiny_mlp = nn.Sequential(
    nn.Linear(1, 1, bias=True),
    nn.BatchNorm1d(1, affine=True, track_running_stats=True),
    nn.ReLU(),
    nn.Linear(1, 1, bias=False),
    nn.Sigmoid(),
)

and AlexNet, ResNet18, ResNet34, ResNet50, ResNet101, ResNet152, VisionTransformerH14 (ViT-H/14), and SwinTransformerB (Swin-B) from torchvision. Please refer to benchmark.py for more details.

Tree Flatten

Module Nodes OpTree (μs) JAX XLA (μs) PyTorch (μs) DM-Tree (μs) Speedup (J / O) Speedup (P / O) Speedup (D / O)
TinyMLP 53 29.70 71.06 583.66 31.32 2.39 19.65 1.05
AlexNet 188 103.92 262.56 2304.36 119.61 2.53 22.17 1.15
ResNet18 698 368.06 852.69 8440.31 420.43 2.32 22.93 1.14
ResNet34 1242 644.96 1461.55 14498.81 712.81 2.27 22.48 1.11
ResNet50 1702 919.95 2080.58 20995.96 1006.42 2.26 22.82 1.09
ResNet101 3317 1806.36 3996.90 40314.12 1955.48 2.21 22.32 1.08
ResNet152 4932 2656.92 5812.38 57775.53 2826.92 2.19 21.75 1.06
ViT-H/14 3420 1863.50 4418.24 41334.64 2128.71 2.37 22.18 1.14
Swin-B 2881 1631.06 3944.13 36131.54 2032.77 2.42 22.15 1.25
Average 2.33 22.05 1.12

Tree UnFlatten

Module Nodes OpTree (μs) JAX XLA (μs) PyTorch (μs) DM-Tree (μs) Speedup (J / O) Speedup (P / O) Speedup (D / O)
TinyMLP 53 55.13 152.07 231.94 940.11 2.76 4.21 17.05
AlexNet 188 226.29 678.29 972.90 4195.04 3.00 4.30 18.54
ResNet18 698 766.54 1953.26 3137.86 12049.88 2.55 4.09 15.72
ResNet34 1242 1309.22 3526.12 5759.16 20966.75 2.69 4.40 16.01
ResNet50 1702 1914.96 5002.83 8369.43 29597.10 2.61 4.37 15.46
ResNet101 3317 3672.61 9633.29 15683.16 57240.20 2.62 4.27 15.59
ResNet152 4932 5407.58 13970.88 23074.68 82072.54 2.58 4.27 15.18
ViT-H/14 3420 4013.18 11146.31 17633.07 66723.58 2.78 4.39 16.63
Swin-B 2881 3595.34 9505.31 15054.88 57310.03 2.64 4.19 15.94
Average 2.69 4.28 16.23

Tree Flatten with Path

Module Nodes OpTree (μs) JAX XLA (μs) PyTorch (μs) DM-Tree (μs) Speedup (J / O) Speedup (P / O) Speedup (D / O)
TinyMLP 53 36.49 543.67 N/A 919.13 14.90 N/A 25.19
AlexNet 188 115.44 2185.21 N/A 3752.11 18.93 N/A 32.50
ResNet18 698 431.84 7106.55 N/A 12286.70 16.46 N/A 28.45
ResNet34 1242 845.61 13431.99 N/A 22860.48 15.88 N/A 27.03
ResNet50 1702 1166.27 18426.52 N/A 31225.05 15.80 N/A 26.77
ResNet101 3317 2312.77 34770.49 N/A 59346.86 15.03 N/A 25.66
ResNet152 4932 3304.74 50557.25 N/A 85847.91 15.30 N/A 25.98
ViT-H/14 3420 2235.25 37473.53 N/A 64105.24 16.76 N/A 28.68
Swin-B 2881 1970.25 32205.83 N/A 55177.50 16.35 N/A 28.01
Average 16.16 N/A 27.59

Tree Copy

Module Nodes OpTree (μs) JAX XLA (μs) PyTorch (μs) DM-Tree (μs) Speedup (J / O) Speedup (P / O) Speedup (D / O)
TinyMLP 53 89.81 232.26 845.20 981.48 2.59 9.41 10.93
AlexNet 188 334.58 959.32 3360.46 4316.05 2.87 10.04 12.90
ResNet18 698 1128.11 2840.71 11471.07 12297.07 2.52 10.17 10.90
ResNet34 1242 2160.57 5333.10 20563.06 21901.91 2.47 9.52 10.14
ResNet50 1702 2746.84 6823.88 29705.99 28927.88 2.48 10.81 10.53
ResNet101 3317 5762.05 13481.45 56968.78 60115.93 2.34 9.89 10.43
ResNet152 4932 8151.21 20805.61 81024.06 84079.57 2.55 9.94 10.31
ViT-H/14 3420 5963.61 15665.91 59813.52 68377.82 2.63 10.03 11.47
Swin-B 2881 5401.59 14255.33 53361.77 62317.07 2.64 9.88 11.54
Average 2.56 9.97 11.02

Tree Map

Module Nodes OpTree (μs) JAX XLA (μs) PyTorch (μs) DM-Tree (μs) Speedup (J / O) Speedup (P / O) Speedup (D / O)
TinyMLP 53 95.13 243.86 867.34 1026.99 2.56 9.12 10.80
AlexNet 188 348.44 987.57 3398.32 4354.81 2.83 9.75 12.50
ResNet18 698 1190.62 2982.66 11719.94 12559.01 2.51 9.84 10.55
ResNet34 1242 2205.87 5417.60 20935.72 22308.51 2.46 9.49 10.11
ResNet50 1702 3128.48 7579.55 30372.71 31638.67 2.42 9.71 10.11
ResNet101 3317 6173.05 14846.57 59167.85 60245.42 2.41 9.58 9.76
ResNet152 4932 8641.22 22000.74 84018.65 86182.21 2.55 9.72 9.97
ViT-H/14 3420 6211.79 17077.49 59790.25 69763.86 2.75 9.63 11.23
Swin-B 2881 5673.66 14339.69 53309.17 59764.61 2.53 9.40 10.53
Average 2.56 9.58 10.62

Tree Map (nargs)

Module Nodes OpTree (μs) JAX XLA (μs) PyTorch (μs) DM-Tree (μs) Speedup (J / O) Speedup (P / O) Speedup (D / O)
TinyMLP 53 137.06 389.96 N/A 3908.77 2.85 N/A 28.52
AlexNet 188 467.24 1496.96 N/A 15395.13 3.20 N/A 32.95
ResNet18 698 1603.79 4534.01 N/A 50323.76 2.83 N/A 31.38
ResNet34 1242 2907.64 8435.33 N/A 90389.23 2.90 N/A 31.09
ResNet50 1702 4183.77 11382.51 N/A 121777.01 2.72 N/A 29.11
ResNet101 3317 7721.13 22247.85 N/A 238755.17 2.88 N/A 30.92
ResNet152 4932 11508.05 31429.39 N/A 360257.74 2.73 N/A 31.30
ViT-H/14 3420 8294.20 24524.86 N/A 270514.87 2.96 N/A 32.61
Swin-B 2881 7074.62 20854.80 N/A 241120.41 2.95 N/A 34.08
Average 2.89 N/A 31.33

Tree Map with Path

Module Nodes OpTree (μs) JAX XLA (μs) PyTorch (μs) DM-Tree (μs) Speedup (J / O) Speedup (P / O) Speedup (D / O)
TinyMLP 53 109.82 778.30 N/A 2186.40 7.09 N/A 19.91
AlexNet 188 365.16 2939.36 N/A 8355.37 8.05 N/A 22.88
ResNet18 698 1308.26 9529.58 N/A 25758.24 7.28 N/A 19.69
ResNet34 1242 2527.21 18084.89 N/A 45942.32 7.16 N/A 18.18
ResNet50 1702 3226.03 22935.53 N/A 61275.34 7.11 N/A 18.99
ResNet101 3317 6663.52 46878.89 N/A 126642.14 7.04 N/A 19.01
ResNet152 4932 9378.19 66136.44 N/A 176981.01 7.05 N/A 18.87
ViT-H/14 3420 7033.69 50418.37 N/A 142508.11 7.17 N/A 20.26
Swin-B 2881 6078.15 43173.22 N/A 116612.71 7.10 N/A 19.19
Average 7.23 N/A 19.66

Tree Map with Path (nargs)

Module Nodes OpTree (μs) JAX XLA (μs) PyTorch (μs) DM-Tree (μs) Speedup (J / O) Speedup (P / O) Speedup (D / O)
TinyMLP 53 146.05 917.00 N/A 3940.61 6.28 N/A 26.98
AlexNet 188 489.27 3560.76 N/A 15434.71 7.28 N/A 31.55
ResNet18 698 1712.79 11171.44 N/A 50219.86 6.52 N/A 29.32
ResNet34 1242 3112.83 21024.58 N/A 95505.71 6.75 N/A 30.68
ResNet50 1702 4220.70 26600.82 N/A 121897.57 6.30 N/A 28.88
ResNet101 3317 8631.34 54372.37 N/A 236555.54 6.30 N/A 27.41
ResNet152 4932 12710.49 77643.13 N/A 353600.32 6.11 N/A 27.82
ViT-H/14 3420 8753.09 58712.71 N/A 286365.36 6.71 N/A 32.72
Swin-B 2881 7359.29 50112.23 N/A 228866.66 6.81 N/A 31.10
Average 6.56 N/A 29.61

Changelog

See CHANGELOG.md.


License

OpTree is released under the Apache License 2.0.

OpTree is heavily based on JAX's implementation of the PyTree utility, with deep refactoring and several improvements. The original licenses can be found at JAX's Apache License 2.0 and TensorFlow's Apache License 2.0.


