


DLite

A lightweight data-centric framework for semantic interoperability



DLite is a lightweight interoperability framework for working with and sharing scientific data.

About DLite

DLite is a C implementation of the SINTEF Open Framework and Tools (SOFT), a set of concepts and tools for efficiently describing and working with scientific data.

All data in DLite is represented by an Instance, which is built on a simple data model. An Instance is identified by a unique UUID and has a set of named dimensions and properties. It is described by its Metadata. In the Metadata, each dimension is given a name and an optional description, and each property is given a name, a type and an optional shape, unit and description. The shape of a property refers to the named dimensions.

When an Instance is instantiated, you must supply a value for each named dimension. The shapes of the properties are then set accordingly, which ensures that they are internally consistent.

A Metadata is also an Instance, and hence described by its meta-metadata. By default, DLite defines four levels of metadata: instance, metadata, metadata schema and basic metadata schema. The basic metadata schema describes itself, so no further meta levels are needed. The idea is that if two different systems describe their data models in terms of the basic metadata schema, they can easily be made semantically interoperable.

Figure: The datamodel of DLite.
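
To make these metadata levels concrete, here is a minimal Python sketch that walks up the hierarchy of the Person metadata defined in the example below. It assumes that the Python bindings expose Instance.from_url() and the meta and uri attributes; check the installed API before relying on it.

import dlite

# Minimal sketch, assuming dlite.Instance.from_url() and the .meta/.uri
# attributes are available, and that Person.json (defined in the example
# below) is in the current directory.
person_meta = dlite.Instance.from_url('json://Person.json')

print(person_meta.uri)            # the Person metadata itself
print(person_meta.meta.uri)       # its metadata schema (meta-metadata)
print(person_meta.meta.meta.uri)  # the basic metadata schema, which describes itself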

An alternative and more flexible way to enable interoperability is to use a common ontology. DLite provides a specialised Instance called Collection. A collection is essentially a container holding a set of Instances and relations between them. It can also relate an Instance, or even a dimension or property of an Instance, to a concept in an ontology. DLite can transparently map an Instance whose Metadata corresponds to a concept in one ontology to an Instance whose Metadata corresponds to a concept in another ontology. Such mappings can easily be registered (in C or Python) and reused, providing a very powerful system for achieving interoperability. A small sketch of how a collection might be used is shown below.
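
In this sketch, the method names (add(), add_relation(), get()) and the ontology IRI are assumptions for illustration only; consult the DLite documentation for the exact Collection API.

import dlite

# Sketch only: the Collection method names below are assumptions, not a
# verified API reference.
coll = dlite.Collection()

# Any instance can be added under a human-readable label; here we reuse the
# Person metadata from the example below, which is itself an instance.
person_meta = dlite.Instance.from_url('json://Person.json')
coll.add('Person', person_meta)

# Relate the labelled instance to a concept in an ontology
# (the IRI is a made-up placeholder).
coll.add_relation('Person', 'rdf:type', 'http://example.com/onto#PersonConcept')

# Retrieve the instance by its label.
inst = coll.get('Person')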

DLite also provides a common and extensible API for loading/storing Instances from/to different storages. New storage plugins can be written in C or Python. The same instance can be stored in or loaded from any storage for which a driver exists, as sketched below.
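
In this sketch, metadata (which is itself an instance) is saved with two different drivers and loaded back using the driver://location?options#fragment URL syntax described in the vocabulary below. Instance.from_url() and the uuid attribute are assumptions about the Python bindings, and the yaml driver requires the PyYAML-based storage plugin listed under the dependencies.

import dlite

# Metadata is also an instance, so the Person metadata from the example
# below is used here to demonstrate the storage API (assumes Person.json
# exists in the current directory).
inst = dlite.Instance.from_url('json://Person.json')

inst.save('json://person_copy.json?mode=w')   # store with the json driver
inst.save('yaml://person_copy.yaml?mode=w')   # same instance, yaml driver

# Load it back; the URL fragment identifies the wanted instance by UUID.
inst2 = dlite.Instance.from_url('json://person_copy.json#' + inst.uuid)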

See doc/concepts.md for more details.

DLite is licensed under the MIT license.

Example

Let's say that you have the following Python class

class Person:
    def __init__(self, name, age, skills):
        self.name = name
        self.age = age
        self.skills = skills

that you want to describe semantically. We do that by defining the following metadata (using JSON), identifying the Python attributes with dlite properties. Here we define name to be a string, age to be a float and skills to be an array of N strings, where N is the name of a dimension. The metadata uniquely identifies itself with the "name", "version" and "namespace" fields, while "meta" refers to the metadata schema (meta-metadata) that this metadata is described by. Finally, human descriptions of the metadata itself, its dimensions and its properties are provided in the "description" fields.

{
  "name": "Person",
  "version": "0.1",
  "namespace": "http://onto-ns.com/meta",
  "meta": "http://onto-ns.com/meta/0.3/EntitySchema",
  "description": "A person.",
  "dimensions": [
    {
      "name": "N",
      "description": "Number of skills."
    }
  ],
  "properties": [
    {
      "name": "name",
      "type": "string",
      "description": "Full name."
    },
    {
      "name": "age",
      "type": "float",
      "unit": "year",
      "description": "Age of person."
    },
    {
      "name": "skills",
      "type": "string",
      "dims": ["N"],
      "description": "List of skills."
    }
  ]
}

We save the metadata in the file "Person.json". Back in Python, we can now make a dlite-aware subclass of Person, instantiate it and serialise it to a storage:

import dlite

# Create a dlite-aware subclass of Person
DLitePerson = dlite.classfactory(Person, url='json://Person.json')

# Instantiate
person = DLitePerson('Sherlock Holmes', 34., ['observing', 'chemistry',
    'violin', 'boxing'])

# Write to storage (here a json file)
person.dlite_inst.save('json://homes.json?mode=w')

To access this new instance from C, you can first generate a header file from the metadata:

$ dlite-codegen -f c-header -o person.h Person.json

and then include it in your C program:

// homes.c -- sample program that loads instance from homes.json and prints it
#include <stdio.h>
#include <dlite.h>
#include "person.h"  // header generated with dlite-codegen

int main()
{
  /* URL of instance to load using the json driver.  The storage is
     here the file 'homes.json' and the instance we want to load in
     this file is identified with the UUID following the hash (#)
     sign. */
  char *url = "json://homes.json#315088f2-6ebd-4c53-b825-7a6ae5c9659b";

  Person *person = (Person *)dlite_instance_load_url(url);

  int i;
  printf("name:  %s\n", person->name);
  printf("age:   %g\n", person->age);
  printf("skills:\n");
  for (i=0; i<person->N; i++)
    printf("  - %s\n", person->skills[i]);

  return 0;
}

Now run the Python script above. It will create a homes.json file containing the serialised instance. Take the UUID of the instance from homes.json and update the url variable in homes.c accordingly.
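
The UUID can also be printed directly from the Python session that created the instance, assuming that the wrapped dlite instance exposes a uuid attribute:

# Sketch: assumes `dlite_inst` exposes a `uuid` attribute.
print(person.dlite_inst.uuid)   # e.g. 315088f2-6ebd-4c53-b825-7a6ae5c9659b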

Since we are using dlite_instance_load_url() to load the instance, you must link against dlite when compiling this program. Assuming you are using Linux and dlite is installed in $HOME/.local, compiling with gcc would look like:

$ gcc homes.c -o homes -I$HOME/.local/include/dlite -L$HOME/.local/lib -ldlite -ldlite-utils

Or, if you are using the development environment, you can compile using:

$ gcc -I/tmp/dlite-install/include/dlite -L/tmp/dlite-install/lib -o homes homes.c -ldlite -ldlite-utils

Finally, you can run the program with

$ DLITE_STORAGES=*.json ./homes
name:  Sherlock Holmes
age:   34
skills:
  - observing
  - chemistry
  - violin
  - boxing

Note that in this case we have to define the environment variable DLITE_STORAGES in order to let dlite find the metadata we stored in 'Person.json'. There are ways to avoid this, e.g. by hardcoding the metadata in C using dlite-codegen -f c-source or by explicitly loading 'Person.json' before 'homes.json' in the C program.

This was just a brief example; there is much more to dlite. Since the documentation is not yet complete, the best source is the code itself, including the tests and examples.

Main features

See doc/features.md for a more detailed list.

  • Enables semantic interoperability via simple formalised metadata and data
  • Metadata can be linked to or generated from ontologies
  • Code generation for simple integration in existing code bases
  • Plugin API for data storages (json, hdf5, rdf, yaml, postgresql, blob, csv...)
  • Plugin API for mapping between metadata
  • Bindings to C, Python and Fortran

Installing DLite

Installing with pip

If you are using Python, the easiest way to install DLite is with pip:

pip install DLite-Python

Note that wheels are currently only available for Python 3.7, 3.8, 3.9 and 3.10 on Linux and Windows.

Docker image

A Docker image is available at https://github.com/SINTEF/dlite/packages.

Compile from sources

The sources can be cloned from GitHub:

git clone https://github.com/SINTEF/dlite.git

Dependencies

Runtime dependencies

  • HDF5, optional (needed by HDF5 storage plugin)
  • librdf, optional (needed by RDF (Redland) storage plugin)
  • Python 3, optional (needed by Python bindings and some plugins)
    • NumPy, required if Python is enabled
    • PyYAML, optional (used for generic YAML storage plugin)
    • psycopg2, optional (used for generic PostgreSQL storage plugin)
Note that in some cases a GSSAPI error is raised when psycopg2 has been installed with pip as psycopg2-binary. This is solved by installing psycopg2 from source, as described in its documentation.
    • pandas, optional (used for csv storage plugin)

Build dependencies

  • cmake, required for building
  • hdf5 development libraries, optional (needed by HDF5 storage plugin)
  • librdf development libraries, optional (needed by librdf storage plugin)
  • Python 3 development libraries, optional (needed by Python bindings)
  • NumPy development libraries, optional (needed by Python bindings)
  • SWIG v3, optional (needed for building the Python bindings)
  • Doxygen, optional, used for documentation generation
  • valgrind, optional, used for memory checking (Linux only)
  • cppcheck, optional, used for static code analysis

Compiling

Build and install with Python

Provided you have a C compiler and Python correctly installed, you should be able to build and install dlite via the python/setup.py script:

cd python
python setup.py install

Build on Linux

Install dependencies (e.g. with apt-get install on Ubuntu or dnf install on Fedora)

Configure the build with:

mkdir build
cd build
cmake ..

Configuration options can be added to the cmake command. For example, you can change the installation directory by adding -DCMAKE_INSTALL_PREFIX=/path/to/new/install/dir. The default is ~/.local.

Alternatively, you can set configuration options interactively with `ccmake ..`.

If you use virtual environments for Python, you should activate your environment before running cmake and set CMAKE_INSTALL_PREFIX to the directory of the virtual environment. For example:

VIRTUAL_ENV=/path/to/virtual/env
source $VIRTUAL_ENV/bin/activate
cmake -DCMAKE_INSTALL_PREFIX=$VIRTUAL_ENV ..

Build with:

make

To run the tests, do

ctest            # same as running `make test`
make memcheck    # runs all tests with memory checking (requires
                 # valgrind)

To generate code documentation, do

make doc         # direct your browser to build/doc/html/index.html

To install dlite locally, do

make install

Build with VS Code on Windows

See the documentation for detailed instructions on building with Visual Studio.

Quick start with VS Code and Remote Container

Using Visual Studio Code, it is possible to do development on the system defined in the Dockerfile.

  1. Download and install Visual Studio Code.
  2. Install the extension Remote Development.
  3. Clone dlite and initialize the git submodules: git submodule update --init.
  4. Open the dlite folder with VS Code.
  5. Start VS Code, run the Remote-Containers: Open Folder in Container... command from the Command Palette (F1) or quick actions Status bar item. This will build the container and restart VS Code in it. This may take some time the first time as the Docker image must be built. See Quick start: Open an existing folder in a container for more information and instructions.
  6. In the container terminal, perform the first build and tests with mkdir /workspace/build; cd /workspace/build; cmake ../dlite; make && make test.

Build documentation

If you have doxygen installed, the html documentation should be generated as a part of the build process. It can be browsed by opening the following file in your browser:

<build>/doc/html/index.html

where <build> is your build folder. To only build the documentation, you can do:

cd build
cmake --build . --target doc

If you have LaTeX and make installed, you can also build the LaTeX documentation with

cd build
cmake --build . --target latex

which will produce the file

<build>/doc/latex/refman.pdf

Setting up the environment

If dlite is installed in a non-default location, you may need to set the PATH, LD_LIBRARY_PATH, PYTHONPATH and DLITE_ROOT environment variables. See the documentation of environment variables for more details.

An example of how to use dlite is shown above. See also the examples in the examples directory for how to link to dlite from C and use of the Fortran bindings.

Short vocabulary

The following terms have a special meaning in dlite:

  • Basic metadata schema: Toplevel meta-metadata which describes itself.
  • Collection: A specialised instance that contains references to a set of instances and relations between them. Within a collection, instances are labelled. See also the SOFT5 nomenclature.
  • Data instance: A "leaf" instance that is not metadata.
  • Entity: May be any kind of instance, including data instances, metadata instances or meta-metadata instances. However, for historical reasons it is often used for "standard" metadata that are instances of meta-metadata "http://onto-ns.com/meta/0.3/EntitySchema".
  • Instance: The basic data object in DLite. All instances are described by their metadata, which are themselves instances. Instances are identified by a UUID.
  • Mapping: A function that maps one or more input instances to an output instance. They are an important mechanism for interoperability. Mappings are called translators in SOFT5.
  • Metadata: A special type of instance that describes other instances. All metadata are immutable and have a unique URI in addition to their UUID.
  • Meta-metadata: metadata that describes metadata.
  • Relation: A subject-predicate-object triplet. Relations are immutable.
  • Storage: A generic handle encapsulating actual storage backends.
  • Transaction: A not yet implemented feature that makes it possible to represent the evolution of the state of a piece of software as a series of immutable instances. See also the SOFT5 nomenclature.
  • uri: A uniform resource identifier (URI) is a generalisation of a URL, but follows the same syntax rules. In dlite, the term "uri" is used as a human-readable identifier for instances (optional for data instances) and has the form namespace/version/name.
  • url: A uniform resource locator (URL) is a reference to a web resource, like a file (on a given computer), a database entry, a web page, etc. In dlite, URLs refer to a storage or even a specific instance in a storage using the general syntax driver://location?options#fragment, where options and fragment are optional. If fragment is provided, it should be the uuid or uri of an instance.
  • uuid: A universally unique identifier (UUID) is commonly used to uniquely identify digital information. DLite uses the 36-character string representation of UUIDs to uniquely identify instances. The uuid is deterministically generated from the uri for instances that have a uri; otherwise it is randomly generated (see the sketch after this list).
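
To illustrate the last point, the sketch below computes the UUID corresponding to a uri. It assumes that the Python bindings expose a get_uuid() helper; both the function name and its behaviour when called without an argument are assumptions, so verify against the installed API.

import dlite

# Sketch: dlite.get_uuid() is assumed to return the UUID corresponding to a
# uri (deterministic), or a random UUID when no argument is given.
print(dlite.get_uuid('http://onto-ns.com/meta/0.1/Person'))  # same uri -> same uuid
print(dlite.get_uuid())                                      # random uuid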


License

DLite is licensed under the MIT license. However, it includes a few third-party source files with other permissive licenses. All of these should allow dynamic and static linking against open and proprietary codes. A full list of included licenses can be found in LICENSES.txt.

Acknowledgment

In addition to internal funding from SINTEF and NTNU, this work has been supported by several projects, including:

  • AMPERE (2015-2020) funded by Forskningsrådet and Norwegian industry partners.
  • FICAL (2015-2020) funded by Forskningsrådet and Norwegian industry partners.
  • SFI Manufacturing (2015-2023) funded by Forskningsrådet and Norwegian industry partners.
  • SFI PhysMet (2020-2028) funded by Forskningsrådet and Norwegian industry partners.
  • OntoTrans (2020-2024) that receives funding from the European Union’s Horizon 2020 Research and Innovation Programme, under Grant Agreement n. 862136.
  • OpenModel (2021-2025) that receives funding from the European Union’s Horizon 2020 Research and Innovation Programme, under Grant Agreement n. 953167.
  • DOME 4.0 (2021-2025) that receives funding from the European Union’s Horizon 2020 Research and Innovation Programme, under Grant Agreement n. 953163.
  • VIPCOAT (2021-2025) that receives funding from the European Union’s Horizon 2020 Research and Innovation Programme, under Grant Agreement n. 952903.

DLite is developed with the hope that it will be a delight to work with.



