

DLite

A lightweight data-centric framework for semantic interoperability



DLite is a lightweight interoperability framework for working with and sharing scientific data.

About DLite

DLite is a C implementation of the SINTEF Open Framework and Tools (SOFT), which is a set of concepts and tools for efficiently describing and working with scientific data.

All data in DLite is represented by an Instance, which is built on a simple data model. An Instance is identified by a unique UUID and has a set of named dimensions and properties. It is described by its Metadata. In the Metadata, each dimension is given a name and an optional description, and each property is given a name, a type and optionally a shape, a unit and a description. The shape of a property refers to the named dimensions.

When an Instance is instantiated, you must supply values for the named dimensions. The shapes of its properties are then set accordingly, which ensures that the property shapes are internally consistent.

A Metadata is also an Instance, and is hence described by its meta-metadata. By default, DLite defines four levels of metadata: instance, metadata, metadata schema and basic metadata schema. The basic metadata schema describes itself, so no further meta levels are needed. The idea is that if two different systems describe their data models in terms of the basic metadata schema, they can easily be made semantically interoperable.

Figure: The data model of DLite.
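
Since metadata are themselves instances, the meta levels can be inspected from Python by following the meta attribute. Below is a minimal sketch, assuming the Python bindings expose meta and uri attributes on instances (these attribute names are assumptions) and reusing the person instance from the Example section further down:

# `person` is assumed to be a data instance, e.g. person.dlite_inst
# from the Example section below.
print(person.meta.uri)            # the metadata describing the instance
print(person.meta.meta.uri)       # its meta-metadata (the entity schema)
print(person.meta.meta.meta.uri)  # the basic metadata schema, which describes itself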

An alternative and more flexible way to enable interoperability is to use a common ontology. DLite provides a specialised Instance called Collection. A collection is essentially a container holding a set of Instances and the relations between them. It can also relate an Instance, or even a single dimension or property of an instance, to a concept in an ontology. DLite can transparently map an Instance whose Metadata corresponds to a concept in one ontology to an Instance whose Metadata corresponds to a concept in another ontology. Such mappings can easily be registered (in C or Python) and reused, providing a very powerful system for achieving interoperability.
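
For illustration, here is a minimal Python sketch of a collection; the add(), add_relation() and get() method names are assumptions about the Python bindings, and person1/person2 stand for previously created instances:

import dlite

# A collection is itself a specialised instance acting as a container
coll = dlite.Collection()

# Add existing instances under human-readable labels
# (person1 and person2 are assumed to be instances created elsewhere)
coll.add("person1", person1)
coll.add("person2", person2)

# Relate the labelled instances with a subject-predicate-object triple
coll.add_relation("person1", "knows", "person2")

# Retrieve an instance back by its label
inst = coll.get("person1")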

DLite also provides a common and extensible API for loading/storing Instances from/to different storages. New storage plugins can be written in C or Python.
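
As a sketch of what this looks like from Python: storages are addressed by URLs of the form driver://location?options, so the same instance can be written to different storages simply by changing the driver. The save() call mirrors the Example section below; the yaml driver is one of the plugins listed under Main features, while from_url() and the uuid attribute are assumptions about the Python bindings (the corresponding C call, dlite_instance_load_url(), appears in the Example section):

import dlite

# `inst` is assumed to be an existing instance, e.g. person.dlite_inst
# from the Example section below.
inst.save('json://persons.json?mode=w')   # JSON storage plugin
inst.save('yaml://persons.yaml?mode=w')   # YAML storage plugin

# Load it back again (from_url() and uuid are assumed API names)
inst2 = dlite.Instance.from_url('json://persons.json#' + inst.uuid)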

See doc/concepts.md for more details.

DLite is licensed under the MIT license.

Example

Let's say that you have the following Python class

class Person:
    def __init__(self, name, age, skills):
        self.name = name
        self.age = age
        self.skills = skills

that you want to describe semantically. We do that by defining the following metadata (in JSON), identifying the Python attributes with DLite properties. Here we define name to be a string, age to be a float and skills to be an array of N strings, where N is the name of a dimension. The metadata uniquely identifies itself via the "name", "version" and "namespace" fields, while "meta" refers to the metadata schema (meta-metadata) that this metadata is described by. Finally, human descriptions of the metadata itself, its dimensions and its properties are provided in the "description" fields.

{
  "name": "Person",
  "version": "0.1",
  "namespace": "http://onto-ns.com/meta",
  "meta": "http://onto-ns.com/meta/0.3/EntitySchema",
  "description": "A person.",
  "dimensions": [
    {
      "name": "N",
      "description": "Number of skills."
    }
  ],
  "properties": [
    {
      "name": "name",
      "type": "string",
      "description": "Full name."
    },
    {
      "name": "age",
      "type": "float",
      "unit": "year",
      "description": "Age of person."
    },
    {
      "name": "skills",
      "type": "string",
      "dims": ["N"],
      "description": "List of skills."
    }
  ]
}

We save the metadata in the file "Person.json". Back in Python, we can now make a dlite-aware subclass of Person, instantiate it and serialise it to a storage:

import dlite

# Create a dlite-aware subclass of Person
DLitePerson = dlite.classfactory(Person, url='json://Person.json')

# Instantiate
person = DLitePerson('Sherlock Holmes', 34., ['observing', 'chemistry',
    'violin', 'boxing'])

# Write to storage (here a json file)
person.dlite_inst.save('json://homes.json?mode=w')
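
The dlite-aware instance still behaves like an ordinary Person object, while the underlying DLite instance is available through its dlite_inst attribute. A small sketch (the uuid attribute used below is an assumption about the Python bindings) that prints the identifier needed by the C program further down:

# The ordinary Python attributes still work
print(person.name, person.age)

# The underlying DLite instance; its uuid (assumed attribute name) is the
# identifier to put in the url variable of the C program below
print(person.dlite_inst.uuid)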

To access this new instance from C, you can first generate a header file from the metadata

$ dlite-codegen -f c-header -o person.h Person.json

and then include it in your C program:

// homes.c -- sample program that loads instance from homes.json and prints it
#include <stdio.h>
#include <dlite.h>
#include "person.h"  // header generated with dlite-codegen

int main()
{
  /* URL of instance to load using the json driver.  The storage is
     here the file 'homes.json' and the instance we want to load in
     this file is identified with the UUID following the hash (#)
     sign. */
  char *url = "json://homes.json#315088f2-6ebd-4c53-b825-7a6ae5c9659b";

  Person *person = (Person *)dlite_instance_load_url(url);

  int i;
  printf("name:  %s\n", person->name);
  printf("age:   %g\n", person->age);
  printf("skills:\n");
  for (i=0; i<person->N; i++)
    printf("  - %s\n", person->skills[i]);

  return 0;
}

Running the Python code above creates the file homes.json, which contains the serialised instance. Copy the UUID of the instance from homes.json and update the url variable in homes.c accordingly.

Since we are using dlite_instance_load_url() to load the instance, you must link to dlite when compiling this program. Assuming you are using Linux and dlite is installed in $HOME/.local, compiling with gcc would look like:

$ gcc homes.c -o homes -I$HOME/.local/include/dlite -L$HOME/.local/lib -ldlite -ldlite-utils

Or, if you are using the development environment, you can compile with:

$ gcc -I/tmp/dlite-install/include/dlite -L/tmp/dlite-install/lib -o homes homes.c -ldlite -ldlite-utils

Finally you can run the program with

$ DLITE_STORAGES=*.json ./homes
name:  Sherlock Holmes
age:   34
skills:
  - observing
  - chemistry
  - violin
  - boxing

Note that in this case we have to define the environment variable DLITE_STORAGES in order to let dlite find the metadata we stored in 'Person.json'. There are ways to avoid this, e.g. by hardcoding the metadata in C using dlite-codegen -f c-source, or by explicitly loading 'Person.json' before 'homes.json' in the C program.
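
On the Python side, the same effect can presumably be obtained programmatically; the sketch below assumes the bindings expose a dlite.storage_path object that can be appended to (the name is an assumption):

import dlite

# Instead of setting DLITE_STORAGES, extend the search path that dlite
# uses to look up metadata such as Person.json (assumed API)
dlite.storage_path.append("*.json")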

This was just a brief example. There is much more to dlite. Since the documentation is still not complete, the best source is the code itself, including the tests and examples.

Main features

See doc/features.md for a more detailed list.

  • Enables semantic interoperability via simple formalised metadata and data
  • Metadata can be linked to or generated from ontologies
  • Code generation for simple integration in existing code bases
  • Plugin API for data storages (json, hdf5, rdf, yaml, postgresql, blob, csv...)
  • Plugin API for mapping between metadata
  • Bindings to C, Python and Fortran

Installing DLite

Installing with pip

If you are using Python, the easiest way to install DLite is with pip:

pip install DLite-Python

Note that currently only Linux versions for Python 3.7, 3.8, 3.9 and 3.10 are available, but Windows versions will soon be available.

Docker image

A Docker image is available at https://github.com/SINTEF/dlite/packages.

Compile from sources

The sources can be cloned from GitHub:

git clone https://github.com/SINTEF/dlite.git

Dependencies

Runtime dependencies

  • HDF5, optional (needed by HDF5 storage plugin)
  • librdf, optional (needed by RDF (Redland) storage plugin)
  • Python 3, optional (needed by Python bindings and some plugins)
    • NumPy, required if Python is enabled
    • PyYAML, optional (used for generic YAML storage plugin)
    • psycopg2, optional (used for generic PostgreSQL storage plugin)
      Note that in some cases a GSSAPI error is raised when psycopg2 is installed with pip from the psycopg2-binary wheel. This is solved by installing psycopg2 from source, as described in its documentation.
    • pandas, optional (used for csv storage plugin)

Build dependencies

  • cmake, required for building
  • hdf5 development libraries, optional (needed by HDF5 storage plugin)
  • librdf development libraries, optional (needed by librdf storage plugin)
  • Python 3 development libraries, optional (needed by Python bindings)
  • NumPy development libraries, optional (needed by Python bindings)
  • SWIG v3, optional (needed for building Python bindings)
  • Doxygen, optional, used for documentation generation
  • valgrind, optional, used for memory checking (Linux only)
  • cppcheck, optional, used for static code analysis

Compiling

Build and install with Python

Given that you have a C compiler and Python correctly installed, you should be able to build and install dlite via the python/setup.py script:

cd python
python setup.py install

Build on Linux

Install dependencies (e.g. with apt-get install on Ubuntu or dnf install on Fedora)

Configure the build with:

mkdir build
cd build
cmake ..

Configuration options can be added to the cmake command. For example, you can change the installation directory by adding -DCMAKE_INSTALL_PREFIX=/path/to/new/install/dir. The default is ~/.local.

Alternatively, you can set the configuration options interactively with ccmake ..

If you use virtual environments for Python, you should activate your environment before running cmake and set CMAKE_INSTALL_PREFIX to the directory of the virtual environment. For example:

VIRTUAL_ENV=/path/to/virtual/env
source $VIRTUAL_ENV/bin/activate
cmake -DCMAKE_INSTALL_PREFIX=$VIRTUAL_ENV ..

Build with:

make

To run the tests, do

make test        # same as running `ctest`
make memcheck    # runs all tests with memory checking (requires
                 # valgrind)

To generate code documentation, do

make doc         # direct your browser to build/doc/html/index.html

To install dlite locally, do

make install

Build with VS Code on Windows

See here for detailed instructions for building with Visual Studio.

Quick start with VS Code and Remote Container

Using Visual Studio Code, it is possible to do development on the system defined in the Dockerfile.

  1. Download and install Visual Studio Code.
  2. Install the extension Remote Development.
  3. Clone dlite and initialize git modules: git submodule update --init.
  4. Open the dlite folder with VS Code.
  5. Start VS Code, run the Remote-Containers: Open Folder in Container... command from the Command Palette (F1) or quick actions Status bar item. This will build the container and restart VS Code in it. This may take some time the first time as the Docker image must be built. See Quick start: Open an existing folder in a container for more information and instructions.
  6. In the container terminal, perform the first build and tests with mkdir /workspace/build; cd /workspace/build; cmake ../dlite; make && make test.

Build documentation

If you have Doxygen installed, the HTML documentation should be generated as part of the build process. It can be browsed by opening the following file in your browser:

<build>/doc/html/index.html

where <build> is your build folder. To only build the documentation, you can do:

cd build
cmake --build . --target doc

If you have LaTeX and make installed, you can also build the LaTeX documentation with

cd build
cmake --build . --target latex

which will produce the file

<build>/doc/latex/refman.pdf

Setting up the environment

If dlite is installed in a non-default location, you may need to set the PATH, LD_LIBRARY_PATH, PYTHONPATH and DLITE_ROOT environment variables. See the documentation of environment variables for more details.

An example of how to use dlite is shown above. See also the examples in the examples directory for how to link to dlite from C and use of the Fortran bindings.

Short vocabulary

The following terms have a special meaning in dlite:

  • Basic metadata schema: Top-level meta-metadata that describes itself.
  • Collection: A specialised instance that contains references to a set of instances and relations between them. Within a collection, instances are labeled. See also the SOFT5 nomenclature.
  • Data instance: A "leaf" instance that is not metadata.
  • Entity: May be any kind of instance, including data instances, metadata instances or meta-metadata instances. However, for historical reasons it is often used for "standard" metadata that are instances of meta-metadata "http://onto-ns.com/meta/0.3/EntitySchema".
  • Instance: The basic data object in DLite. All instances are described by their metadata, which are themselves instances. Instances are identified by a UUID.
  • Mapping: A function that maps one or more input instances to an output instance. They are an important mechanism for interoperability. Mappings are called translators in SOFT5.
  • Metadata: A special type of instance that describes other instances. All metadata are immutable and have a unique URI in addition to their UUID.
  • Meta-metadata: metadata that describes metadata.
  • Relation: A subject-predicate-object triplet. Relations are immutable.
  • Storage: A generic handle encapsulating actual storage backends.
  • Transaction: A not yet implemented feature that will make it possible to represent the evolution of the state of a piece of software as a series of immutable instances. See also the SOFT5 nomenclature.
  • uri: A uniform resource identifier (URI) is a generalisation of a URL, but follows the same syntax rules. In dlite, the term "uri" is used as a human-readable identifier for instances (optional for data instances) and has the form namespace/version/name.
  • url: A uniform resource locator (URL) is a reference to a web resource, like a file (on a given computer), a database entry, a web page, etc. In dlite, URLs refer to a storage or even a specific instance in a storage, using the general syntax driver://location?options#fragment, where options and fragment are optional. If fragment is provided, it should be the uuid or uri of an instance.
  • uuid: A universally unique identifier (UUID) is commonly used to uniquely identify digital information. DLite uses the 36-character string representation of UUIDs to uniquely identify instances. For instances that have a uri, the uuid is generated from it; otherwise it is randomly generated.
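
As a small illustration of the relation between uri and uuid, the uuid of an instance with a uri can be computed deterministically from it; the sketch below assumes the Python bindings expose a helper like dlite.get_uuid() (the name is an assumption):

import dlite

# The same uri always maps to the same uuid, while instances without a
# uri get a randomly generated uuid instead
print(dlite.get_uuid("http://onto-ns.com/meta/0.1/Person"))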

Developer documentation

License

DLite is licensed under the MIT license. However, it includes a few third-party source files with other permissive licenses. All of these should allow dynamic and static linking against both open and proprietary codes. A full list of included licenses can be found in LICENSES.txt.

Acknowledgment

In addition to internal funding from SINTEF and NTNU, this work has been supported by several projects, including:

  • AMPERE (2015-2020) funded by Forskningsrådet and Norwegian industry partners.
  • FICAL (2015-2020) funded by Forskningsrådet and Norwegian industry partners.
  • SFI Manufacturing (2015-2023) funded by Forskningsrådet and Norwegian industry partners.
  • SFI PhysMet (2020-2028) funded by Forskningsrådet and Norwegian industry partners.
  • OntoTrans (2020-2024) that receives funding from the European Union’s Horizon 2020 Research and Innovation Programme, under Grant Agreement n. 862136.
  • OpenModel (2021-2025) that receives funding from the European Union’s Horizon 2020 Research and Innovation Programme, under Grant Agreement n. 953167.
  • DOME 4.0 (2021-2025) that receives funding from the European Union’s Horizon 2020 Research and Innovation Programme, under Grant Agreement n. 953163.
  • VIPCOAT (2021-2025) that receives funding from the European Union’s Horizon 2020 Research and Innovation Programme, under Grant Agreement n. 952903.

DLite is developed with the hope that it will be a delight to work with.

