Submit existing Decision Optimization instances to WML
dowml
A library and command line client to use Decision Optimization on WML
Note that this tool is not an official IBM product and is not supported by IBM. This is educational material; use at your own risk.
tl;dr
$ pip install dowml
$ cat my_credentials.txt
{
'apikey': '<apikey>',
'region': 'us-south',
'cos_resource_crn': 'crn:v1:bluemix:public:cloud-object-storage:global:a/76260f9...',
'ml_instance_crn': 'crn:v1:bluemix:public:pm-20:eu-de:a/76260f...'
}
$ dowml -w my_credentials.txt
dowml> solve examples/afiro.mps
dowml> wait
dowml> log
dowml> exit
Introduction
The class DOWMLLib provides an API to upload Decision Optimization models (CPLEX, CP Optimizer, OPL or docplex) to WML, check their status, and download results. The script dowml.py is an interactive program on top of that library.
In order to use either of them, you need to provide IBM Cloud credentials.
- By default, DOWMLLib (and therefore the Interactive) looks for these credentials in an environment variable named DOWML_CREDENTIALS. This variable should have a value looking like
{
'apikey': '<apikey>',
'region': 'us-south',
'cos_resource_crn': 'crn:v1:bluemix:public:cloud-object-storage:global:a/76260f9...',
'ml_instance_crn': 'crn:v1:bluemix:public:pm-20:eu-de:a/76260f...',
}
See below for how/where to get these credentials.
- As an alternative, you can specify a file name as an argument to DOWMLLib.__init__. The credentials will then be read from that file instead of the environment variable. Accordingly, the Interactive has a command-line option -w (or --wml-cred-file) that must be followed by the path of the file.
- Finally, if neither of the above is used, the code looks for the environment variable DOWML_CREDENTIALS_FILE. If it exists, it must be the path to a file that contains credentials such as the ones above.
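The lookup order above can be sketched as follows. This is a hypothetical helper written for illustration, not dowml's actual code: an explicit file name wins, then the DOWML_CREDENTIALS variable, then the file named by DOWML_CREDENTIALS_FILE.

```python
import os


def resolve_credentials(file_name=None):
    """Return the credentials string, using the precedence described
    above: explicit file argument, then $DOWML_CREDENTIALS, then the
    file named by $DOWML_CREDENTIALS_FILE.  (Illustrative sketch only.)"""
    if file_name is not None:
        with open(file_name) as f:
            return f.read()
    if 'DOWML_CREDENTIALS' in os.environ:
        return os.environ['DOWML_CREDENTIALS']
    if 'DOWML_CREDENTIALS_FILE' in os.environ:
        # The variable holds a path, not the credentials themselves
        with open(os.environ['DOWML_CREDENTIALS_FILE']) as f:
            return f.read()
    raise ValueError('No credentials found')
```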
Here's a sample session:
$ dowml -h
usage: interactive.py [-h] [--wml-cred-file WML_CRED_FILE] [--verbose]
                      [--commands [COMMANDS [COMMANDS ...]]] [--input]
                      [--space SPACE] [--url URL] [--api-key API_KEY]
                      [--region REGION]

Decision Optimization in WML Interactive, version 1.9.0.
Submit and manage Decision Optimization models interactively.
(c) Copyright Xavier Nodet, 2022
optional arguments:
-h, --help show this help message and exit
--wml-cred-file WML_CRED_FILE, -w WML_CRED_FILE
Name of the file from which to read WML credentials. If not specified,
credentials are read from environment variable $DOWML_CREDENTIALS. If no such
variable exists, but variable $DOWML_CREDENTIALS_FILE exists, tries to read
that file.
--verbose, -v Verbose mode. Causes the program to print debugging messages about its
progress. Multiple -v options increase the verbosity. The maximum is 4.
--commands [COMMANDS [COMMANDS ...]], -c [COMMANDS [COMMANDS ...]]
Carries out the specified commands. Each command is executed as if it had been
specified at the prompt. The program stops after last command, unless --input
is used.
--input, -i Prompts for new input commands even if some commands have been specified as
arguments using --commands.
--space SPACE, -s SPACE
Id of the space to connect to. Takes precedence over the one specified in the
credentials under the 'space_id' key, if any.
--url URL, -u URL URL to use for the Machine Learning service. Takes precedence over the one
specified in the credentials under the 'url' key, if any. Incompatible with
--region argument.
--api-key API_KEY, -k API_KEY
API key to use to connect to WML. Takes precedence over the one specified in
the credentials under the 'apikey' key, if any.
--region REGION, -r REGION
Region to use for the Machine Learning service. Takes precedence over the
region or URL specified in the credentials, if any. Incompatible with --url
argument. Possible values for the region are ['us-south', 'eu-de', 'eu-gb',
'jp-tok'].
$
$
$ dowml -c help type size 'inputs inline' 'solve examples/afiro.mps' jobs wait jobs log 'type docplex' 'solve examples/markshare.py examples/markshare1.mps.gz' wait jobs dump 'shell ls -l *-*-*-*-*' 'delete *'
Decision Optimization in WML Interactive, version 1.9.0.
Submit and manage Decision Optimization models interactively.
(c) Copyright Xavier Nodet, 2022
Type ? for a list of commands.
Most commands need an argument that can be either a job id, or the number
of the job, as displayed by the 'jobs' command. If a command requires a
job id, but none is specified, the last one is used.
dowml> help
Documented commands (type help <topic>):
========================================
cancel details exit inline jobs output shell solve time version
delete dump help inputs log outputs size status type wait
dowml> type
Current model type: cplex.
Known types: cplex, cpo, opl, docplex.
dowml> size
Current size: S.
Known sizes: S, M, L, XL.
dowml> inputs inline
dowml> solve examples/afiro.mps
Job id: cd494377-4843-40a4-ae84-ede7f8c16eda
dowml> jobs
# status id creation date type ver. size inputs
=> 1: queued cd494377-4843-40a4-ae84-ede7f8c16eda 2022-06-28 11:59:06 cplex 22.1 S afiro.mps
dowml> wait
Job is running.
Job is completed.
Job has finished with status 'completed'.
dowml> jobs
# status id creation date type ver. size inputs
=> 1: completed cd494377-4843-40a4-ae84-ede7f8c16eda 2022-06-28 11:59:06 cplex 22.1 S afiro.mps
dowml> log
[2022-06-28T09:59:08Z, INFO] CPLEX version 22010000
[2022-06-28T09:59:08Z, WARNING] Changed parameter CPX_PARAM_THREADS from 0 to 1
[2022-06-28T09:59:08Z, INFO] Param[1,067] = 1
[2022-06-28T09:59:08Z, INFO] Param[1,130] = UTF-8
[2022-06-28T09:59:08Z, INFO] Param[1,132] = -1
[2022-06-28T09:59:08Z, INFO]
[2022-06-28T09:59:08Z, INFO] Selected objective sense: MINIMIZE
[2022-06-28T09:59:08Z, INFO] Selected objective name: obj
[2022-06-28T09:59:08Z, INFO] Selected RHS name: rhs
[2022-06-28T09:59:08Z, INFO] Version identifier: 22.1.0.0 | 2022-03-30 | 54982fbec
[2022-06-28T09:59:08Z, INFO] CPXPARAM_Threads 1
[2022-06-28T09:59:08Z, INFO] CPXPARAM_Output_CloneLog -1
[2022-06-28T09:59:08Z, INFO] CPXPARAM_Read_APIEncoding "UTF-8"
[2022-06-28T09:59:08Z, INFO] Tried aggregator 1 time.
[2022-06-28T09:59:08Z, INFO] LP Presolve eliminated 9 rows and 10 columns.
[2022-06-28T09:59:08Z, INFO] Aggregator did 7 substitutions.
[2022-06-28T09:59:08Z, INFO] Reduced LP has 11 rows, 15 columns, and 37 nonzeros.
[2022-06-28T09:59:08Z, INFO] Presolve time = 0.00 sec. (0.03 ticks)
[2022-06-28T09:59:08Z, INFO]
[2022-06-28T09:59:08Z, INFO] Iteration log . . .
[2022-06-28T09:59:08Z, INFO] Iteration: 1 Scaled dual infeas = 1.200000
[2022-06-28T09:59:08Z, INFO] Iteration: 5 Dual objective = -464.753143
[2022-06-28T09:59:09Z, INFO] There are no bound infeasibilities.
[2022-06-28T09:59:09Z, INFO] There are no reduced-cost infeasibilities.
[2022-06-28T09:59:09Z, INFO] Max. unscaled (scaled) Ax-b resid. = 1.77636e-14 (1.77636e-14)
[2022-06-28T09:59:09Z, INFO] Max. unscaled (scaled) c-B'pi resid. = 5.55112e-17 (5.55112e-17)
[2022-06-28T09:59:09Z, INFO] Max. unscaled (scaled) |x| = 500 (500)
[2022-06-28T09:59:09Z, INFO] Max. unscaled (scaled) |slack| = 500 (500)
[2022-06-28T09:59:09Z, INFO] Max. unscaled (scaled) |pi| = 0.942857 (1.88571)
[2022-06-28T09:59:09Z, INFO] Max. unscaled (scaled) |red-cost| = 10 (10)
[2022-06-28T09:59:09Z, INFO] Condition number of scaled basis = 1.5e+01
[2022-06-28T09:59:09Z, INFO] optimal (1)
dowml> type docplex
dowml> solve examples/markshare.py examples/markshare1.mps.gz
Job id: 6520e72b-727c-4bfe-adb5-a40d96cf5910
dowml> wait
Job is queued.
Job is running..
[2022-06-28T09:59:16Z, WARNING] Python 3.9 is used as default with pandas 1.3 libraries.
[2022-06-28T09:59:17Z, INFO] Reading markshare1.mps.gz...
.
[2022-06-28T09:59:16Z, WARNING] Python 3.9 is used as default with pandas 1.3 libraries.
[2022-06-28T09:59:17Z, INFO] Reading markshare1.mps.gz...
.
[2022-06-28T09:59:16Z, WARNING] Python 3.9 is used as default with pandas 1.3 libraries.
[2022-06-28T09:59:17Z, INFO] Reading markshare1.mps.gz...
.
[2022-06-28T09:59:17Z, INFO]
[2022-06-28T09:59:17Z, INFO] Nodes Cuts/
[2022-06-28T09:59:17Z, INFO] Node Left Objective IInf Best Integer Best Bound ItCnt Gap
[2022-06-28T09:59:17Z, INFO]
[2022-06-28T09:59:17Z, INFO] * 0+ 0 7286.0000 0.0000 100.00%
[2022-06-28T09:59:17Z, INFO] 0 0 0.0000 6 7286.0000 0.0000 11 100.00%
[2022-06-28T09:59:17Z, INFO] * 0+ 0 263.0000 0.0000 100.00%
[2022-06-28T09:59:17Z, INFO] * 0+ 0 230.0000 0.0000 100.00%
[2022-06-28T09:59:17Z, INFO] 0 0 0.0000 7 230.0000 Cuts: 15 15 100.00%
[2022-06-28T09:59:17Z, INFO] 0 0 0.0000 7 230.0000 Cuts: 16 23 100.00%
[2022-06-28T09:59:17Z, INFO] * 0+ 0 193.0000 0.0000 100.00%
[2022-06-28T09:59:17Z, INFO] Detecting symmetries...
[2022-06-28T09:59:17Z, INFO] 0 2 0.0000 7 193.0000 0.0000 23 100.00%
[2022-06-28T09:59:17Z, INFO] Elapsed time = 0.01 sec. (2.91 ticks, tree = 0.01 MB, solutions = 4)
[2022-06-28T09:59:17Z, INFO] * 70+ 59 166.0000 0.0000 100.00%
[2022-06-28T09:59:17Z, INFO] * 80+ 67 132.0000 0.0000 100.00%
[2022-06-28T09:59:17Z, INFO] * 190+ 155 111.0000 0.0000 100.00%
[2022-06-28T09:59:17Z, INFO] * 220+ 166 96.0000 0.0000 100.00%
[2022-06-28T09:59:17Z, INFO] * 320+ 240 71.0000 0.0000 100.00%
[2022-06-28T09:59:17Z, INFO] * 420+ 305 67.0000 0.0000 100.00%
[2022-06-28T09:59:17Z, INFO] * 420+ 303 66.0000 0.0000 100.00%
[2022-06-28T09:59:17Z, INFO] * 491 310 integral 0 38.0000 0.0000 1112 100.00%
[2022-06-28T09:59:17Z, INFO]
[2022-06-28T09:59:17Z, INFO] Performing restart 1
[2022-06-28T09:59:17Z, INFO]
[2022-06-28T09:59:17Z, INFO] Repeating presolve.
[2022-06-28T09:59:17Z, INFO] Tried aggregator 1 time.
[2022-06-28T09:59:17Z, INFO] Reduced MIP has 6 rows, 56 columns, and 306 nonzeros.
[2022-06-28T09:59:17Z, INFO] Reduced MIP has 50 binaries, 6 generals, 0 SOSs, and 0 indicators.
[2022-06-28T09:59:17Z, INFO] Presolve time = 0.00 sec. (0.14 ticks)
[2022-06-28T09:59:17Z, INFO] Tried aggregator 1 time.
[2022-06-28T09:59:17Z, INFO] Reduced MIP has 6 rows, 56 columns, and 306 nonzeros.
[2022-06-28T09:59:17Z, INFO] Reduced MIP has 50 binaries, 6 generals, 0 SOSs, and 0 indicators.
[2022-06-28T09:59:17Z, INFO] Presolve time = 0.00 sec. (0.19 ticks)
[2022-06-28T09:59:17Z, INFO] Represolve time = 0.00 sec. (0.81 ticks)
[2022-06-28T09:59:17Z, INFO] 1518 0 0.0000 7 38.0000 Cuts: 17 3422 100.00%
[2022-06-28T09:59:17Z, INFO] 1518 0 0.0000 8 38.0000 Cuts: 17 3429 100.00%
[2022-06-28T09:59:17Z, INFO] 1518 0 0.0000 7 38.0000 Cuts: 14 3436 100.00%
[2022-06-28T09:59:17Z, INFO] 1518 0 0.0000 7 38.0000 Cuts: 14 3441 100.00%
[2022-06-28T09:59:18Z, INFO] 3918 1669 0.0000 6 38.0000 0.0000 8018 100.00%
[2022-06-28T09:59:18Z, INFO] 6508 2914 0.0000 6 38.0000 0.0000 14256 100.00%
[2022-06-28T09:59:19Z, INFO] 9718 4470 0.0000 6 38.0000 0.0000 22692 100.00%
[2022-06-28T09:59:19Z, INFO] Began writing nodes to disk (directory ./cpxY2cKqj created)
[2022-06-28T09:59:22Z, INFO] 10638 4956 13.6062 6 38.0000 0.0000 25327 100.00%
[2022-06-28T09:59:23Z, INFO] 13478 6155 0.0000 6 38.0000 0.0000 33717 100.00%
Job is completed.
Job has finished with status 'completed'.
dowml> jobs
# status id creation date type ver. size inputs
1: completed cd494377-4843-40a4-ae84-ede7f8c16eda 2022-06-28 11:59:06 cplex 22.1 S afiro.mps
=> 2: completed 6520e72b-727c-4bfe-adb5-a40d96cf5910 2022-06-28 11:59:13 docplex 22.1 S markshare.py, markshare1.mps.gz
dowml> dump
Storing 6520e72b-727c-4bfe-adb5-a40d96cf5910/details.json
Storing 6520e72b-727c-4bfe-adb5-a40d96cf5910/markshare.py
Storing 6520e72b-727c-4bfe-adb5-a40d96cf5910/markshare1.mps.gz
Storing 6520e72b-727c-4bfe-adb5-a40d96cf5910/model.lp
Storing 6520e72b-727c-4bfe-adb5-a40d96cf5910/solution.json
Storing 6520e72b-727c-4bfe-adb5-a40d96cf5910/kpis.csv
Storing 6520e72b-727c-4bfe-adb5-a40d96cf5910/stats.csv
Storing 6520e72b-727c-4bfe-adb5-a40d96cf5910/log.txt
dowml> shell ls -l *-*-*-*-*
total 88
-rw-rw-r-- 1 nodet staff 5506 Jun 28 11:59 details.json
-rw-rw-r-- 1 nodet staff 37 Jun 28 11:59 kpis.csv
-rw-rw-r-- 1 nodet staff 7299 Jun 28 11:59 log.txt
-rw-rw-r-- 1 nodet staff 671 Jun 28 11:59 markshare.py
-rw-rw-r-- 1 nodet staff 1607 Jun 28 11:59 markshare1.mps.gz
-rw-rw-r-- 1 nodet staff 4197 Jun 28 11:59 model.lp
-rw-rw-r-- 1 nodet staff 1769 Jun 28 11:59 solution.json
-rw-rw-r-- 1 nodet staff 344 Jun 28 11:59 stats.csv
dowml> delete *
WML credentials
The DOWML client requires some information in order to connect to the Watson Machine Learning service. Two pieces of information are required, and the others are optional.
Required items
- The apikey is a secret that identifies the IBM Cloud user. One typically creates one key per application or service, in order to be able to revoke them individually if needed. To generate such a key, open https://cloud.ibm.com/iam/apikeys, and click the blue 'Create an IBM Cloud API key' button on the right.
- The url is the base URL for the REST calls to WML. The possible values are listed in https://cloud.ibm.com/apidocs/machine-learning#endpoint-url, and depend on which region you want to use.
- As an alternative to the url value, you can use the more user-friendly and easier-to-remember region, with a value that is either us-south, eu-de, eu-gb or jp-tok. From this value, dowml deduces the correct URL to use. To avoid ambiguities or duplications, using both url and region is not allowed.
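The region handling can be sketched like this. The endpoint pattern https://<region>.ml.cloud.ibm.com is an assumption taken from the endpoint-url page linked above; this is an illustrative sketch, not dowml's actual code.

```python
KNOWN_REGIONS = ['us-south', 'eu-de', 'eu-gb', 'jp-tok']


def url_from_region(region=None, url=None):
    """Deduce the WML endpoint URL from a region name, enforcing that
    url and region are mutually exclusive, as described above.
    Assumes the https://<region>.ml.cloud.ibm.com endpoint pattern."""
    if url and region:
        raise ValueError("Specify either 'url' or 'region', not both")
    if url:
        return url
    if region not in KNOWN_REGIONS:
        raise ValueError(f'Unknown region: {region}')
    return f'https://{region}.ml.cloud.ibm.com'
```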
Optional items
Watson Studio and Watson Machine Learning use spaces to group together, and isolate from each other, the assets that belong to a single project. These assets include the data files submitted, the results of the jobs, and the deployments (software and hardware configurations) that run these jobs.
The DOWML client connects to the space specified by the user, using either the --space command-line argument or the space_id item in the credentials. If neither is specified, the client looks for a space named dowml-space, and tries to create such a space if one doesn't exist. To create a new space, the DOWML client needs both cos_resource_crn and ml_instance_crn to have been specified in the credentials.
- space_id: the identifier of an existing space to connect to. Navigate to the 'Spaces' tab of your Watson Studio site (e.g. https://eu-de.dataplatform.cloud.ibm.com/ml-runtime/spaces if you are using the instance in Germany), and right-click on the name of an existing space to copy the link. The id of the space is the string of numbers, letters and dashes between the last / and the ?.
- cos_resource_crn: WML needs to store some data in a Cloud Object Storage instance. Open https://cloud.ibm.com/resources and locate the 'Storage' section. Create an instance of the Cloud Object Storage service if needed. Once it's listed on the resource page, click anywhere on the line for that service, except on its name. This opens a pane on the right that lists the CRN. Click on the symbol at its right to copy this information. This item is required only for the DOWML client to be able to create a space; if you specified a space_id, it is not required.
- ml_instance_crn: similarly, you need to identify an instance of the Machine Learning service to use to solve your jobs. On the same page, https://cloud.ibm.com/resources, open the 'Services' section. The 'Product' column tells you the type of each service. If you don't have a 'Machine Learning' instance already, create one. Then click anywhere on the corresponding line except on the name, and copy the CRN displayed in the pane that opens on the right. This item is required only for the DOWML client to be able to create a space; if you specified a space_id, it is not required.
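The space-selection precedence described above can be sketched as a small helper. This is a hypothetical function for illustration, not dowml's actual code; the real client also creates the default space when it is missing.

```python
def select_space(command_line_space=None, credentials=None):
    """Pick the space id with the precedence described above:
    the --space argument first, then the credentials' space_id,
    else fall back to the default 'dowml-space' name (which the
    client would look up, and create if necessary)."""
    credentials = credentials or {}
    if command_line_space:
        return command_line_space
    if 'space_id' in credentials:
        return credentials['space_id']
    return 'dowml-space'
```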
CP4D credentials
The credentials to connect to the WML service in a (private) CP4D instance are different from those above that pertain to CP4D as a service. The credentials look like this:
{
"instance_id": "openshift",
"version": "4.0",
"url": "...",
"username": "...",
"apikey": "...",
"space_id": "..."
}
- The url is the URL of your CP4D instance, with no / at the end.
- The username and apikey are those of your user on this cluster. You can get the API key in the 'Profile and settings' dialog that's accessible from the avatar menu at the top-right of the screen.
- space_id: the identifier of an existing space to connect to. The DOWML client connects to the space specified by the user, using either the --space command-line argument or the space_id item in the credentials. If neither is specified, the client looks for a space named dowml-space, and tries to create such a space if one doesn't exist. On CP4D, unlike on Cloud, no cos_resource_crn or ml_instance_crn is required.
Using data assets in Watson Studio
The DOWML library has two modes of operation with respect to sending models to the WML service: inline data, or data assets in Watson Studio. By default, data assets are used for inputs, while inline data is used for outputs. This can be changed with the inputs and outputs commands.
With inline data, the model is sent directly to the WML service in the solve request itself, and the output is part of the job details that are downloaded when asking for information about the job. This is the simplest mode, but it has a number of drawbacks:
- Sending a large model may take a long time, because of network throughput. Sending a very large REST request is not at all guaranteed to succeed. Similarly, if your job has large outputs, the job may fail while trying to process them.
- When solving the same model several times (e.g. to evaluate different parameters), the model has to be sent each time.
- In order to display the names of the files that were sent, the jobs command needs to request this information, and it comes with the content of the files themselves. In other words, every jobs command requires downloading the content of all the inline data files for all the jobs that exist in the space.
Using data assets in Watson Studio as an intermediate step alleviates all these issues:
- Once the model has been uploaded to Watson Studio, it is reused for subsequent jobs without the need to upload it again.
- The job requests refer to the files indirectly, via URLs. Therefore, they don't take much space, and listing the jobs doesn't require downloading the content of the files.
- Uploading to Watson Studio is done through specialized code that doesn't just send a single request. Rather, it divides the upload into multiple reasonably sized chunks that are each uploaded individually, with restart if necessary. Uploading big files is therefore much less prone to failure.
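The chunking idea can be sketched generically. dowml delegates the actual upload to the Watson Studio client; the generator below is only an illustration of splitting a large file into retryable units.

```python
def iter_chunks(path, chunk_size=8 * 1024 * 1024):
    """Yield a file's content in reasonably sized chunks, so that each
    chunk can be uploaded (and, on failure, retried) individually, as
    described above.  Generic sketch, not dowml's actual upload code."""
    with open(path, 'rb') as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                return
            yield chunk
```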
User-visible changes
V1.9.0:
- [#60] 'dump' was creating invalid files for inline content
- [#58] Sort list of available versions
- [#59] Upgrade default version to 22.1
V1.8.0:
- [#57] 'cancel' accepts job specifications such as * and n-m
- [#55] We can't delete the Watson Studio job when connected to a CPD instance. But at least we can stop warning about that...
- [#53] Store the details first when dumping a job, in case there's an error later in the process
- [#56] Get dowml to work with on-prem CPD instances through tokens
V1.7.0:
- [#51] Add 'L' to the list of known sizes
- [#50] Don't store the log on disk when downloading it for display
- [#49] Enable saving one REST call when Python API is recent enough
V1.6.1:
- [#48] Parameter '--api-key' must override whatever was found through the environment
V1.6.0:
- [#47] Add a 'api-key' parameter to the Interactive
- [#40] Introduce cancel_job function. Deprecate hard parameter in call to delete_job, which now really deletes the job by default.
V1.5.0:
- [#45] Add readonly 'url' and 'space_id' attributes to the lib
- [#44] Speed-up job creation by caching deployment information
- [#43] Add 'status' command in the Interactive
- [#42] Rename submodule 'dowmllib' to simply 'lib'
V1.4.1:
- [#41] Prevent crash when using '--region' and credentials didn't include a URL.
- [#39] When creating the space, wait for it to be fully ready instead of (trying to) use it immediately.
V1.4.0:
- [#37] Allow to override the URL in the credentials with a command-line argument (in the Interactive) or a constructor argument (in the library). Also, allows to specify a region instead of a URL (for the known regions).
- [#36] Don't leave the Watson Studio runs dangling when deleting a job.
V1.3.1:
- Update the documentation in README.md
- Update the sample session in README.md
V1.3.0:
- [#34] 'dump' downloads and stores all the inputs and outputs of a job, so that all the data is readily available. Replaces 'output', which is now deprecated.
- [#33] Print status of job while waiting for completion.
V1.2.0:
- [#32] 'output' downloads the output data assets, not just inline outputs.
- [#31] The 'log' command wouldn't work if the job used 'outputs assets'. Fix that.
V1.1.1:
- [#27] Catch timeout errors in the Interactive so that the session is not interrupted.
V1.1.0:
- [#3] Accept 'delete 2-5' to delete a range of jobs.
- [#25] Replace 'inline' command (resp. attribute) in the Interactive (resp. library) with 'inputs'. Deprecate 'inline'.
- [#4] Introduce 'outputs' command to change the type of outputs from inline to data-assets.
- [#21] Read credentials from file $DOWML_CREDENTIALS_FILE as last resort.
V1.0.0:
- Packaging-only changes
V0.9.0, first release on PyPi:
- [#17] DOWMLLib now returns tabular outputs as dataframes by default. Also replace the now-deprecated csv_as_dataframe with tabular_as_csv
- [#18] DOWMLLib.get_output returns a dict instead of a list
- [#16] 'output' stores files in subdirectories
- [#12] Add 'shell' command in the Interactive