
twinLab Client

Headless interface to the twinLab library.

Installation

Most users should install via pip:

pip install twinlab

If you want to modify the client-side code, or work from a local installation, you will need git, poetry, and Python 3.10 or later installed. Then you can do:

git clone https://github.com/digiLab-ai/twinLab-client.git
cd twinlab-client
poetry install

Environment setup

You will need a .env file in your project directory that looks like the .env.example file in this repository:

cp .env.example .env

and fill in your twinLab user details.
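The exact variable names are defined in .env.example; as an illustrative sketch only (these keys and values are placeholders, not the confirmed schema), the filled-in file looks something like:

```shell
# Hypothetical keys -- copy .env.example and use the variable names it defines.
TWINLAB_USER=me@mycompany.com
TWINLAB_API_KEY=your-secret-api-key
```

Keep this file out of version control, since it contains your credentials.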

Commands

Pipeline

poetry run python scripts/twinlab/test.py

where test.py can be replaced with any of the scripts in the scripts directory.

Individual examples

Get user information:

poetry run python scripts/twinlab/get_user_information.py

Get version information:

poetry run python scripts/twinlab/get_versions.py

List datasets:

poetry run python scripts/twinlab/list_datasets.py

Upload dataset to the Cloud:

Fill in the arguments (between the angle brackets, < >) with your own dataset path and dataset_id (this is the filename under which the dataset is stored in the Cloud):

poetry run python scripts/twinlab/upload_dataset.py <path/to/dataset.csv> <dataset_id>

For example, using the test biscuits dataset:

poetry run python scripts/twinlab/upload_dataset.py resources/datasets/biscuits.csv biscuits
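A dataset is simply a CSV file whose columns will later be declared as model inputs and outputs. A minimal sketch of preparing one with pandas (the column names here are illustrative, not the actual biscuits.csv schema):

```python
import pandas as pd

# Hypothetical columns: one input feature and one output to be modelled.
df = pd.DataFrame({
    "Pack price [GBP]": [1.0, 1.5, 2.0, 2.5],
    "Number of packs sold": [900, 700, 500, 300],
})

# Write without the index column, so the CSV header matches the data columns.
df.to_csv("my_dataset.csv", index=False)
```

The resulting my_dataset.csv can then be passed as the first argument to upload_dataset.py.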

View dataset that has been uploaded to the Cloud:

poetry run python scripts/twinlab/view_dataset.py <dataset_id>
poetry run python scripts/twinlab/view_dataset.py biscuits

Summarise a dataset on the Cloud:

poetry run python scripts/twinlab/query_dataset.py <dataset_id>
poetry run python scripts/twinlab/query_dataset.py biscuits

List campaigns that you have uploaded to the Cloud:

poetry run python scripts/twinlab/list_campaigns.py

Train campaign on the Cloud:

poetry run python scripts/twinlab/train_campaign.py <path/to/parameters.json> <campaign_id> <processor>
poetry run python scripts/twinlab/train_campaign.py resources/campaigns/biscuits/params.json biscuits-campaign
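The full schema of the parameters JSON file is not documented here, but based on the Python API example later in this README it must at least name the dataset and declare which columns are inputs and outputs. A minimal sketch (column names illustrative):

```python
import json

# Assumed minimal training parameters: dataset to train on, plus the
# input and output columns of the surrogate model.
params = {
    "dataset_id": "biscuits",
    "inputs": ["Pack price [GBP]"],
    "outputs": ["Number of packs sold"],
}

# Write the file to pass as the first argument to train_campaign.py.
with open("params.json", "w") as f:
    json.dump(params, f, indent=2)
```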

View campaign details:

poetry run python scripts/twinlab/view_campaign.py <campaign_id>
poetry run python scripts/twinlab/view_campaign.py biscuits-campaign

Summarise trained campaign:

poetry run python scripts/twinlab/query_campaign.py <campaign_id>
poetry run python scripts/twinlab/query_campaign.py biscuits-campaign

Predict using a trained campaign:

poetry run python scripts/twinlab/predict_campaign.py <path/to/inputs.csv> <campaign_id> <method> <processor>
poetry run python scripts/twinlab/predict_campaign.py resources/campaigns/biscuits/eval.csv biscuits-campaign
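The inputs CSV needs only the input columns of the trained campaign, one row per point to predict at. A sketch of building one with pandas (column name illustrative, matching whatever the campaign was trained with):

```python
import pandas as pd

# Evaluation points: input columns only, no output columns.
df_eval = pd.DataFrame({"Pack price [GBP]": [1.25, 1.75, 2.25]})
df_eval.to_csv("eval.csv", index=False)
```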

Delete a campaign from the Cloud:

poetry run python scripts/twinlab/delete_campaign.py <campaign_id>
poetry run python scripts/twinlab/delete_campaign.py biscuits-campaign

Delete a dataset from the Cloud:

poetry run python scripts/twinlab/delete_dataset.py <dataset_id>
poetry run python scripts/twinlab/delete_dataset.py biscuits

Full example

Here we create some mock data (which has a quadratic relationship between X and y) and use twinLab to create a surrogate model with quantified uncertainty.

# Import libraries
import twinlab as tl
import pandas as pd

# Create a dataset and upload to twinLab cloud
df = pd.DataFrame({"X": [1, 2, 3, 4], "y": [1, 4, 9, 16]})
tl.upload_dataset(df, "test-data")

# Train a machine-learning model for the data
params = {
    "dataset_id": "test-data",
    "inputs": ["X"],
    "outputs": ["y"],
}
tl.train_campaign(params, campaign_id="test-model")

# Evaluate the model on some unseen data
df = pd.DataFrame({"X": [1.5, 2.5, 3.5]})
df_mean, df_std = tl.predict_campaign(df, campaign_id="test-model")
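Because the mock data follows y = X², the returned df_mean should lie close to the true curve, with df_std quantifying the uncertainty. A quick local check of the ground truth at the evaluation points (pandas only, no cloud call):

```python
import pandas as pd

# Same evaluation points as the example above.
df = pd.DataFrame({"X": [1.5, 2.5, 3.5]})

# Ground truth the surrogate should approximate.
df_true = df["X"] ** 2
print(df_true.tolist())  # [2.25, 6.25, 12.25]
```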

Notebooks

Check out the notebooks directory for some additional examples to get started!

Documentation

See the live documentation at https://digilab-ai.github.io/twinLab-client/. Or build a copy locally:

cd docs
yarn install && yarn start
