
twinLab Client


Headless interface to the twinLab library.

Installation

Most users should install twinLab via pip:

pip install twinlab

If you want to modify the client-side code, or work from a local installation, you will need git, poetry, and Python 3.10 or higher installed. Then you can do:

git clone https://github.com/digiLab-ai/twinLab-client.git
cd twinlab-client
poetry install

Environment setup

You will need a .env file in your project directory that looks like the .env.example file in this repository:

cp .env.example .env

and fill in your twinLab user details.
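The client reads these details from the environment at runtime. As an illustration only — the actual variable names are listed in .env.example, and `TWINLAB_USER`/`TWINLAB_API_KEY` below are hypothetical — here is a minimal stdlib-only sketch of how KEY=value pairs in a .env file can be loaded into the environment:

```python
import os

def load_dotenv(path=".env"):
    """Load simple KEY=value lines from a .env file into os.environ.

    Skips blank lines and comments; does not handle quoting or
    `export` syntax. Existing environment variables are not overwritten.
    """
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```

In practice a library such as python-dotenv performs this job; the sketch just shows the shape of the file the client expects.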

Commands

Pipeline

poetry run python scripts/twinlab/test.py

where test.py can be replaced with any of the scripts in the scripts directory.

Individual examples

Get user information:

poetry run python scripts/twinlab/get_user_information.py

Get version information:

poetry run python scripts/twinlab/get_versions.py

List datasets:

poetry run python scripts/twinlab/list_datasets.py

Upload dataset to the Cloud:

Fill in the arguments (between the angle brackets; < >) with your own dataset path and dataset_id (this is the filename for the dataset when stored in the Cloud):

poetry run python scripts/twinlab/upload_dataset.py <path/to/dataset.csv> <dataset_id>

For example, using the test biscuits dataset:

poetry run python scripts/twinlab/upload_dataset.py resources/datasets/biscuits.csv biscuits
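The upload script expects a CSV file with a header row of named columns. A stdlib-only sketch (not part of twinLab) for sanity-checking a dataset locally before uploading it:

```python
import csv

def check_dataset(path):
    """Return (column names, data row count) for a CSV file.

    Raises ValueError if the file has no header row, which would
    leave the dataset's columns unnamed in the Cloud.
    """
    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader, None)
        if not header:
            raise ValueError(f"{path} is empty")
        rows = sum(1 for _ in reader)
    return header, rows
```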

View dataset that has been uploaded to the Cloud:

poetry run python scripts/twinlab/view_dataset.py <dataset_id>
poetry run python scripts/twinlab/view_dataset.py biscuits

Summarise a dataset on the Cloud:

poetry run python scripts/twinlab/query_dataset.py <dataset_id>
poetry run python scripts/twinlab/query_dataset.py biscuits

List campaigns that you have uploaded to the Cloud:

poetry run python scripts/twinlab/list_campaigns.py

Train campaign on the Cloud:

poetry run python scripts/twinlab/train_campaign.py <path/to/parameters.json> <campaign_id> <processor>
poetry run python scripts/twinlab/train_campaign.py resources/campaigns/biscuits/params.json biscuits-campaign
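The training script takes a JSON parameters file. Its full schema isn't shown here, but judging from the Python example later in this README it contains at least dataset_id, inputs, and outputs; a hypothetical params.json (the column names are placeholders — use the headers from your own dataset):

```json
{
  "dataset_id": "biscuits",
  "inputs": ["X"],
  "outputs": ["y"]
}
```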

View campaign details:

poetry run python scripts/twinlab/view_campaign.py <campaign_id>
poetry run python scripts/twinlab/view_campaign.py biscuits-campaign

Summarise trained campaign:

poetry run python scripts/twinlab/query_campaign.py <campaign_id>
poetry run python scripts/twinlab/query_campaign.py biscuits-campaign

Predict using a trained campaign:

poetry run python scripts/twinlab/predict_campaign.py <path/to/inputs.csv> <campaign_id> <method> <processor>
poetry run python scripts/twinlab/predict_campaign.py resources/campaigns/biscuits/eval.csv biscuits-campaign

Delete a campaign from the Cloud:

poetry run python scripts/twinlab/delete_campaign.py <campaign_id>
poetry run python scripts/twinlab/delete_campaign.py biscuits-campaign

Delete a dataset from the Cloud:

poetry run python scripts/twinlab/delete_dataset.py <dataset_id>
poetry run python scripts/twinlab/delete_dataset.py biscuits

Full example

Here we create some mock data (with a quadratic relationship, y = X²) and use twinLab to create a surrogate model with quantified uncertainty.

# Import libraries
import twinlab as tl
import pandas as pd

# Create a dataset and upload to twinLab cloud
df = pd.DataFrame({"X": [1, 2, 3, 4], "y": [1, 4, 9, 16]})
tl.upload_dataset(df, "test-data")

# Train a machine-learning model for the data
params = {
    "dataset_id": "test-data",
    "inputs": ["X"],
    "outputs": ["y"],
}
tl.train_campaign(params, campaign_id="test-model")

# Evaluate the model on some unseen data
df = pd.DataFrame({"X": [1.5, 2.5, 3.5]})
df_mean, df_std = tl.predict_campaign(df, campaign_id="test-model")
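The two returned frames hold the predictive mean and standard deviation for each input row. Assuming the predictive distribution is approximately Gaussian (an assumption, not something the library guarantees), a two-sided 95% interval is mean ± 1.96·std; a stdlib-only sketch:

```python
def interval_95(mean, std):
    """Return (lower, upper) bounds of a 95% Gaussian credible interval."""
    half_width = 1.96 * std
    return mean - half_width, mean + half_width
```

Applied element-wise to df_mean and df_std, this gives an uncertainty band around each prediction.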

Notebooks

Check out the notebooks directory for some additional examples to get started!

Documentation

See the live documentation at https://digilab-ai.github.io/twinLab-client/, or build a copy locally:

cd docs
yarn install && yarn start
