
Client interface for twinLab machine-learning in the cloud.


twinLab Client


Headless interface to the twinLab library.

Installation

Most users should install via pip:

pip install twinlab

If you want to modify the client-side code, or install it locally, you will need git, poetry, and Python 3.10 or higher installed. Then you can do:

git clone https://github.com/digiLab-ai/twinLab-client.git
cd twinlab-client
poetry install

Environment setup

You will need a .env file in your project directory that looks like the .env.example file in this repository:

cp .env.example .env

and fill in your twinLab user details.
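The exact variable names are defined in .env.example; as an illustration only (the key names below are placeholders, not necessarily the canonical ones), a filled-in .env might look like:

```
TWINLAB_USERNAME=your-username
TWINLAB_API_KEY=your-api-key
```

Keep this file out of version control, since it contains your credentials.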

Commands

Pipeline

poetry run python scripts/twinlab/test.py

where test.py can be replaced with any of the scripts in the scripts directory.

Individual examples

Get user information:

poetry run python scripts/twinlab/get_user_information.py

Get version information:

poetry run python scripts/twinlab/get_versions.py

List datasets:

poetry run python scripts/twinlab/list_datasets.py

Upload dataset to the Cloud:

Fill in the arguments (between the angle brackets; < >) with your own dataset path and dataset_id (this is the filename for the dataset when stored in the Cloud):

poetry run python scripts/twinlab/upload_dataset.py <path/to/dataset.csv> <dataset_id>

For example, using the test biscuits dataset:

poetry run python scripts/twinlab/upload_dataset.py resources/datasets/biscuits.csv biscuits
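The dataset is a plain CSV file with one named column per variable. As a minimal illustration (a stand-in with one input X and one output y; the actual columns of the biscuits dataset may differ), such a file could look like:

```
X,y
1,1
2,4
3,9
```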

View dataset that has been uploaded to the Cloud:

poetry run python scripts/twinlab/view_dataset.py <dataset_id>
poetry run python scripts/twinlab/view_dataset.py biscuits

Summarise a dataset on the Cloud:

poetry run python scripts/twinlab/query_dataset.py <dataset_id>
poetry run python scripts/twinlab/query_dataset.py biscuits

List campaigns that you have uploaded to the Cloud:

poetry run python scripts/twinlab/list_campaigns.py

Train campaign on the Cloud:

poetry run python scripts/twinlab/train_campaign.py <path/to/parameters.json> <campaign_id> <processor>
poetry run python scripts/twinlab/train_campaign.py resources/campaigns/biscuits/params.json biscuits-campaign
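The parameters file is JSON that names the dataset and declares which columns are model inputs and outputs. A minimal sketch, using the same fields as the training dictionary in the full example at the end of this README (column names here are placeholders):

```json
{
    "dataset_id": "biscuits",
    "inputs": ["<input_column>"],
    "outputs": ["<output_column>"]
}
```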

View campaign details:

poetry run python scripts/twinlab/view_campaign.py <campaign_id>
poetry run python scripts/twinlab/view_campaign.py biscuits-campaign

Summarise trained campaign:

poetry run python scripts/twinlab/query_campaign.py <campaign_id>
poetry run python scripts/twinlab/query_campaign.py biscuits-campaign

Predict using a trained campaign:

poetry run python scripts/twinlab/predict_campaign.py <path/to/inputs.csv> <campaign_id> <method> <processor>
poetry run python scripts/twinlab/predict_campaign.py resources/campaigns/biscuits/eval.csv biscuits-campaign
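The inputs file is a CSV containing only the input columns of the trained model, one row per point to evaluate; predict_campaign then returns a mean and a standard deviation for each output. Assuming a single input column X (as in the full example below), a minimal inputs.csv could be:

```
X
1.5
2.5
3.5
```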

Delete a campaign from the Cloud:

poetry run python scripts/twinlab/delete_campaign.py <campaign_id>
poetry run python scripts/twinlab/delete_campaign.py biscuits-campaign

Delete a dataset from the Cloud:

poetry run python scripts/twinlab/delete_dataset.py <dataset_id>
poetry run python scripts/twinlab/delete_dataset.py biscuits

Full example

Here we create some mock data (which has a quadratic relationship between X and y) and use twinLab to create a surrogate model with quantified uncertainty.

# Import libraries
import twinlab as tl
import pandas as pd

# Create a dataset and upload to twinLab cloud
df = pd.DataFrame({"X": [1, 2, 3, 4], "y": [1, 4, 9, 16]})
tl.upload_dataset(df, "test-data")

# Train a machine-learning model for the data
params = {
    "dataset_id": "test-data",
    "inputs": ["X"],
    "outputs": ["y"],
}
tl.train_campaign(params, campaign_id="test-model")

# Evaluate the model on some unseen data
df = pd.DataFrame({"X": [1.5, 2.5, 3.5]})
df_mean, df_std = tl.predict_campaign(df, campaign_id="test-model")
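The mock training data above follows y = X² exactly, so the surrogate's mean prediction at the unseen points should land near the true curve. A quick local sanity check of that relationship (pure pandas, no cloud calls):

```python
import pandas as pd

# Reconstruct the training data used above
df = pd.DataFrame({"X": [1, 2, 3, 4], "y": [1, 4, 9, 16]})

# Confirm the quadratic relationship y = X**2 holds for every row
assert (df["y"] == df["X"] ** 2).all()

# True values at the unseen evaluation points, for comparison with df_mean
df_eval = pd.DataFrame({"X": [1.5, 2.5, 3.5]})
print((df_eval["X"] ** 2).tolist())  # → [2.25, 6.25, 12.25]
```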

Notebooks

Check out the notebooks directory for some additional examples to get started!

Documentation

See the live documentation at https://digilab-ai.github.io/twinLab-client/. Or build a copy locally:

cd docs
yarn install && yarn start

