
twinLab Client


Headless interface to the twinLab library.

Installation

Most users should install via pip:

pip install twinlab

If you want to modify the client-side code, or run from a local installation, you will need git, poetry, and Python 3.10 or higher. Then you can do:

git clone https://github.com/digiLab-ai/twinLab-client.git
cd twinlab-client
poetry install

Environment setup

You will need a .env file in your project directory that looks like the .env.example file in this repository:

cp .env.example .env

and fill in your twinLab user details.
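The scripts below assume this environment file exists. As a minimal sketch, you could fail fast before calling any script when it is missing (the helper name here is illustrative, not part of the client):

```python
from pathlib import Path

def check_env(path: str = ".env") -> bool:
    """Return True if the twinLab environment file exists (hypothetical helper)."""
    return Path(path).exists()

if not check_env():
    print("No .env found: run 'cp .env.example .env' and fill in your details")
```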

Commands

Pipeline

poetry run python scripts/twinlab/test.py

where test.py can be replaced with any of the scripts in the scripts directory.

Individual examples

Get user information:

poetry run python scripts/twinlab/get_user_information.py

Get version information:

poetry run python scripts/twinlab/get_versions.py

List datasets:

poetry run python scripts/twinlab/list_datasets.py

Upload dataset to the Cloud:

Fill in the arguments (between the angle brackets; < >) with your own dataset and dataset_id (this is the filename for the dataset when stored in the Cloud):

poetry run python scripts/twinlab/upload_dataset.py <path/to/dataset.csv> <dataset_id>

For example, using the test biscuits dataset:

poetry run python scripts/twinlab/upload_dataset.py resources/datasets/biscuits.csv biscuits
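A dataset is a plain CSV with one column per variable, including both the inputs and outputs you later want to model. A sketch of preparing such a file with pandas (the column names "X" and "y" follow the full example at the end of this README; yours will differ):

```python
import pandas as pd

# A minimal training dataset: one CSV column per variable.
df = pd.DataFrame({"X": [1, 2, 3, 4], "y": [1, 4, 9, 16]})
df.to_csv("dataset.csv", index=False)

# The header row is what twinLab uses to identify columns by name
print(pd.read_csv("dataset.csv").columns.tolist())  # ['X', 'y']
```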

View dataset that has been uploaded to the Cloud:

poetry run python scripts/twinlab/view_dataset.py <dataset_id>
poetry run python scripts/twinlab/view_dataset.py biscuits

Summarise a dataset on the Cloud:

poetry run python scripts/twinlab/query_dataset.py <dataset_id>
poetry run python scripts/twinlab/query_dataset.py biscuits

List campaigns that you have trained on the Cloud:

poetry run python scripts/twinlab/list_campaigns.py

Train campaign on the Cloud:

poetry run python scripts/twinlab/train_campaign.py <path/to/parameters.json> <campaign_id> <processor>
poetry run python scripts/twinlab/train_campaign.py resources/campaigns/biscuits/params.json biscuits-campaign
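The parameters file is a JSON document mirroring the structure of the params dictionary in the full example at the end of this README: a dataset_id plus the names of the input and output columns. A sketch of generating one (the column names here are placeholders, not the real columns of the biscuits dataset):

```python
import json

# Minimal training parameters: which dataset to use, and which of its
# columns are model inputs vs outputs. Column names are placeholders.
params = {
    "dataset_id": "biscuits",      # dataset previously uploaded to the Cloud
    "inputs": ["input_column"],    # input column names in the dataset CSV
    "outputs": ["output_column"],  # output column names in the dataset CSV
}

with open("params.json", "w") as f:
    json.dump(params, f, indent=4)

print(json.load(open("params.json"))["dataset_id"])  # biscuits
```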

View campaign details:

poetry run python scripts/twinlab/view_campaign.py <campaign_id>
poetry run python scripts/twinlab/view_campaign.py biscuits-campaign

Summarise trained campaign:

poetry run python scripts/twinlab/query_campaign.py <campaign_id>
poetry run python scripts/twinlab/query_campaign.py biscuits-campaign

Predict using a trained campaign:

poetry run python scripts/twinlab/predict_campaign.py <path/to/inputs.csv> <campaign_id> <method> <processor>
poetry run python scripts/twinlab/predict_campaign.py resources/campaigns/biscuits/eval.csv biscuits-campaign
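The inputs file is a CSV whose header matches the campaign's declared input columns, with one row per point at which you want a prediction. A sketch of writing one (the column name "X" follows the full example below; substitute your own):

```python
import pandas as pd

# Evaluation inputs: one row per prediction point, columns matching
# the "inputs" declared when the campaign was trained.
df = pd.DataFrame({"X": [1.5, 2.5, 3.5]})
df.to_csv("inputs.csv", index=False)

print(open("inputs.csv").read().splitlines()[0])  # X
```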

Delete a campaign from the Cloud:

poetry run python scripts/twinlab/delete_campaign.py <campaign_id>
poetry run python scripts/twinlab/delete_campaign.py biscuits-campaign

Delete a dataset from the Cloud:

poetry run python scripts/twinlab/delete_dataset.py <dataset_id>
poetry run python scripts/twinlab/delete_dataset.py biscuits

Full example

Here we create some mock data (which has a quadratic relationship between X and y) and use twinLab to create a surrogate model with quantified uncertainty.

# Import libraries
import twinlab as tl
import pandas as pd

# Create a dataset and upload to twinLab cloud
df = pd.DataFrame({"X": [1, 2, 3, 4], "y": [1, 4, 9, 16]})
tl.upload_dataset(df, "test-data")

# Train a machine-learning model for the data
params = {
    "dataset_id": "test-data",
    "inputs": ["X"],
    "outputs": ["y"],
}
tl.train_campaign(params, campaign_id="test-model")

# Evaluate the model on some unseen data
df = pd.DataFrame({"X": [1.5, 2.5, 3.5]})
df_mean, df_std = tl.predict_campaign(df, campaign_id="test-model")
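Since the mock data follows y = X squared exactly, a well-trained surrogate's mean predictions at the unseen inputs should lie close to the true values, with df_std quantifying the uncertainty between training points. A quick local check of the generating relationship:

```python
import pandas as pd

# The training data from the example above: y is exactly X squared.
df = pd.DataFrame({"X": [1, 2, 3, 4], "y": [1, 4, 9, 16]})
assert (df["y"] == df["X"] ** 2).all()

# True values at the unseen inputs, which df_mean should approximate
X_new = pd.Series([1.5, 2.5, 3.5])
print((X_new ** 2).tolist())  # [2.25, 6.25, 12.25]
```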

Notebooks

Check out the notebooks directory for some additional examples to get started!

Documentation

See the live documentation at https://digilab-ai.github.io/twinLab-client/. Or build a copy locally:

cd docs
yarn install && yarn start
