AI EasyMaker SDK for Python.

NHN AI EasyMaker SDK

# Initialize EasyMaker SDK
import easymaker

easymaker.init(
    appkey="EASYMAKER_APPKEY",
    region="kr1",
    secret_key="EASYMAKER_SECRET_KEY",
    experiment_id="EXPERIMENT_ID", # Optional
)
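
The credentials do not have to be hardcoded. A variation that reads them from environment variables (the variable names here are just a convention, not something the SDK requires):

import os

import easymaker

easymaker.init(
    appkey=os.environ["EASYMAKER_APPKEY"],  # set these in your shell or CI secret store
    region=os.environ.get("EASYMAKER_REGION", "kr1"),
    secret_key=os.environ["EASYMAKER_SECRET_KEY"],
)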

# Create Experiment
experiment = easymaker.Experiment().create(
    experiment_name="experiment_name",
    experiment_description="experiment_description",
    # wait=False
)

# Delete Experiment
experiment.delete()
easymaker.Experiment("experiment_id").delete()
easymaker.experiment.delete("experiment_id")

# Create Training
training = easymaker.Training().run(
    experiment_id=experiment.experiment_id, # Optional if already set in init
    training_name="training_name",
    training_description="training_description",
    train_image_name="Ubuntu 22.04 CPU TensorFlow Training",
    train_instance_name="m2.c4m8",
    distributed_node_count=1,
    data_storage_size=300,  # minimum size: 300GB
    source_dir_uri="obs://kr1-api-object-storage.nhncloudservice.com/v1/AUTH_{tenant_id}/{container_name}/{source_download_path}",
    entry_point="training_start.py",
    hyperparameter_list=[
        {
            "parameterName": "epochs",
            "parameterValue": "10",
        },
        {
            "parameterName": "batch-size",
            "parameterValue": "30",
        }
    ],
    timeout_hours=100, # 1~720
    model_upload_uri="obs://kr1-api-object-storage.nhncloudservice.com/v1/AUTH_{tenant_id}/{container_name}/{model_upload_path}",
    check_point_input_uri="obs://kr1-api-object-storage.nhncloudservice.com/v1/AUTH_{tenant_id}/{container_name}/{checkpoint_input_path}",
    check_point_upload_uri="obs://kr1-api-object-storage.nhncloudservice.com/v1/AUTH_{tenant_id}/{container_name}/{checkpoint_upload_path}",
    dataset_list=[
        {
            "datasetName": "train",
            "dataUri": "obs://kr1-api-object-storage.nhncloudservice.com/v1/AUTH_{tenant_id}/{container_name}/{train_data_download_path}"
        },
        {
            "datasetName": "test",
            "dataUri": "obs://kr1-api-object-storage.nhncloudservice.com/v1/AUTH_{tenant_id}/{container_name}/{test_data_download_path}"
        }
    ],
    tag_list=[  # maximum 10
        {
            "tagKey": "tag1",
            "tagValue": "test_tag_1",
        },
        {
            "tagKey": "tag2",
            "tagValue": "test_tag_2",
        }
    ],
    use_log=True,
    # wait=False
)
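
The hyperparameter names above are flag-style (epochs, batch-size), and each hyperparameter_list entry is delivered to the entry point as an execution argument (e.g. --epochs 10 --batch-size 30). A minimal sketch of training_start.py, assuming that delivery format; the training logic itself is yours to fill in:

# training_start.py -- minimal sketch; assumes hyperparameters arrive as CLI flags
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--epochs", type=int, default=10)
parser.add_argument("--batch-size", dest="batch_size", type=int, default=30)
args = parser.parse_args()

print(f"training for {args.epochs} epochs, batch size {args.batch_size}")
# ... load the datasets, train, then save the model artifacts so EasyMaker
# can upload them to model_upload_uri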

# Create Training By Algorithm (Image Classification)
training = easymaker.Training().run(
    experiment_id=experiment.experiment_id, # Optional if already set in init
    training_name="image_classification",
    training_description="easymaker sdk test training",
    train_image_name="Image Classification CPU",
    train_instance_name="m2.c4m8",
    distributed_node_count=1,
    algorithm_name="Image Classification",
    data_storage_size=300,  # minimum size: 300GB
    hyperparameter_list=[
        {
            "parameterName": "input_size",
            "parameterValue": "28",
        },
        {
            "parameterName": "learning_rate",
            "parameterValue": "0.1",
        },
        {
            "parameterName": "per_device_train_batch_size",
            "parameterValue": "16",
        },
        {
            "parameterName": "per_device_eval_batch_size",
            "parameterValue": "16",
        },
        {
            "parameterName": "num_train_epochs",
            "parameterValue": "3",
        }
    ],
    timeout_hours=1,
    model_upload_uri="obs://api-storage.cloud.toast.com/v1/AUTH_{tenant_id}/{container_name}/{model_upload_path}",
    check_point_upload_uri="obs://api-storage.cloud.toast.com/v1/AUTH_{tenant_id}/{container_name}/{checkpoint_upload_path}",
    dataset_list=[
        {
            "datasetName": "train",
            "dataUri": "obs://api-storage.cloud.toast.com/v1/AUTH_{tenant_id}/{container_name}/{train_data_download_path}"
        },
        {
            "datasetName": "validation",
            "dataUri": "obs://api-storage.cloud.toast.com/v1/AUTH_{tenant_id}/{container_name}/{validation_data_download_path}"
        }
    ],
    tag_list=[  # maximum 10
        {
            "tagKey": "tag1",
            "tagValue": "test_tag_1",
        },
        {
            "tagKey": "tag2",
            "tagValue": "test_tag_2",
        }
    ],
    use_torchrun=True,
    nproc_per_node=1,
    use_log=True,
    # wait=False
)

# Delete Training
training.delete()
easymaker.Training("training_id").delete()
easymaker.training.delete("training_id")

# Create Hyperparameter Tuning
hyperparameter_tuning = easymaker.HyperparameterTuning().run(
    experiment_id=experiment.experiment_id, # Optional if already set in init
    hyperparameter_tuning_name="hyperparameter_tuning_name",
    hyperparameter_tuning_description="hyperparameter_tuning_description",
    image_name="Ubuntu 22.04 CPU TensorFlow Training",
    instance_name="m2.c8m16",
    distributed_node_count=1,
    parallel_trial_count=1,
    data_storage_size=300,
    source_dir_uri="obs://kr1-api-object-storage.nhncloudservice.com/v1/AUTH_{tenant_id}/{container_name}/{source_download_path}",
    entry_point="training_start.py",
    hyperparameter_spec_list=[
        {
            "hyperparameterName": "learning_rate",
            "hyperparameterTypeCode": easymaker.HYPERPARAMETER_TYPE_CODE.DOUBLE,
            "hyperparameterMinValue": "0.01",
            "hyperparameterMaxValue": "0.05",
        },
        {
            "hyperparameterName": "epochs",
            "hyperparameterTypeCode": easymaker.HYPERPARAMETER_TYPE_CODE.INT,
            "hyperparameterMinValue": "100",
            "hyperparameterMaxValue": "1000",
        }
    ],
    timeout_hours=10,
    model_upload_uri="obs://kr1-api-object-storage.nhncloudservice.com/v1/AUTH_{tenant_id}/{container_name}/{model_upload_path}",
    check_point_input_uri="obs://kr1-api-object-storage.nhncloudservice.com/v1/AUTH_{tenant_id}/{container_name}/{checkpoint_input_path}",
    check_point_upload_uri="obs://kr1-api-object-storage.nhncloudservice.com/v1/AUTH_{tenant_id}/{container_name}/{checkpoint_upload_path}",
    dataset_list=[
        {
            "datasetName": "train",
            "dataUri": "obs://kr1-api-object-storage.nhncloudservice.com/v1/AUTH_{tenant_id}/{container_name}/{train_data_download_path}"
        },
        {
            "datasetName": "test",
            "dataUri": "obs://kr1-api-object-storage.nhncloudservice.com/v1/AUTH_{tenant_id}/{container_name}/{test_data_download_path}"
        }
    ],
    metric_list=["val_loss", "loss", "accuracy"],
    metric_regex=r"([\w|-]+)\s*:\s*([+-]?\d*(\.\d+)?([Ee][+-]?\d+)?)",
    objective_metric_name="val_loss",
    objective_type_code=easymaker.OBJECTIVE_TYPE_CODE.MINIMIZE,
    objective_goal=0.01,
    max_failed_trial_count=3,
    max_trial_count=10,
    tuning_strategy_name=easymaker.TUNING_STRATEGY.BAYESIAN_OPTIMIZATION,
    tuning_strategy_random_state=1,
    early_stopping_algorithm=easymaker.EARLY_STOPPING_ALGORITHM.MEDIAN,
    early_stopping_min_trial_count=3,
    early_stopping_start_step=4,
    tag_list=[
        {
            "tagKey": "tag1",
            "tagValue": "test_tag_1",
        }
    ],
    use_log=True,
    # wait=False,
)
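
metric_regex is an ordinary Python regular expression applied to training log lines: the first capture group is the metric name, the second its value. You can sanity-check the pattern locally before launching a tuning job:

import re

# Same pattern as metric_regex above, written as a raw string.
pattern = re.compile(r"([\w|-]+)\s*:\s*([+-]?\d*(\.\d+)?([Ee][+-]?\d+)?)")

m = pattern.search("val_loss: 0.0123")
print(m.group(1), m.group(2))  # -> val_loss 0.0123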

# Create Hyperparameter Tuning By Algorithm (Image Classification)
hyperparameter_tuning = easymaker.HyperparameterTuning().run(
    experiment_id=experiment.experiment_id, # Optional if already set in init
    hyperparameter_tuning_name="hyperparameter_tuning_name",
    algorithm_name="Image Classification",
    image_name="Image Classification CPU",
    instance_name="m2.c2m4",
    distributed_node_count=1,
    parallel_trial_count=1,
    data_storage_size=300,
    hyperparameter_spec_list=[
        {
            "hyperparameterName": "input_size",
            "hyperparameterTypeCode": easymaker.HYPERPARAMETER_TYPE_CODE.DOUBLE,
            "hyperparameterMinValue": "4",
            "hyperparameterMaxValue": "6",
            "hyperparameterStep": "1",
        },
        {
            "hyperparameterName": "learning_rate",
            "hyperparameterTypeCode": easymaker.HYPERPARAMETER_TYPE_CODE.DOUBLE,
            "hyperparameterMinValue": "0",
            "hyperparameterMaxValue": "0.5",
            "hyperparameterStep": "0.1",
        },
        {
            "hyperparameterName": "per_device_train_batch_size",
            "hyperparameterTypeCode": easymaker.HYPERPARAMETER_TYPE_CODE.INT,
            "hyperparameterMinValue": "2",
            "hyperparameterMaxValue": "5",
            "hyperparameterStep": "1",
        },
        {
            "hyperparameterName": "per_device_eval_batch_size",
            "hyperparameterTypeCode": easymaker.HYPERPARAMETER_TYPE_CODE.INT,
            "hyperparameterMinValue": "2",
            "hyperparameterMaxValue": "5",
            "hyperparameterStep": "1",
        },
        {
            "hyperparameterName": "num_train_epochs",
            "hyperparameterTypeCode": easymaker.HYPERPARAMETER_TYPE_CODE.INT,
            "hyperparameterMinValue": "2",
            "hyperparameterMaxValue": "5",
            "hyperparameterStep": "1",
        },
        {
            "hyperparameterName": "save_steps",
            "hyperparameterTypeCode": easymaker.HYPERPARAMETER_TYPE_CODE.INT,
            "hyperparameterMinValue": "1",
            "hyperparameterMaxValue": "1",
            "hyperparameterStep": "1",
        },
        {
            "hyperparameterName": "logging_steps",
            "hyperparameterTypeCode": easymaker.HYPERPARAMETER_TYPE_CODE.INT,
            "hyperparameterMinValue": "1",
            "hyperparameterMaxValue": "1",
            "hyperparameterStep": "1",
        }
    ],
    timeout_hours=1,
    model_upload_uri="obs://api-storage.cloud.toast.com/v1/AUTH_{tenant_id}/{container_name}/{model_upload_path}",
    check_point_upload_uri="obs://api-storage.cloud.toast.com/v1/AUTH_{tenant_id}/{container_name}/{checkpoint_upload_path}",
    dataset_list=[
        {
            "datasetName": "train",
            "dataUri": "obs://api-storage.cloud.toast.com/v1/AUTH_{tenant_id}/{container_name}/{train_data_download_path}"
        },
        {
            "datasetName": "validation",
            "dataUri": "obs://api-storage.cloud.toast.com/v1/AUTH_{tenant_id}/{container_name}/{validation_data_download_path}"
        }
    ],
    tag_list=[
        {
            "tagKey": "tag1",
            "tagValue": "test_tag_1",
        }
    ],
    objective_goal=1,
    max_failed_trial_count=2,
    max_trial_count=3,
    tuning_strategy_name=easymaker.TUNING_STRATEGY.GRID,
    tuning_strategy_random_state=1,
    early_stopping_algorithm=easymaker.EARLY_STOPPING_ALGORITHM.MEDIAN,
    early_stopping_min_trial_count=3,
    early_stopping_start_step=4,
    use_log=True,
    use_torchrun=True,
    nproc_per_node=1,
    # wait=False,
)

# Delete Hyperparameter Tuning
hyperparameter_tuning.delete()
easymaker.HyperparameterTuning("hyperparameter_tuning_id").delete()
easymaker.hyperparameter_tuning.delete("hyperparameter_tuning_id")

# Create Model
model = easymaker.Model().create(
    training_id=training.training_id,  # or hyperparameter_tuning_id=hyperparameter_tuning.hyperparameter_tuning_id,
    model_name="model_name",
    model_description="model_description",
)
model2 = easymaker.Model().create_by_model_upload_uri(
    model_type_code=easymaker.TENSORFLOW,
    model_upload_uri="obs://kr1-api-object-storage.nhncloudservice.com/v1/AUTH_{tenant_id}/{container_name}/{model_upload_path}",
    model_name="model_name",
    model_description="model_description",
)

# Create Hugging Face Model
model3 = easymaker.Model().create_hugging_face_model(
    model_name="model_name",
    parameter_list=[
        {
            "parameterName": "model_id",
            "parameterValue": "google-bert/bert-base-uncased",
        }
    ],
    model_description="model_description",
    tag_list=[],
)
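
The model_id parameter value is a Hugging Face Hub repository id. To fail fast on a typo before creating the model, the separate huggingface_hub package can resolve the id first (purely illustrative, not an EasyMaker API):

# Optional pre-check; requires `pip install huggingface_hub`.
from huggingface_hub import model_info

info = model_info("google-bert/bert-base-uncased")  # raises if the repo id does not exist
print(info.id)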

# Delete Model
model.delete()
easymaker.Model("model_id").delete()
easymaker.model.delete("model_id")

# Create Endpoint
endpoint = easymaker.Endpoint().create(
    model_id=model.model_id,
    endpoint_name="endpoint_name",
    endpoint_description="endpoint_description",
    endpoint_instance_name="c2.c16m16",
    endpoint_model_resource_list=[
        {
            "modelId": model.model_id,
            "resourceOptionDetail": {
                "cpu": "15",
                "memory": "15Gi",
            },
            "description": "test",
        }
    ],
    use_log=True,
    # wait=False,
    # autoscaler_enable=True,  # default False
    # autoscaler_min_node_count=1,
    # autoscaler_max_node_count=10,
    # autoscaler_scale_down_enable=True,
    # autoscaler_scale_down_util_threshold=50,
    # autoscaler_scale_down_unneeded_time=10,
    # autoscaler_scale_down_delay_after_add=10,
)

# Delete Endpoint
endpoint.delete()
easymaker.Endpoint("endpoint_id").delete()
easymaker.endpoint.delete_endpoint("endpoint_id")

# Create Endpoint Stage
endpoint_stage = easymaker.EndpointStage().create(
    stage_name="stage01",
    endpoint_id=endpoint.endpoint_id,
    stage_description="test endpoint",
    endpoint_instance_name="c2.c16m16",
    endpoint_model_resource_list=[
        {
            "modelId": model.model_id,
            "resourceOptionDetail": {
                "cpu": "15",
                "memory": "15Gi",
            },
            "description": "test",
        }
    ],
    endpoint_instance_count=1,
    # wait=False,
    # autoscaler_enable=True,  # default False
    # autoscaler_min_node_count=1,
    # autoscaler_max_node_count=10,
    # autoscaler_scale_down_enable=True,
    # autoscaler_scale_down_util_threshold=50,
    # autoscaler_scale_down_unneeded_time=10,
    # autoscaler_scale_down_delay_after_add=10,
)

# Delete Endpoint Stage
endpoint_stage.delete()
easymaker.EndpointStage("endpoint_stage_id").delete()
easymaker.endpoint.delete_endpoint_stage("endpoint_stage_id")

# Inference
easymaker.EndpointStage("endpoint_stage_id").predict(model_id=model.model_id, json={"instances": [[6.8, 2.8, 4.8, 1.4]]})

# Delete Endpoint Model
easymaker.EndpointModel("endpoint_model_id").delete()
easymaker.endpoint.delete_endpoint_model("endpoint_model_id")

# Batch Inference
batch_inference = easymaker.BatchInference().run(
    batch_inference_name="test_batch_2",
    instance_count=1,
    timeout_hours=720,
    instance_name="m2.c2m4",
    model_name="model_create_test3",
    #
    pod_count=1,
    batch_size=120,
    inference_timeout_seconds=120,
    #
    input_data_uri="obs://kr1-api-object-storage.nhncloudservice.com/v1/AUTH_{tenant_id}/{container_name}/{input_data_path}",
    input_data_type="JSONL",
    include_glob_pattern=None,
    exclude_glob_pattern=None,
    output_upload_uri="obs://kr1-api-object-storage.nhncloudservice.com/v1/AUTH_{tenant_id}/{container_name}/{output_upload_path}",
    #
    data_storage_size=300,
    #
    description=None,
    tag_list=None,
    use_log=False,
    wait=True,
)
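
With input_data_type="JSONL", input objects are read one JSON document per line. The record schema depends on the model being served; a sketch that writes rows shaped like the inference payload in the endpoint example above (adjust to what your model actually expects):

import json

# Hypothetical records matching the {"instances": [[...]]} example payload.
records = [[6.8, 2.8, 4.8, 1.4], [5.1, 3.5, 1.4, 0.2]]

with open("input.jsonl", "w") as f:
    for record in records:
        f.write(json.dumps(record) + "\n")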

# Delete Batch Inference
batch_inference.delete()
easymaker.BatchInference("batch_inference_id").delete()
easymaker.batch_inference.delete("batch_inference_id")

# Upload Pipeline
pipeline = easymaker.Pipeline().upload(
    pipeline_name="pipeline_01",
    pipeline_spec_manifest_path="./sample-pipeline.yaml",
    description="test",
    tag_list=[],
)
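
pipeline_spec_manifest_path points at a pipeline spec YAML file. If you author pipelines with the Kubeflow Pipelines SDK (an assumption; confirm that your EasyMaker environment accepts KFP-compiled specs), the manifest can be generated like this:

# Hypothetical authoring step with the kfp SDK (pip install kfp);
# EasyMaker only needs the resulting YAML file.
from kfp import compiler, dsl

@dsl.component
def say_hello() -> str:
    return "hello"

@dsl.pipeline(name="sample-pipeline")
def sample_pipeline():
    say_hello()

compiler.Compiler().compile(sample_pipeline, "sample-pipeline.yaml")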

# Delete Pipeline
pipeline.delete()
easymaker.Pipeline("pipeline_id").delete()
easymaker.pipeline.delete("pipeline_id")

# Create Pipeline Run
pipeline_run = easymaker.PipelineRun().create(
    pipeline_run_name="pipeline_run",
    description="test",
    pipeline_id=pipeline.pipeline_id,
    experiment_id=experiment.experiment_id, # Optional if already set in init
    instance_name="m2.c2m4",
    instance_count=1,
    boot_storage_size=50,
)

# Delete Pipeline Run
easymaker.PipelineRun("pipeline_run_id").delete()
easymaker.pipeline_run.delete("pipeline_run_id")

# Create Pipeline Recurring Run
pipeline_recurring_run = easymaker.PipelineRecurringRun().create(
    pipeline_recurring_run_name="pipeline_recurring_run",
    description="test",
    pipeline_id=pipeline.pipeline_id,
    experiment_id=experiment.experiment_id, # Optional if already set in init
    instance_name="m2.c2m4",
    instance_count=1,
    boot_storage_size=50,
    schedule_periodic_minutes=60,
    max_concurrency_count=1,
)

# Stop Pipeline Recurring Run
pipeline_recurring_run.stop()

# Start Pipeline Recurring Run
pipeline_recurring_run.start()

# Delete Pipeline Recurring Run
pipeline_recurring_run.delete()
easymaker.PipelineRecurringRun("pipeline_recurring_run_id").delete()
easymaker.pipeline_recurring_run.delete("pipeline_recurring_run_id")

# Log (Log & Crash)
easymaker_logger = easymaker.logger(logncrash_appkey="log&crash_product_app_key")
easymaker_logger.send("test log meassage")  # Output to stdout & send log to log&crash product
easymaker_logger.send(log_message="log meassage",
                      log_level="INFO",  # default: INFO
                      project_version="1.0.0",  # default: 1.0.0
                      parameters={"serviceType": "EasyMakerSample"})  # Add custom parameters
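
If your code already logs through Python's standard logging module, a small Handler can forward records to Log & Crash via the SDK logger above. A sketch, not an official integration:

import logging

class EasyMakerLogHandler(logging.Handler):
    """Forward standard logging records through easymaker_logger.send()."""

    def __init__(self, em_logger):
        super().__init__()
        self.em_logger = em_logger

    def emit(self, record):
        self.em_logger.send(log_message=self.format(record), log_level=record.levelname)

logger = logging.getLogger("train")
logger.setLevel(logging.INFO)
logger.addHandler(EasyMakerLogHandler(easymaker_logger))
logger.info("epoch finished")  # forwarded to Log & Crash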


# NHN Cloud ObjectStorage download, upload, delete
easymaker_obs = easymaker.ObjectStorage(
    easymaker_region="kr1",
    username="username@nhn.com",
    password="nhn_object_storage_api_password"
)

easymaker_obs.download(
    easymaker_obs_uri="obs://api-storage.cloud.toast.com/v1/AUTH_{tenant_id}/{container_name}/{source_dir}",
    download_dir_path="./source_dir",
)

easymaker_obs.upload(
    easymaker_obs_uri="obs://api-storage.cloud.toast.com/v1/AUTH_{tenant_id}/{container_name}/{upload_path}",
    local_path="./local_dir",
)

easymaker_obs.delete(
    easymaker_obs_uri="obs://api-storage.cloud.toast.com/v1/AUTH_{tenant_id}/{container_name}/{object_path}",
    # file_extension=".json",  # Delete files with specific extensions
)

CLI Commands

  • instance type list: python -m easymaker -instance --region kr1 --appkey EM_APPKEY --secret_key EM_SECRET_KEY
  • image list: python -m easymaker -image --region kr1 --appkey EM_APPKEY --secret_key EM_SECRET_KEY
  • experiment list: python -m easymaker -experiment --region kr1 --appkey EM_APPKEY --secret_key EM_SECRET_KEY
  • training list: python -m easymaker -training --region kr1 --appkey EM_APPKEY --secret_key EM_SECRET_KEY
  • hyperparameter tuning list: python -m easymaker -tuning --region kr1 --appkey EM_APPKEY --secret_key EM_SECRET_KEY
  • model list: python -m easymaker -model --region kr1 --appkey EM_APPKEY --secret_key EM_SECRET_KEY
  • endpoint list: python -m easymaker -endpoint --region kr1 --appkey EM_APPKEY --secret_key EM_SECRET_KEY
