
This is the API client for the Open Innovation Platform - MLOps.

Project description

Open Innovation MLOps Client API

Welcome to the Open Innovation MLOps Client API documentation! This guide offers detailed instructions on how to install, set up, and use the client library.

Installation and Setup

To use the Open Innovation MLOps Client in your project, follow these steps:

  1. Install the client library: Use the following pip command to add the Open Innovation MLOps Client to your Python environment:

    pip install oip-mlops-client
    
  2. Import the library: Add the following import statement to your Python script to gain access to the MLOps class from the client library:

    from oip_mlops_client.mlops import MLOps
    

Client Initialization and Tracking

To start using the Open Innovation MLOps Client in your application, you need to initialize it with specific details about your environment:

api_host = "api_host"
username = "your_username"
password = "your_password"
workspace_name = "target_workspace_name"
MLOps.connect(api_host, username, password, workspace_name)

Parameters

  • api_host (str, required): The hostname of the API server.
  • username (str, required): Your username.
  • password (str, required): Your password.
  • workspace_name (str, required): The name of the workspace you want to connect to. This workspace should already exist on the platform's user interface (UI).

After initializing the client and establishing the connection, you can specify the experiment you want to track:

experiment_name = "target_experiment_name"
MLOps.set_experiment(experiment_name)

Parameters

  • experiment_name (str, required): The name of the experiment you want to track. This experiment should already exist on the platform's UI.

Once the target experiment is set, the API client is ready to start tracking your runs.

Compatibility with MLflow for Tracking

The MLOps client provides access to all the methods available through the MLflow API. For instance:

MLOps.autolog() is equivalent to mlflow.autolog()

For more information about the available methods, refer to the MLflow official documentation.
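
Because the client mirrors the MLflow tracking API, a typical run can be logged with the familiar MLflow-style calls. The sketch below is illustrative only: it assumes that methods such as start_run, log_param, and log_metric behave exactly like their mlflow counterparts.

# Start a run in the experiment selected with set_experiment
# (assumed to mirror mlflow.start_run)
with MLOps.start_run(run_name="baseline_model"):
    # Log hyperparameters and metrics as you would with mlflow
    MLOps.log_param("learning_rate", 0.01)
    MLOps.log_metric("accuracy", 0.93)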

Advanced Artifacts Tracking

In addition to all the features offered by MLflow, our solution also enables tracking of specific artifact types, such as images, audio, video, figures, text, and JSON, for more detailed analysis of machine learning training tasks. This allows for sophisticated and advanced analytics and visualizations through our platform UI.

Image Tracking

The log_image_at_step method accepts an image as a numpy.ndarray or a PIL.Image.Image:

from PIL import Image

# Load or create your image as numpy.ndarray or PIL.Image
image_data = Image.open("test_image.jpg")

# Log image at specific step
extra = {"description": "test image"}
MLOps.log_image_at_step(image_data, 'image_file.jpg', 1, extra=extra)

Please note that for images, you can directly pass the path to the image file.
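
For example, the following sketch logs the same image directly from its file path (reusing the signature shown above):

# Log an image from its file path instead of an in-memory object
MLOps.log_image_at_step("test_image.jpg", 'image_file.jpg', 1, extra=extra)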

Audio Tracking

The log_audio_at_step method accepts audio data as a numpy.ndarray:

import numpy as np

# Create or load your audio data as a numpy array
audio_data = np.random.random(1000)

# Log audio at specific step
MLOps.log_audio_at_step(audio_data, 'audio_file.wav', 1, rate=44100)

The audio_data should be a numpy array. If the audio is stereo, the array should be 2-dimensional.
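
For example, a stereo clip could be logged from a 2-dimensional array as sketched below; the channels-first layout used here is an assumption, so check which axis convention your platform expects.

# Two channels of 1000 samples each (stereo), channels-first
stereo_audio = np.random.random((2, 1000))

# Log the stereo audio at a specific step
MLOps.log_audio_at_step(stereo_audio, 'stereo_audio_file.wav', 1, rate=44100)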

Text Tracking

The log_text_at_step method accepts text as a str:

# Log text at specific step
text_data = "This is a sample text."
MLOps.log_text_at_step(text_data, 'text_file.txt', 1)

Figure Tracking

The log_figure_at_step method accepts a figure as a matplotlib.figure.Figure or plotly.graph_objects.Figure:

import matplotlib.pyplot as plt

# Create a figure
fig, ax = plt.subplots()
ax.plot([1, 2, 3, 4], [1, 4, 2, 3])

# Log figure at specific step
MLOps.log_figure_at_step(fig, 'figure_file.jpg', 1)

The fig should be a matplotlib.figure.Figure or plotly.graph_objects.Figure object.
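
Since plotly figures are also accepted, a plotly version of the same call might look like the sketch below; the .html file name is an assumption, following the convention MLflow uses for plotly figures.

import plotly.graph_objects as go

# Create a plotly figure
plotly_fig = go.Figure(data=go.Scatter(x=[1, 2, 3, 4], y=[1, 4, 2, 3]))

# Log the plotly figure at a specific step
MLOps.log_figure_at_step(plotly_fig, 'figure_file.html', 1)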

JSON Tracking

The log_dict_at_step method accepts a dictionary or list to be logged as JSON:

# Log dictionary at specific step
dict_data = {"key1": "value1", "key2": "value2"}
MLOps.log_dict_at_step(dict_data, 'dict_file.json', 1)

The dictionary or list will be saved as a JSON file.
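
A list is handled the same way; for example:

# Log a list at a specific step; it is also serialized as JSON
list_data = ["value1", "value2", "value3"]
MLOps.log_dict_at_step(list_data, 'list_file.json', 1)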

Extra Parameters

Note: All log_*_at_step methods accept an optional extra parameter (of type dict), which can be used to log additional metadata about the artifact, and a file_name parameter (of type str) that specifies the name of the artifact file.

The extra parameter should be a dictionary with string keys. The values can be of types int, float, str, bool, list, or None.

extra = {"description": "This is a description of the artifact."}

In the case of log_audio_at_step, there is also a rate parameter (of type int) to specify the sample rate of the audio data:

MLOps.log_audio_at_step(audio_data, 'audio_file.wav', 1, rate=44100, extra=extra)
