
Python SDK for Instill AI products

Project description


[!IMPORTANT]
This SDK is under active development.
For any bugs or feature requests, feel free to open an issue regarding this SDK in our instill-core repo.

Overview

Welcome to the Instill Python SDK, where the world of AI-first applications comes alive in Python.

Before you jump into creating your first application with this SDK, we recommend getting familiar with the core concepts of Instill products first. You can check out our documentation here:

Setup

[!NOTE]
To set up the development environment, please refer to Contributing.

Requirements

  • Python 3.8 - 3.11

Installation

[!WARNING]
If your host machine is on the arm64 architecture (including Apple silicon machines with M1/M2 processors), there are known issues when installing grpcio within a conda environment. You will have to build and install it manually, as shown below. Read more about this issue here.

$ GRPC_PYTHON_LDFLAGS=" -framework CoreFoundation" pip install grpcio --no-binary :all:

Install it directly into an activated virtual environment:

$ pip install instill-sdk

or add it to your Poetry project:

$ poetry add instill-sdk

Check import

After installation, you can check if it has been installed correctly:

$ python
>>> import instill
>>> instill.__version__

Configure an Instill Core or Instill Cloud instance

Before you can start using this SDK, you will need to configure your target instance. There are two ways to set up the configuration:

Config file

Create a config file at ${HOME}/.config/instill/sdk/python/config.yml, and within it fill in some basic parameters for your desired host.[^1]

[^1]: You can obtain an api_token from the Settings > API Tokens page in the console, on either Instill Core or Instill Cloud.

Within the config file, you can define multiple instances under aliases of your choosing; later, you can refer to these aliases in the SDK to switch between instances.[^2]

[^2]: The SDK looks for an instance named default first, and falls back to the first instance entry in the config file if default is not found.

hosts:
  alias1:
    url:    str
    secure: bool
    token:  str
  alias2:
    url:    str
    secure: bool
    token:  str
  ...
  ...

Example:

hosts:
  default:
    url: localhost:8080
    secure: false
    token: instill_sk***
  cloud:
    url: api.instill.tech
    secure: true
    token: instill_sk***

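The alias-selection behavior described in the footnote can be sketched as follows. This is an illustrative reimplementation for clarity, not the SDK's actual code:

```python
def select_host(hosts: dict, alias: str = "default") -> dict:
    """Pick the requested alias; fall back to the first entry if it is missing.

    Mirrors the documented behavior: the SDK looks for an instance named
    "default" first, then falls back to the first entry in the config file.
    """
    if alias in hosts:
        return hosts[alias]
    return next(iter(hosts.values()))


hosts = {
    "default": {"url": "localhost:8080", "secure": False, "token": "instill_sk***"},
    "cloud": {"url": "api.instill.tech", "secure": True, "token": "instill_sk***"},
}

print(select_host(hosts)["url"])           # localhost:8080
print(select_host(hosts, "cloud")["url"])  # api.instill.tech
```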
At runtime

If you would rather not create a config file, you can also set up your target instance at the very beginning of your script:

from instill.configuration import global_config

global_config.set_default(
    url="api.instill.tech",
    token="instill_sk***",
    secure=True,
)

Usage

Before we get into this, please make sure a local instance of Instill VDP and Instill Model is running, and that the config file has been populated with the correct url and api_token.

Let's get started!

Import packages

Start by importing the unified client:

from instill.clients import InstillClient

Get the client

Get the unified client that connects to all the available services offered by Instill VDP and Instill Model, including:

  • mgmt_service
  • pipeline_service
  • model_service
  • artifact_service

client = InstillClient()

user = client.mgmt_service.get_user()
# name: "users/admin"
# uid: "4767b74d-640a-4cdf-9c6d-7bb0e36098a0"
# id: "admin"
# ...
# ...

You can find more usage examples for this SDK here.

You can also find some notebook examples here.

Create a model

Now create a text-generation model in Instill Model for later use:

import instill.protogen.common.task.v1alpha.task_pb2 as task_interface

model_id = "model_text-generation"
client.model_service.create_model(
    model_id,                                  # model ID
    task_interface.Task.TASK_TEXT_GENERATION,  # AI task type
    "REGION_GCP_EUROPE_WEST4",                 # deployment region
    "CPU",                                     # hardware type
    "model-definitions/container",             # model definition
    {},                                        # model configuration
)

Build and deploy the model

Instill Model is an advanced MLOps/LLMOps platform specifically crafted to facilitate the efficient management and orchestration of model deployments for unstructured data ETL. With Instill Model, you can easily create, manage, and deploy your own custom models in Instill Core or on the cloud with Instill Cloud.

Follow the instructions here to build and deploy your model.

Create pipeline

In this section we will create a pipeline using the Python SDK to harness the power of Instill VDP!

The pipeline recipe below is a sample for this demo. It simply returns the input string value.

pipeline_id = "pipeline_demo"
client.pipeline_service.create_pipeline(
    pipeline_id,
    "this is a pipeline for demo",
    {
        "output": {"result": {"title": "result", "value": "${variable.input}"}},
        "variable": {"input": {"instillFormat": "string", "title": "input"}},
    },
)
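To see what this recipe does, here is a minimal, illustrative resolver (not part of the SDK) that substitutes a `${variable.<name>}` reference in the output section with the value of the corresponding pipeline variable, the way the sample recipe wires `result` to `input`:

```python
import re

# The sample recipe from above: one string variable, echoed straight to the output.
recipe = {
    "output": {"result": {"title": "result", "value": "${variable.input}"}},
    "variable": {"input": {"instillFormat": "string", "title": "input"}},
}


def resolve(recipe: dict, variables: dict) -> dict:
    """Substitute ${variable.<name>} references in output values (illustrative only)."""
    out = {}
    for key, field in recipe["output"].items():
        out[key] = re.sub(
            r"\$\{variable\.(\w+)\}",
            lambda m: str(variables[m.group(1)]),
            field["value"],
        )
    return out


print(resolve(recipe, {"input": "hello world"}))  # {'result': 'hello world'}
```

This mirrors why triggering the pipeline with "hello world" returns "hello world" unchanged.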

Validate the pipeline

Before triggering the pipeline, it is recommended to validate the pipeline recipe:

# validate the pipeline recipe
client.pipeline_service.validate_pipeline(pipeline_id)

Trigger the pipeline

Now that the pipeline is ready, let us test it by triggering it:

# we can trigger the pipeline now
client.pipeline_service.trigger_pipeline(pipeline_id, [], [{"input": "hello world"}])

And the output should be exactly the same as your input.

Contributing

Please refer to the Contributing Guidelines for more details.

Community support

Please refer to the community repository.

License

See the LICENSE file for licensing information.
