Python SDK for Instill AI products
[!IMPORTANT]
This SDK tool is under active development.
For any bug or feature request, feel free to open an issue regarding this SDK in our instill-core repo.
Overview
Welcome to the Instill Python SDK, where the world of AI-first applications comes alive in Python.
Before you jump into creating your first application with this SDK tool, we recommend getting familiar with the core concepts of Instill Product first. You can check out our documentation here.
Setup
[!NOTE]
For setting up the development environment, please refer to the Contributing Guidelines.
Requirements
- Python 3.8 - 3.11
Installation
[!WARNING]
If your host machine is on the arm64 architecture (including Apple silicon machines with M1/M2 processors), there are some issues when installing grpcio within a conda environment. You will have to manually build and install it as shown below. Read more about this issue here.
```
$ GRPC_PYTHON_LDFLAGS=" -framework CoreFoundation" pip install grpcio --no-binary :all:
```
Install it directly into an activated virtual environment:
```
$ pip install instill-sdk
```
or add it to your Poetry project:
```
$ poetry add instill-sdk
```
Check import
After installation, you can check if it has been installed correctly:
```python
$ python
>>> import instill
>>> instill.__version__
```
Configure your Instill Core or Instill Cloud instance
Before you can start using this SDK, you will need to properly configure your target instance. There are two ways to set up the configuration:
Config file
Create a config file at ${HOME}/.config/instill/sdk/python/config.yml, and within that file fill in some basic parameters for your desired host.[^1]
[^1]: You can obtain an api_token by simply going to the Settings > API Tokens page from the console, whether it is Instill Core or Instill Cloud.
Within the config file, you can define multiple instances, each with an alias of your liking; later in the SDK you can refer to that alias to switch between instances.[^2]
[^2]: The SDK looks for an instance named default first, and falls back to the first instance entry in the config file if default is not found.
```yaml
hosts:
  alias1:
    url: str
    secure: bool
    token: str
  alias2:
    url: str
    secure: bool
    token: str
  ...
  ...
```
Example:
```yaml
hosts:
  default:
    url: localhost:8080
    secure: false
    token: instill_sk***
  cloud:
    url: api.instill.tech
    secure: true
    token: instill_sk***
```
At runtime
If you do not like the idea of having to create a config file, you can also set up your target instance by doing the following at the very beginning of your script.
```python
from instill.configuration import global_config

global_config.set_default(
    url="api.instill.tech",
    token="instill_sk***",
    secure=True,
)
```
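Putting the pieces together, a minimal sketch (based on the snippets in this README) of a script that configures the target instance at runtime and then creates the unified client, introduced in the Usage section below, might start like this:

```python
from instill.configuration import global_config
from instill.clients import InstillClient

# Set the target instance before any client is created,
# so the client picks up this runtime configuration.
global_config.set_default(
    url="api.instill.tech",
    token="instill_sk***",
    secure=True,
)

client = InstillClient()
```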
Usage
Before we get into this, please make sure a local instance of Instill VDP and Instill Model is running, and that the config file has been populated with the correct url and api_token.
Let's get started!
Import packages
To form a pipeline, a start operator and an end operator are required; helper functions are available to create both. Start by importing the client:
```python
from instill.clients import InstillClient
```
Get the client
Get the unified client that connects to all the available services offered by Instill VDP and Instill Model, including:
- mgmt_service
- pipeline_service
- model_service
- artifact_service
```python
client = InstillClient()

user = client.mgmt_service.get_user()
# name: "users/admin"
# uid: "4767b74d-640a-4cdf-9c6d-7bb0e36098a0"
# id: "admin"
# ...
# ...
```
Please find more usage examples for this SDK here.
You can also find some notebook examples here.
Create a model
Now create a text-generation model in Instill Model for later use:
```python
import instill.protogen.common.task.v1alpha.task_pb2 as task_interface

model_id = "model_text-generation"

# Create a text-generation model served from a container model definition
client.model_service.create_model(
    model_id,
    task_interface.Task.TASK_TEXT_GENERATION,  # AI task type
    "REGION_GCP_EUROPE_WEST4",                 # deployment region
    "CPU",                                     # hardware type
    "model-definitions/container",             # model definition
    {},                                        # model configuration
)
```
Build and deploy the model
Instill Model is an advanced MLOps/LLMOps platform specifically crafted to facilitate the efficient management and orchestration of model deployments for unstructured data ETL. With Instill Model, you can create, manage, and deploy your own custom models with ease in Instill Core or on the cloud with Instill Cloud.
Follow the instructions here to build and deploy your model.
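Once the model is built and deployed, you may want to confirm it is visible on the target instance before moving on. The sketch below is only an illustration: the `list_models` call and the `models` field on its response are assumptions about the model service API and may differ between SDK versions.

```python
# NOTE: minimal sketch only; `list_models` and the shape of its response
# are assumptions about the model service API and may vary across SDK versions.
models = client.model_service.list_models()
model_ids = [model.id for model in models.models]
assert model_id in model_ids, f"{model_id} was not found on the target instance"
```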
Create pipeline
In this section we will create a pipeline using this python-sdk to harness the power of Instill VDP!
The pipeline recipe below is a sample for demo purposes. It simply returns the input string value.
```python
pipeline_id = "pipeline_demo"

client.pipeline_service.create_pipeline(
    pipeline_id,
    "this is a pipeline for demo",
    {
        "output": {"result": {"title": "result", "value": "${variable.input}"}},
        "variable": {"input": {"instillFormat": "string", "title": "input"}},
    },
)
```
Validate the pipeline
Before we trigger the pipeline, it is recommended to validate the pipeline recipe first:
```python
# validate the pipeline recipe
client.pipeline_service.validate_pipeline(pipeline_id)
```
Trigger the pipeline
Finally, the pipeline is done; now let us test it by triggering it!
```python
# we can trigger the pipeline now
client.pipeline_service.trigger_pipeline(pipeline_id, [], [{"input": "hello world"}])
```
And the output should be exactly the same as your input.
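To inspect that output programmatically, you can keep the response returned by trigger_pipeline. The snippet below is a sketch that assumes the response exposes an `outputs` field of protobuf structs, mirroring the VDP trigger API; adjust it if your SDK version differs.

```python
from google.protobuf.json_format import MessageToDict

# Keep the trigger response instead of discarding it
response = client.pipeline_service.trigger_pipeline(
    pipeline_id, [], [{"input": "hello world"}]
)

# NOTE: the `outputs` field is an assumption mirroring the VDP trigger API.
for output in response.outputs:
    print(MessageToDict(output))  # expected: {'result': 'hello world'}
```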
Contributing
Please refer to the Contributing Guidelines for more details.
Community support
Please refer to the community repository.
License
See the LICENSE file for licensing information.