Picsellia Pipelines CLI
The Picsellia Pipelines CLI lets you quickly create, test, dockerize, deploy, and manage custom processing or training pipelines.
How it works
A pipeline is simply:
- a Python function (pipeline.py)
- calling steps (steps.py)
- configured via run_config.toml
- run locally or on Picsellia infrastructure
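To make the decorator-based model concrete, here is a toy sketch of what @step and @pipeline conceptually do. This is an illustration only, not the actual picsellia_pipeline implementation: steps are plain functions, and a pipeline is a function that calls them.

```python
# Toy illustration of the @step / @pipeline idea -- NOT the real
# picsellia_pipeline implementation. Steps are ordinary functions,
# and a pipeline is just a function that composes them.
import functools

def step(func):
    """Wrap a function so each call is reported as a pipeline step."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print(f"[step] running {func.__name__}")
        return func(*args, **kwargs)
    return wrapper

def pipeline(func):
    """Mark a function as a pipeline entrypoint."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print(f"[pipeline] starting {func.__name__}")
        return func(*args, **kwargs)
    return wrapper

@step
def uppercase_images(images):
    return [img.upper() for img in images]

@pipeline
def my_pipeline():
    return uppercase_images(["a.jpg", "b.jpg"])

print(my_pipeline())  # ['A.JPG', 'B.JPG']
```

The real decorators additionally handle logging, state, and platform integration, but the composition model is the same: your pipeline function calls your step functions.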
Workflow at a glance
- Init → generate project template
- Customize → implement steps & parameters
- Test → run locally
- Smoke Test → validate Docker image
- Deploy → push & register the pipeline on Picsellia
1. Installation
With uv (recommended)
uv pip install picsellia-pipelines-cli
With Poetry:
poetry add picsellia-pipelines-cli
Check installation:
pxl-pipeline --help
2. Authentication
Objective: Use the same Picsellia user and environment across all commands
pxl-pipeline login
This stores your:
- organization
- environment (PROD / BETA / DEV / CUSTOM)
- API token
- optional custom base URL
Other helpful commands:
pxl-pipeline whoami # show active context
pxl-pipeline switch # change organization/environment
pxl-pipeline logout # clear active context
3. Init — Create a new pipeline
Objective: Generate a ready-to-use pipeline project folder with all required template files
pxl-pipeline init <pipeline_name> --type [training|processing] --template <template_name>
Examples:
pxl-pipeline init yolov8 --type training --template yolov8
pxl-pipeline init resize-images --type processing --template dataset_version_creation
This generates:
- a single entrypoint: pipeline.py
- config.toml (metadata + execution parameters)
- Dockerfile and .dockerignore
- a consistent folder structure:
my-pipeline/
├── pipeline.py
├── steps.py
├── utils/
│   └── parameters.py
├── config.toml
├── Dockerfile
├── .dockerignore
├── runs/
│ └── run_config.toml # template for test/smoke-test/launch
└── pyproject.toml
You're now ready to implement your custom logic.
4. Customize your pipeline — Add steps & parameters
Objective: Adapt the pipeline template to your specific use case
After running init, your pipeline project is generated with a default structure:
- pipeline.py — your entrypoint
- steps.py — implement processing or training steps (contains a default process step)
- utils/parameters.py — define your pipeline parameters
👉 In most cases, if you chose the right template, you only need to modify the existing process step to implement your use case.
You do not need to redesign the whole pipeline unless your logic is more advanced.
Recommended approach
- Start by editing the existing process step. This step already exposes the correct inputs, outputs, and parameters expected by Picsellia.
- Only add new steps if needed. If your pipeline requires additional logic (pre-processing, post-processing, custom validation, chaining operations, etc.), you can:
  - modify the existing step
  - replace it entirely
  - or add new steps and compose them in pipeline.py
Working with Steps
Steps are Python functions decorated with @step.
You can define them in steps.py and call them freely inside pipeline.py.
Example (steps.py):
from picsellia_pipeline.core import step

@step
def process(dataset_input, dataset_output):
    # your processing logic here
    dataset_output["images"] = [img.upper() for img in dataset_input["images"]]
    return dataset_output
Example pipeline (pipeline.py):
from steps import process
from picsellia_pipeline.core import pipeline

# load_coco_datasets and upload_full_dataset are helper steps
# provided by the generated template
@pipeline
def dataset_version_creation_pipeline():
    dataset_collection = load_coco_datasets()
    dataset_collection["output"] = process(
        dataset_collection["input"], dataset_collection["output"]
    )
    upload_full_dataset(dataset_collection["output"], use_id=False)
    return dataset_collection
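If you do add extra steps, composing them is just ordinary function calls inside the pipeline function. A minimal sketch, where the step names (validate_images, resize_images) are hypothetical and not part of the generated template:

```python
# Hypothetical extra steps chained the way you would inside a
# @pipeline function. Names are illustrative, not template steps.
def validate_images(images):
    # stand-in for a real validation step
    assert all(images), "empty image entry"
    return images

def resize_images(images, size):
    # stand-in for real image processing
    return [f"{img}@{size}" for img in images]

def compose(images):
    # chain the steps: validate first, then resize
    return resize_images(validate_images(images), "640x640")

print(compose(["a.jpg", "b.jpg"]))  # ['a.jpg@640x640', 'b.jpg@640x640']
```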
Adding Parameters
Define pipeline parameters in utils/parameters.py.
All parameters declared here are:
- automatically detected by the CLI
- injected at runtime
- uploaded to Picsellia during deploy or sync
Example:
from picsellia.types.schemas import LogDataType
from picsellia_cv_engine.core.parameters import Parameters

class ProcessingParameters(Parameters):
    def __init__(self, log_data: LogDataType):
        super().__init__(log_data=log_data)
        self.datalake = self.extract_parameter(["datalake"], expected_type=str, default="default")
        self.data_tag = self.extract_parameter(["data_tag"], expected_type=str, default="processed")
Each parameter requires:
- a key → used as the parameter name on the Picsellia platform
- an expected type → str, int, float, etc.
- a default value → mandatory for parameter registration
Once parameters are defined, you can reference them directly in your step logic and override their values in run_config.toml.
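To clarify the key/type/default mechanics, here is a simplified sketch of what an extract_parameter-style lookup does: search the run parameters for any of the given keys, validate the type, and fall back to the default. This is an illustration, not the actual picsellia_cv_engine implementation.

```python
# Simplified sketch of an extract_parameter-style lookup --
# illustrative only, not the real picsellia_cv_engine code.
def extract_parameter(log_data, keys, expected_type, default):
    for key in keys:
        if key in log_data:
            value = log_data[key]
            # reject values whose type doesn't match the declaration
            if not isinstance(value, expected_type):
                raise TypeError(
                    f"{key} must be {expected_type.__name__}, "
                    f"got {type(value).__name__}"
                )
            return value
    # no key found: fall back to the declared default
    return default

run_params = {"data_tag": "augmented"}
print(extract_parameter(run_params, ["datalake"], str, "default"))   # default
print(extract_parameter(run_params, ["data_tag"], str, "processed")) # augmented
```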
5. Test — Run your pipeline locally
Objective: Ensure your Python code works exactly as expected, using real Picsellia objects
pxl-pipeline test <pipeline_name> --run-config-file <path>
⚠️ Important
Even though the pipeline runs locally, all datasets, experiments, and outputs are created and updated on the Picsellia platform.
This command:
- runs the pipeline locally in the virtual env (.venv/)
- loads the configuration from your run_config.toml
- interacts with real Picsellia objects
- uploads results to the platform exactly like a real run
- guarantees your step logic and parameters behave correctly
A template run config is generated automatically at:
<pipeline_name>/runs/run_config.toml
You simply need to fill it with your dataset/model IDs, parameters, or metadata.
Example (dataset version creation processing):
override_outputs = true
[job]
type = "DATASET_VERSION_CREATION"
[input.dataset_version]
id = ""
[output.dataset_version]
name = "test_my_pipeline"
[parameters]
datalake = "default"
data_tag = "processed"
Once the file is filled:
pxl-pipeline test my-pipeline \
--run-config-file my-pipeline/runs/run_config.toml
6. Smoke Test — Validate the Docker runtime
Objective: Ensure your Dockerfile, dependencies, imports, paths, and runtime fully work before deployment
pxl-pipeline smoke-test <pipeline_name> --run-config-file <path>
This command:
- Builds the Docker image
- Runs the pipeline inside the container (not Python locally)
- Uses the same run_config.toml as the test command
- Updates real objects/results on Picsellia
It is your final validation step before deployment.
A successful smoke test strongly indicates that the pipeline will run properly on Picsellia’s infrastructure.
7. Deploy — Publish your pipeline to Picsellia
Objective: Build, version, push the Docker image, and register/update the pipeline in your organization
pxl-pipeline deploy <pipeline_name>
This command:
- builds the Docker image
- pushes it to your configured registry
- versions the image
- creates or updates the Picsellia processing or training asset
- ensures the pipeline is ready to be launched from the UI
After deployment, the pipeline becomes usable by your team in the Picsellia interface.
Launch — Run your pipeline on Picsellia’s infrastructure
(Optional)
Objective: Trigger a real Picsellia job (not local), using the same run_config.toml
pxl-pipeline launch <pipeline_name> --run-config-file <path>
Launch behaves like:
- launching a processing job on a dataset
- or launching a training experiment
- without manually creating an experiment or job in the UI
The run_config.toml defines:
- the dataset/model input
- the output dataset or experiment name
- the pipeline parameters
This is equivalent to triggering an actual job from the Picsellia UI.
Sync — Synchronize local parameters with Picsellia
(Optional)
Objective: Update parameters stored on Picsellia to match your local definitions
pxl-pipeline sync <pipeline_name>
For processing pipelines, this syncs:
- default parameter values
- parameter schema / types
Sync is usually unnecessary if you run:
pxl-pipeline deploy
because deploy already updates the parameter definition on the platform.
Training sync is not yet implemented.
Tips
- Use --output-dir during init to generate the pipeline elsewhere
- Virtual environments are created in <pipeline>/.venv
- You can edit config.toml at any time (metadata, entrypoints, dependencies)
- Always run test → smoke-test → deploy for a clean workflow
- A successful smoke test almost guarantees a successful production run
Made with ❤️ by the Picsellia team.