
Synapse SDK v2

To be merged into synapse-sdk after development

Installation

pip install synapse-sdk

Migration Guide

Plugin Utils

Old (synapse-sdk v1):

from synapse_sdk.plugins.utils import get_action_class, get_plugin_actions, read_requirements

# Get run method by loading the action class
action_method = get_action_class(config['category'], action).method

New (synapse-sdk v2):

from synapse_sdk.plugins.utils import get_action_method, get_plugin_actions, read_requirements

# Get run method directly from config (no class loading needed)
action_method = get_action_method(config, action)
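The key difference is that v2 reads the run method straight out of the plugin config instead of loading the action class. As a rough sketch (the exact config shape here is an assumption, mirroring the config.yaml examples later in this document):

```python
# Hypothetical config shape; the 'actions' and 'method' keys mirror the
# config.yaml examples later in this document.
config = {
    'actions': {
        'train': {'method': 'job'},
        'infer': {'method': 'task'},
    }
}

def lookup_method(config: dict, action: str) -> str:
    # Sketch of a config-driven lookup, not the SDK's implementation
    return config['actions'][action]['method']

print(lookup_method(config, 'train'))  # job
```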

Plugin Types

Old:

from synapse_sdk.plugins.enums import PluginCategory
from synapse_sdk.plugins.base import RunMethod

New:

from synapse_sdk.plugins.enums import PluginCategory, RunMethod

Provider renames:

  • file_system -> local (alias file_system still works)
  • FileSystemStorage -> LocalStorage
  • GCPStorage -> GCSStorage
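The alias behavior amounts to a name-normalization step before the provider class is resolved. A minimal sketch of the mapping implied by the renames above (not the SDK's source):

```python
# Sketch of the alias table implied by the renames above (not SDK source)
PROVIDER_ALIASES = {
    'file_system': 'local',
    'gcp': 'gcs',
    'gs': 'gcs',
}

def canonical_provider(name: str) -> str:
    # Unknown names pass through unchanged
    return PROVIDER_ALIASES.get(name, name)

print(canonical_provider('file_system'))  # local
```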

Pre-Annotation Actions

Old (synapse-sdk v1):

from synapse_sdk.plugins.categories.pre_annotation.actions.to_task import ToTaskAction

class AnnotationToTask:
    def convert_data_from_file(...):
        ...

    def convert_data_from_inference(...):
        ...

action = ToTaskAction(run=run_instance, params=params)
result = action.start()

New (synapse-sdk v2):

from synapse_sdk.plugins.actions.to_task import ToTaskAction

class ToTask(ToTaskAction):
    action_name = 'to_task'

    def convert_data_from_file(...):
        ...

    def convert_data_from_inference(...):
        ...

Update config.yaml to use to_task and point the entrypoint to your ToTask subclass.

Dataset Converters

Old (synapse-sdk v1):

from synapse_sdk.plugins.datasets import get_converter, FromDMToYOLOConverter

New (synapse-sdk v2):

from synapse_sdk.utils.converters import get_converter, FromDMToYOLOConverter

# Factory function for all format conversions
converter = get_converter('dm_v2', 'yolo', root_dir='/data/dm_dataset', is_categorized=True)
converter.convert()
converter.save_to_folder('/data/yolo_output')

# Supported format pairs:
# - DM (v1/v2) ↔ YOLO
# - DM (v1/v2) ↔ COCO
# - DM (v1/v2) ↔ Pascal VOC

Breaking change: the canonical import path for converters is now synapse_sdk.utils.converters. For backward compatibility, re-exports are available through synapse_sdk.plugins.datasets.

API changes:

  • Parameter is_categorized_dataset renamed to is_categorized
  • root_dir is now a Path object (but str still accepted)
  • Added DatasetFormat enum for type-safe format specification

Example: Convert DM v2 to YOLO with splits:

from synapse_sdk.utils.converters import get_converter

converter = get_converter(
    source='dm_v2',
    target='yolo',
    root_dir='/data/dm_dataset',
    is_categorized=True,  # has train/valid/test splits
)

# Perform conversion
result = converter.convert()

# Save to output directory
converter.save_to_folder('/data/yolo_output')

Example: Convert YOLO to DM v2:

converter = get_converter(
    source='yolo',
    target='dm_v2',
    root_dir='/data/yolo_dataset',
    is_categorized=False,
)
converter.convert()
converter.save_to_folder('/data/dm_output')

API Reference

get_plugin_actions

Extract action names from plugin configuration.

from synapse_sdk.plugins.utils import get_plugin_actions

# From dict
actions = get_plugin_actions({'actions': {'train': {}, 'export': {}}})
# Returns: ['train', 'export']

# From PluginConfig
actions = get_plugin_actions(plugin_config)

# From path
actions = get_plugin_actions('/path/to/plugin')  # reads config.yaml
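For the dict case, the behavior is essentially the keys of the actions mapping; a sketch of that branch (not the SDK's implementation):

```python
def plugin_actions_from_dict(config: dict) -> list[str]:
    # Sketch of the dict branch of get_plugin_actions (not SDK source);
    # dicts preserve insertion order, so action order follows the config
    return list(config.get('actions', {}))

print(plugin_actions_from_dict({'actions': {'train': {}, 'export': {}}}))
# ['train', 'export']
```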

get_action_method

Get the execution method (job/task/serve_application) for an action.

from synapse_sdk.plugins.utils import get_action_method
from synapse_sdk.plugins.enums import RunMethod

method = get_action_method(config, 'train')
if method == RunMethod.JOB:
    # Create job record, run async
    pass
elif method == RunMethod.TASK:
    # Run as Ray task
    pass

get_action_config

Get full configuration for a specific action.

from synapse_sdk.plugins.utils import get_action_config

config = get_action_config(plugin_config, 'train')
# Returns: {'name': 'train', 'method': 'job', 'entrypoint': '...', ...}

read_requirements

Parse a requirements.txt file.

from synapse_sdk.plugins.utils import read_requirements

reqs = read_requirements('/path/to/requirements.txt')
# Returns: ['numpy>=1.20', 'torch>=2.0'] or None if file doesn't exist

run_plugin

Execute plugin actions with automatic discovery.

from synapse_sdk.plugins.runner import run_plugin

# Auto-discover from Python module path
result = run_plugin('plugins.yolov8', 'train', {'epochs': 10})

# Auto-discover from config.yaml path
result = run_plugin('/path/to/plugin', 'train', {'epochs': 10})

# Execution modes
result = run_plugin('plugin', 'train', params, mode='local')  # Current process (default)
result = run_plugin('plugin', 'train', params, mode='task')   # Ray Actor (fast startup)
job_id = run_plugin('plugin', 'train', params, mode='job')    # Ray Job API (async)

# Explicit action class (skips discovery)
result = run_plugin('yolov8', 'train', {'epochs': 10}, action_cls=TrainAction)

Option 1: Define actions with @action decorator (recommended for Python modules):

# plugins/yolov8.py
from synapse_sdk.plugins.decorators import action
from pydantic import BaseModel

class TrainParams(BaseModel):
    epochs: int = 10
    batch_size: int = 32

@action(name='train', description='Train YOLOv8 model', params=TrainParams)
def train(params: TrainParams, ctx):
    # Training logic here
    return {'accuracy': 0.95}

@action(name='infer')
def infer(params, ctx):
    # Inference logic
    return {'predictions': [...]}

# Run it:
# run_plugin('plugins.yolov8', 'train', {'epochs': 20})

Option 2: Define actions with BaseAction class:

# plugins/yolov8.py
from synapse_sdk.plugins.action import BaseAction
from pydantic import BaseModel

class TrainParams(BaseModel):
    epochs: int = 10

class TrainAction(BaseAction[TrainParams]):
    action_name = 'train'
    params_model = TrainParams

    def execute(self):
        # self.params contains validated TrainParams
        # self.ctx contains RuntimeContext (logger, env, job_id)
        return {'accuracy': 0.95}

# Run it:
# run_plugin('plugins.yolov8', 'train', {'epochs': 20})

Option 3: Define actions with config.yaml (recommended for packaged plugins):

# plugin/config.yaml
name: YOLOv8 Plugin
code: yolov8
version: 1.0.0
category: neural_net
description: YOLOv8 object detection plugin

actions:
  train:
    entrypoint: plugin.train.TrainAction   # or plugin.train:TrainAction
    method: job
    description: Train YOLOv8 model

  infer:
    entrypoint: plugin.inference.InferAction
    method: task
    description: Run inference

  export:
    entrypoint: plugin.export.export_model
    method: task

# Run from config path:
# run_plugin('/path/to/plugin', 'train', {'epochs': 20})

Entrypoint formats:

  • Dot notation: plugin.train.TrainAction (module.submodule.ClassName)
  • Colon notation: plugin.train:TrainAction (module.submodule:ClassName)
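Both notations resolve to the same (module path, attribute name) pair; a minimal parsing sketch (an illustration of the idea, not the SDK's resolver):

```python
def split_entrypoint(entrypoint: str) -> tuple[str, str]:
    # Colon notation separates module and attribute explicitly;
    # dot notation treats the last component as the attribute
    if ':' in entrypoint:
        module, attr = entrypoint.split(':', 1)
    else:
        module, _, attr = entrypoint.rpartition('.')
    return module, attr

print(split_entrypoint('plugin.train:TrainAction'))  # ('plugin.train', 'TrainAction')
print(split_entrypoint('plugin.train.TrainAction'))  # ('plugin.train', 'TrainAction')
```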

PluginDiscovery

Comprehensive plugin introspection.

from synapse_sdk.plugins.discovery import PluginDiscovery

# Load from config.yaml
discovery = PluginDiscovery.from_path('/path/to/plugin')

# Or introspect a Python module
discovery = PluginDiscovery.from_module(my_module)

# Available methods
discovery.list_actions()           # ['train', 'export']
discovery.has_action('train')      # True
discovery.get_action_method('train')  # RunMethod.JOB
discovery.get_action_config('train')  # ActionConfig instance
discovery.get_action_class('train')   # Loads class from entrypoint

Storage

Storage utilities for working with different storage backends.

Installation for cloud providers:

pip install 'synapse-sdk[all]'    # Includes S3, GCS, SFTP support + Ray (quotes avoid shell globbing)

Available providers:

  • local / file_system - Local filesystem
  • s3 / amazon_s3 / minio - S3-compatible storage
  • gcs / gs / gcp - Google Cloud Storage
  • sftp - SFTP servers
  • http / https - HTTP file servers

Basic usage:

from pathlib import Path

from synapse_sdk.utils.storage import (
    get_storage,
    get_pathlib,
    get_path_file_count,
    get_path_total_size,
)

# Get storage instance
storage = get_storage({
    'provider': 'local',
    'configuration': {'location': '/data'}
})

# Upload a file
url = storage.upload(Path('/tmp/file.txt'), 'uploads/file.txt')

# Check existence
exists = storage.exists('uploads/file.txt')

# Get pathlib object for path operations
path = get_pathlib(config, '/uploads')
for file in path.rglob('*.txt'):
    print(file)

# Get file count and total size
count = get_path_file_count(config, '/uploads')
size = get_path_total_size(config, '/uploads')

Provider configurations:

# Local filesystem
{'provider': 'local', 'configuration': {'location': '/data'}}

# S3/MinIO
{'provider': 's3', 'configuration': {
    'bucket_name': 'my-bucket',
    'access_key': 'AKIAIOSFODNN7EXAMPLE',
    'secret_key': 'wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY',
    'region_name': 'us-east-1',
    'endpoint_url': 'http://minio:9000',  # optional, for MinIO
}}

# Google Cloud Storage
{'provider': 'gcs', 'configuration': {
    'bucket_name': 'my-bucket',
    'credentials': '/path/to/service-account.json',
}}

# SFTP
{'provider': 'sftp', 'configuration': {
    'host': 'sftp.example.com',
    'username': 'user',
    'password': 'secret',  # or 'private_key': '/path/to/id_rsa'
    'root_path': '/data',
}}

# HTTP
{'provider': 'http', 'configuration': {
    'base_url': 'https://files.example.com/uploads/',
    'timeout': 60,
}}

Changes from v1

Breaking Changes

These changes require code updates when migrating from v1:

  • get_action_class(category, action) -> get_action_method(config, action): pass the config dict instead of a category string
  • action_class.method -> get_action_method(config, action): the method is now read from config, not a class attribute
  • @register_action decorator: removed; define actions in config.yaml or use PluginDiscovery.from_module()
  • _REGISTERED_ACTIONS global: removed; use PluginDiscovery for action introspection
  • get_storage('s3://...') URL strings -> dict-only config: use get_storage({'provider': 's3', 'configuration': {...}})
  • FileSystemStorage -> LocalStorage: class renamed
  • GCPStorage -> GCSStorage: class renamed
  • Subclassing the BaseStorage ABC -> implementing StorageProtocol: use structural typing (duck typing) instead of inheritance
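The shift from an ABC to a protocol means custom backends only need to match the expected shape, with no inheritance. A sketch of the idea using typing.Protocol; the method names here are assumptions drawn from the usage examples in this document, not the SDK's actual StorageProtocol:

```python
from pathlib import Path
from typing import Protocol, runtime_checkable

@runtime_checkable
class StorageLike(Protocol):
    # Illustrative protocol, not the SDK's StorageProtocol
    def upload(self, source: Path, dest: str) -> str: ...
    def exists(self, path: str) -> bool: ...

class InMemoryStorage:
    # No inheritance needed; matching the method shape is enough
    def __init__(self):
        self._files: dict[str, bytes] = {}

    def upload(self, source: Path, dest: str) -> str:
        self._files[dest] = source.read_bytes()
        return dest

    def exists(self, path: str) -> bool:
        return path in self._files

print(isinstance(InMemoryStorage(), StorageLike))  # True
```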

Non-Breaking Changes

These changes are backwards compatible - existing code continues to work:

  • Provider alias file_system: still works, maps to LocalStorage
  • Provider aliases gcp, gs: still work, map to GCSStorage
  • get_plugin_actions(): same API
  • read_requirements(): same API
  • get_pathlib(): same API
  • get_path_file_count(): same API
  • get_path_total_size(): same API

New Features in v2

  • PluginDiscovery - discover actions from config files or Python modules
  • PluginDiscovery.from_module() - auto-discover @action decorators and BaseAction subclasses
  • StorageProtocol - protocol-based interface for custom storage implementations
  • HTTPStorage provider - new provider for HTTP file servers
  • Plugin upload utilities - archive_and_upload(), build_and_upload(), download_and_upload()
  • File utilities - calculate_checksum(), create_archive(), create_archive_from_git()
  • AsyncAgentClient - async client with WebSocket/HTTP streaming for job logs
  • tail_job_logs() - stream job logs with protocol auto-selection
  • BaseTrainAction - training base class with dataset/model helpers
  • BaseExportAction - export base class with filtered-results helper
  • BaseUploadAction - upload base class with step-based workflow and rollback
  • i18n module - internationalization support for log messages
  • CLI --lang option - language selection for the synapse plugin run command

Internationalization (i18n)

Log messages can be displayed in multiple languages. Currently supported: English (en) and Korean (ko).
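Conceptually, each message is a per-language template dict resolved against the selected language. A minimal sketch of that resolution, assuming a fallback to English for unsupported languages (the fallback behavior is an assumption, not confirmed by the docs):

```python
def resolve_message(messages: dict[str, str], lang: str, **kwargs) -> str:
    # Pick the template for the requested language, falling back to English,
    # then interpolate placeholders
    template = messages.get(lang) or messages['en']
    return template.format(**kwargs)

msg = {'en': 'Processing {count} files', 'ko': '{count}개의 파일을 처리 중'}
print(resolve_message(msg, 'ko', count=3))  # 3개의 파일을 처리 중
print(resolve_message(msg, 'fr', count=3))  # Processing 3 files
```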

CLI Usage

# Run with Korean log messages
synapse plugin run train --lang=ko --params '{"epochs": 10}'

# Short form
synapse plugin run train -l ko

# Works with all execution modes
synapse plugin run train --mode local --lang=ko
synapse plugin run train --mode task --lang=ko
synapse plugin run train --mode job --lang=ko

Programmatic Usage

Using Executors:

from synapse_sdk.plugins.executors.local import LocalExecutor
from synapse_sdk.plugins.executors.ray.task import RayActorExecutor
from synapse_sdk.plugins.executors.ray.jobs_api import RayJobsApiExecutor

# LocalExecutor with Korean
executor = LocalExecutor(env={'DEBUG': 'true'}, language='ko')
result = executor.execute(TrainAction, {'epochs': 10})

# RayActorExecutor with Korean
executor = RayActorExecutor(
    working_dir='/path/to/plugin',
    num_gpus=1,
    language='ko',
)

# RayJobsApiExecutor with Korean
executor = RayJobsApiExecutor(
    dashboard_address='http://localhost:8265',
    working_dir='/path/to/plugin',
    language='ko',
)

Custom i18n Messages in Plugins:

Plugin developers can provide multi-language messages using LocalizedMessage or dict format:

from synapse_sdk.i18n import LocalizedMessage

# Using LocalizedMessage
msg = LocalizedMessage({
    'en': 'Processing {count} files',
    'ko': '{count}개의 파일을 처리 중',
})

# Log with i18n support (inside an action; LogMessageCode and LogLevel
# come from the SDK's logging enums)
self.ctx.log_message(
    LogMessageCode.CUSTOM_MESSAGE,
    message={'en': 'Custom message', 'ko': '사용자 정의 메시지'},
    level=LogLevel.INFO,
)

Action Base Classes

Category-specific base classes that provide helper methods and progress tracking for common workflows.

BaseTrainAction

For training workflows with dataset/model helpers.

from synapse_sdk.plugins import BaseTrainAction
from pydantic import BaseModel

class TrainParams(BaseModel):
    dataset: int
    epochs: int = 10

class MyTrainAction(BaseTrainAction[TrainParams]):
    action_name = 'train'
    params_model = TrainParams

    def execute(self) -> dict:
        # Helper methods use self.client (from RuntimeContext)
        dataset = self.get_dataset()  # Uses params.dataset
        self.set_progress(1, 3, self.progress.DATASET)

        model_path = self._train(dataset)
        self.set_progress(2, 3, self.progress.TRAIN)

        model = self.create_model(model_path, name='my-model')
        self.set_progress(3, 3, self.progress.MODEL_UPLOAD)

        return {'model_id': model['id']}

Progress categories: DATASET, TRAIN, MODEL_UPLOAD

Helper methods:

  • get_dataset() - Fetch dataset using params.dataset
  • create_model(path, **kwargs) - Upload trained model
  • get_model(model_id) - Retrieve existing model

BaseExportAction

For export workflows with filtered data retrieval.

from typing import Any

from synapse_sdk.plugins import BaseExportAction
from pydantic import BaseModel

class ExportParams(BaseModel):
    filter: dict
    output_path: str

class MyExportAction(BaseExportAction[ExportParams]):
    action_name = 'export'
    params_model = ExportParams

    def get_filtered_results(self, filters: dict) -> tuple[Any, int]:
        # Override for your target type
        return self.client.get_assignments(filters)

    def execute(self) -> dict:
        results, count = self.get_filtered_results(self.params.filter)
        self.set_progress(0, count, self.progress.DATASET_CONVERSION)

        for i, item in enumerate(results, 1):
            # Process and export item
            self.set_progress(i, count, self.progress.DATASET_CONVERSION)

        return {'exported': count}
