
Synapse SDK v2

To be merged into synapse-sdk after development

Installation

pip install synapse-sdk

Migration Guide

Plugin Utils

Old (synapse-sdk v1):

from synapse_sdk.plugins.utils import get_action_class, get_plugin_actions, read_requirements

# Get run method by loading the action class
action_method = get_action_class(config['category'], action).method

New (synapse-sdk v2):

from synapse_sdk.plugins.utils import get_action_method, get_plugin_actions, read_requirements

# Get run method directly from config (no class loading needed)
action_method = get_action_method(config, action)

Plugin Types

Old:

from synapse_sdk.plugins.enums import PluginCategory
from synapse_sdk.plugins.base import RunMethod

New:

from synapse_sdk.plugins.enums import PluginCategory, RunMethod

Provider renames:

  • file_system -> local (alias file_system still works)
  • FileSystemStorage -> LocalStorage
  • GCPStorage -> GCSStorage
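A minimal sketch of how the alias mapping might resolve (the actual table lives inside synapse_sdk.utils.storage and may include more entries):

```python
# Hypothetical alias table based on the renames listed above.
PROVIDER_ALIASES = {
    'file_system': 'local',  # old name still accepted
    'gcp': 'gcs',
    'gs': 'gcs',
}

def resolve_provider(name):
    """Map a legacy provider alias to its canonical v2 name."""
    return PROVIDER_ALIASES.get(name, name)
```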

Pre-Annotation Actions

Old (synapse-sdk v1):

from synapse_sdk.plugins.categories.pre_annotation.actions.to_task import ToTaskAction

class AnnotationToTask:
    def convert_data_from_file(...):
        ...

    def convert_data_from_inference(...):
        ...

action = ToTaskAction(run=run_instance, params=params)
result = action.start()

New (synapse-sdk v2):

from synapse_sdk.plugins.actions.add_task_data import AddTaskDataAction

class AddTaskData(AddTaskDataAction):
    action_name = 'add_task_data'

    def convert_data_from_file(...):
        ...

    def convert_data_from_inference(...):
        ...

Update config.yaml to use add_task_data and point the entrypoint to your AddTaskData subclass.
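For example, a hypothetical config.yaml fragment (the module path plugin.tasks is an assumption; point the entrypoint at wherever your AddTaskData subclass actually lives):

```yaml
# plugin/config.yaml (fragment, illustrative)
actions:
  add_task_data:
    entrypoint: plugin.tasks.AddTaskData
    method: task
    description: Convert annotation data into task data
```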


API Reference

get_plugin_actions

Extract action names from plugin configuration.

from synapse_sdk.plugins.utils import get_plugin_actions

# From dict
actions = get_plugin_actions({'actions': {'train': {}, 'export': {}}})
# Returns: ['train', 'export']

# From PluginConfig
actions = get_plugin_actions(plugin_config)

# From path
actions = get_plugin_actions('/path/to/plugin')  # reads config.yaml

get_action_method

Get the execution method (job/task/serve_application) for an action.

from synapse_sdk.plugins.utils import get_action_method
from synapse_sdk.plugins.enums import RunMethod

method = get_action_method(config, 'train')
if method == RunMethod.JOB:
    # Create job record, run async
    pass
elif method == RunMethod.TASK:
    # Run as Ray task
    pass

get_action_config

Get full configuration for a specific action.

from synapse_sdk.plugins.utils import get_action_config

config = get_action_config(plugin_config, 'train')
# Returns: {'name': 'train', 'method': 'job', 'entrypoint': '...', ...}

read_requirements

Parse a requirements.txt file.

from synapse_sdk.plugins.utils import read_requirements

reqs = read_requirements('/path/to/requirements.txt')
# Returns: ['numpy>=1.20', 'torch>=2.0'] or None if file doesn't exist
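The behavior above can be sketched with a stand-in re-implementation (illustrative only; the real helper's filtering rules, e.g. handling of `-r` includes or editable installs, may differ):

```python
from pathlib import Path

def read_requirements_sketch(path):
    """Illustrative stand-in for read_requirements: returns a list of
    requirement specifiers, or None if the file does not exist.
    Blank lines and comment lines are skipped."""
    p = Path(path)
    if not p.exists():
        return None
    return [
        line.strip()
        for line in p.read_text().splitlines()
        if line.strip() and not line.strip().startswith('#')
    ]
```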

run_plugin

Execute plugin actions with automatic discovery.

from synapse_sdk.plugins.runner import run_plugin

# Auto-discover from Python module path
result = run_plugin('plugins.yolov8', 'train', {'epochs': 10})

# Auto-discover from config.yaml path
result = run_plugin('/path/to/plugin', 'train', {'epochs': 10})

# Execution modes
result = run_plugin('plugin', 'train', params, mode='local')  # Current process (default)
result = run_plugin('plugin', 'train', params, mode='task')   # Ray Actor (fast startup)
job_id = run_plugin('plugin', 'train', params, mode='job')    # Ray Job API (async)

# Explicit action class (skips discovery)
result = run_plugin('yolov8', 'train', {'epochs': 10}, action_cls=TrainAction)

Option 1: Define actions with @action decorator (recommended for Python modules):

# plugins/yolov8.py
from synapse_sdk.plugins.decorators import action
from pydantic import BaseModel

class TrainParams(BaseModel):
    epochs: int = 10
    batch_size: int = 32

@action(name='train', description='Train YOLOv8 model', params=TrainParams)
def train(params: TrainParams, ctx):
    # Training logic here
    return {'accuracy': 0.95}

@action(name='infer')
def infer(params, ctx):
    # Inference logic
    return {'predictions': [...]}

# Run it:
# run_plugin('plugins.yolov8', 'train', {'epochs': 20})

Option 2: Define actions with BaseAction class:

# plugins/yolov8.py
from synapse_sdk.plugins.action import BaseAction
from pydantic import BaseModel

class TrainParams(BaseModel):
    epochs: int = 10

class TrainAction(BaseAction[TrainParams]):
    action_name = 'train'
    params_model = TrainParams

    def execute(self):
        # self.params contains validated TrainParams
        # self.ctx contains RuntimeContext (logger, env, job_id)
        return {'accuracy': 0.95}

# Run it:
# run_plugin('plugins.yolov8', 'train', {'epochs': 20})

Option 3: Define actions with config.yaml (recommended for packaged plugins):

# plugin/config.yaml
name: YOLOv8 Plugin
code: yolov8
version: 1.0.0
category: neural_net
description: YOLOv8 object detection plugin

actions:
  train:
    entrypoint: plugin.train.TrainAction   # or plugin.train:TrainAction
    method: job
    description: Train YOLOv8 model

  infer:
    entrypoint: plugin.inference.InferAction
    method: task
    description: Run inference

  export:
    entrypoint: plugin.export.export_model
    method: task

# Run from config path:
# run_plugin('/path/to/plugin', 'train', {'epochs': 20})

Entrypoint formats:

  • Dot notation: plugin.train.TrainAction (module.submodule.ClassName)
  • Colon notation: plugin.train:TrainAction (module.submodule:ClassName)
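Both notations can be resolved with a small loader like this (a hypothetical sketch; the SDK's internal resolver may handle more cases, such as nested attributes):

```python
import importlib

def load_entrypoint(spec):
    """Resolve 'pkg.mod.Name' or 'pkg.mod:Name' to the named object.

    Illustrative helper only; not the SDK's actual resolver.
    """
    if ':' in spec:
        # Colon notation: everything before ':' is the module path.
        module_path, attr = spec.split(':', 1)
    else:
        # Dot notation: the last segment is assumed to be the attribute.
        module_path, attr = spec.rsplit('.', 1)
    module = importlib.import_module(module_path)
    return getattr(module, attr)
```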

PluginDiscovery

Comprehensive plugin introspection.

from synapse_sdk.plugins.discovery import PluginDiscovery

# Load from config.yaml
discovery = PluginDiscovery.from_path('/path/to/plugin')

# Or introspect a Python module
discovery = PluginDiscovery.from_module(my_module)

# Available methods
discovery.list_actions()           # ['train', 'export']
discovery.has_action('train')      # True
discovery.get_action_method('train')  # RunMethod.JOB
discovery.get_action_config('train')  # ActionConfig instance
discovery.get_action_class('train')   # Loads class from entrypoint

Storage

Storage utilities for working with different storage backends.

Installation for cloud providers:

pip install synapse-sdk[all]    # Includes S3, GCS, SFTP support + Ray

Available providers:

  • local / file_system - Local filesystem
  • s3 / amazon_s3 / minio - S3-compatible storage
  • gcs / gs / gcp - Google Cloud Storage
  • sftp - SFTP servers
  • http / https - HTTP file servers

Basic usage:

from pathlib import Path

from synapse_sdk.utils.storage import (
    get_storage,
    get_pathlib,
    get_path_file_count,
    get_path_total_size,
)

# Storage config (reused by the path helpers below)
config = {
    'provider': 'local',
    'configuration': {'location': '/data'},
}

# Get storage instance
storage = get_storage(config)

# Upload a file
url = storage.upload(Path('/tmp/file.txt'), 'uploads/file.txt')

# Check existence
exists = storage.exists('uploads/file.txt')

# Get pathlib object for path operations
path = get_pathlib(config, '/uploads')
for file in path.rglob('*.txt'):
    print(file)

# Get file count and total size
count = get_path_file_count(config, '/uploads')
size = get_path_total_size(config, '/uploads')

Provider configurations:

# Local filesystem
{'provider': 'local', 'configuration': {'location': '/data'}}

# S3/MinIO
{'provider': 's3', 'configuration': {
    'bucket_name': 'my-bucket',
    'access_key': 'AKIAIOSFODNN7EXAMPLE',
    'secret_key': 'wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY',
    'region_name': 'us-east-1',
    'endpoint_url': 'http://minio:9000',  # optional, for MinIO
}}

# Google Cloud Storage
{'provider': 'gcs', 'configuration': {
    'bucket_name': 'my-bucket',
    'credentials': '/path/to/service-account.json',
}}

# SFTP
{'provider': 'sftp', 'configuration': {
    'host': 'sftp.example.com',
    'username': 'user',
    'password': 'secret',  # or 'private_key': '/path/to/id_rsa'
    'root_path': '/data',
}}

# HTTP
{'provider': 'http', 'configuration': {
    'base_url': 'https://files.example.com/uploads/',
    'timeout': 60,
}}

Changes from v1

Breaking Changes

These changes require code updates when migrating from v1:

  • get_action_class(category, action) -> get_action_method(config, action): pass the config dict instead of a category string
  • action_class.method -> get_action_method(config, action): the run method is now read from config, not a class attribute
  • @register_action decorator -> removed: define actions in config.yaml or use PluginDiscovery.from_module()
  • _REGISTERED_ACTIONS global -> removed: use PluginDiscovery for action introspection
  • get_storage('s3://...') URL strings -> dict-only config: use get_storage({'provider': 's3', 'configuration': {...}})
  • FileSystemStorage -> LocalStorage: class renamed
  • GCPStorage -> GCSStorage: class renamed
  • Subclassing the BaseStorage ABC -> implementing StorageProtocol: use structural typing (duck typing) instead of inheritance
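With the protocol-based approach, a custom backend only needs matching method signatures, no inheritance. A minimal sketch, assuming a protocol shaped like the methods shown in the Storage section (the real StorageProtocol may declare more methods):

```python
from pathlib import Path
from typing import Protocol, runtime_checkable

# Hypothetical minimal protocol mirroring upload()/exists() from the
# Storage section; synapse_sdk's actual StorageProtocol may differ.
@runtime_checkable
class StorageProtocol(Protocol):
    def upload(self, source: Path, destination: str) -> str: ...
    def exists(self, path: str) -> bool: ...

class InMemoryStorage:
    """Custom backend: satisfies the protocol by shape alone."""

    def __init__(self):
        self._files = {}

    def upload(self, source: Path, destination: str) -> str:
        self._files[destination] = source
        return f'memory://{destination}'

    def exists(self, path: str) -> bool:
        return path in self._files
```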

Non-Breaking Changes

These changes are backwards compatible - existing code continues to work:

  • Provider alias file_system: still works, maps to LocalStorage
  • Provider aliases gcp, gs: still work, map to GCSStorage
  • get_plugin_actions(): same API
  • read_requirements(): same API
  • get_pathlib(): same API
  • get_path_file_count(): same API
  • get_path_total_size(): same API

New Features in v2

  • PluginDiscovery: discover actions from config files or Python modules
  • PluginDiscovery.from_module(): auto-discover @action decorators and BaseAction subclasses
  • StorageProtocol: protocol-based interface for custom storage implementations
  • HTTPStorage provider: new provider for HTTP file servers
  • Plugin upload utilities: archive_and_upload(), build_and_upload(), download_and_upload()
  • File utilities: calculate_checksum(), create_archive(), create_archive_from_git()
  • AsyncAgentClient: async client with WebSocket/HTTP streaming for job logs
  • tail_job_logs(): stream job logs with protocol auto-selection
  • BaseTrainAction: training base class with dataset/model helpers
  • BaseExportAction: export base class with filtered-results helper
  • BaseUploadAction: upload base class with step-based workflow and rollback

Action Base Classes

Category-specific base classes that provide helper methods and progress tracking for common workflows.

BaseTrainAction

For training workflows with dataset/model helpers.

from synapse_sdk.plugins import BaseTrainAction, TrainProgressCategories
from pydantic import BaseModel

class TrainParams(BaseModel):
    dataset_id: int
    epochs: int = 10

class MyTrainAction(BaseTrainAction[TrainParams]):
    action_name = 'train'
    params_model = TrainParams

    def execute(self) -> dict:
        # Helper methods use self.client (from RuntimeContext)
        dataset = self.get_dataset()  # Uses params.dataset_id
        self.set_progress(1, 3, self.progress.DATASET)

        model_path = self._train(dataset)
        self.set_progress(2, 3, self.progress.TRAIN)

        model = self.create_model(model_path, name='my-model')
        self.set_progress(3, 3, self.progress.MODEL_UPLOAD)

        return {'model_id': model['id']}

Progress categories: DATASET, TRAIN, MODEL_UPLOAD

Helper methods:

  • get_dataset() - Fetch dataset using params.dataset_id
  • create_model(path, **kwargs) - Upload trained model
  • get_model(model_id) - Retrieve existing model

BaseExportAction

For export workflows with filtered data retrieval.

from typing import Any

from synapse_sdk.plugins import BaseExportAction, ExportProgressCategories
from pydantic import BaseModel

class ExportParams(BaseModel):
    filter: dict
    output_path: str

class MyExportAction(BaseExportAction[ExportParams]):
    action_name = 'export'
    params_model = ExportParams

    def get_filtered_results(self, filters: dict) -> tuple[Any, int]:
        # Override for your target type
        return self.client.get_assignments(filters)

    def execute(self) -> dict:
        results, count = self.get_filtered_results(self.params.filter)
        self.set_progress(0, count, self.progress.DATASET_CONVERSION)

        for i, item in enumerate(results, 1):
            # Process and export item
            self.set_progress(i, count, self.progress.DATASET_CONVERSION)

        return {'exported': count}
