Agent inference package using ONNX models without Ray or PyTorch dependencies

Project description

Amesa Inference

A lightweight inference package for running Amesa agents using ONNX models without Ray or PyTorch dependencies.

Overview

composabl_inference provides a standalone inference engine for running trained Amesa agents. It uses ONNX Runtime for model inference, making it suitable for deployment scenarios where you want to avoid heavy dependencies like Ray and PyTorch.

Features

  • ONNX-based inference: Uses ONNX Runtime for efficient model inference
  • No Ray or PyTorch dependencies: Lightweight package suitable for production deployment
  • Network management: Supports both local and remote objects (skills, perceptors, controllers)
  • Compatible API: Similar interface to Trainer.package() for easy migration

Installation

pip install amesa-inference

Usage

Basic Inference

from composabl_inference import InferenceEngine
from composabl_core import Agent

# Create the inference engine (only a license key is required, for validation)
engine = InferenceEngine(license="your-license-key")

# Load agent
agent = Agent.load("path/to/agent")
await engine.load_agent(agent)

# Package agent for inference (similar to Trainer.package())
await engine.package()

# Run inference
observation = {...}  # Your observation from the simulator
action = engine.execute(observation)
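
Because load_agent and package are coroutines, a plain script needs an event loop around them. Below is a minimal sketch of that lifecycle using asyncio, with a hypothetical StubEngine standing in for InferenceEngine (the stub, its methods' internals, and the returned action are illustrative only, not the real composabl_inference API):

```python
import asyncio


class StubEngine:
    """Hypothetical stand-in for InferenceEngine, used only to
    illustrate the load -> package -> execute -> close lifecycle."""

    def __init__(self, license: str):
        self.license = license
        self.packaged = False

    async def load_agent(self, agent) -> None:
        self.agent = agent

    async def package(self) -> None:
        # The real engine would prepare the ONNX skill networks here.
        self.packaged = True

    def execute(self, observation: dict) -> dict:
        # The real engine would run ONNX inference on the observation.
        return {"action": 0.0}

    async def close(self) -> None:
        self.packaged = False


async def main() -> dict:
    engine = StubEngine(license="your-license-key")
    await engine.load_agent("path/to/agent")
    await engine.package()
    action = engine.execute({"sensor": 1.0})
    await engine.close()
    return action


action = asyncio.run(main())
```

The same pattern applies to the real engine: gather the async setup calls inside one coroutine and drive it with asyncio.run.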

With Remote Objects

The inference engine supports remote skills, perceptors, and controllers, just like the Trainer:

from composabl_inference import InferenceEngine

# Optional: provide custom config for NetworkMgr (e.g., for remote targets)
config = {
    "target": {
        "local": {
            "address": "localhost:1337",
        },
    },
}

engine = InferenceEngine(license="your-license-key", config=config)
await engine.load_agent("path/to/agent")
await engine.package()

# The skill processor will automatically handle remote objects
action = engine.execute(observation)

Cleanup

# Clean up resources
await engine.close()

Architecture

Components

  1. InferenceEngine: Main entry point for inference operations
  2. NetworkMgr: Manages network connections (non-Ray version)
  3. ONNXInferenceEngine: Handles ONNX model loading and inference
  4. ONNXSkillProcessor: Processes skills using ONNX models instead of PyTorch

Differences from Trainer

  • Uses ONNX Runtime instead of PyTorch for model inference
  • NetworkMgr is not a Ray actor (runs in the same process)
  • No Ray initialization required
  • Lighter weight, suitable for production deployment

Requirements

  • Python >= 3.10
  • composabl-core
  • composabl-api
  • onnxruntime
  • numpy

License

Proprietary and confidential - Copyright (C) Amesa, Inc

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distributions

If you're not sure about the file name format, learn more about wheel file names.
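
The wheel names below follow the PEP 427 convention {distribution}-{version}-{python tag}-{abi tag}-{platform tag}.whl. A quick sketch of pulling those fields apart with the standard library (this simple split ignores the optional build tag that some wheels carry):

```python
def parse_wheel_name(filename: str) -> dict:
    """Split a wheel filename into its PEP 427 components."""
    stem = filename.removesuffix(".whl")
    distribution, version, python_tag, abi_tag, platform_tag = stem.split("-", 4)
    return {
        "distribution": distribution,
        "version": version,
        "python_tag": python_tag,
        "abi_tag": abi_tag,
        "platform_tag": platform_tag,
    }


info = parse_wheel_name(
    "amesa_inference-0.20.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl"
)
```

Here python_tag "cp311" means CPython 3.11, and the compound platform tag advertises compatibility with both the manylinux_2_17 and manylinux2014 baselines.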

amesa_inference-0.20.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.5 MB)

Uploaded: CPython 3.11, manylinux: glibc 2.17+ x86-64

amesa_inference-0.20.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (2.5 MB)

Uploaded: CPython 3.11, manylinux: glibc 2.17+ ARM64

amesa_inference-0.20.0-cp311-cp311-macosx_11_0_arm64.whl (395.4 kB)

Uploaded: CPython 3.11, macOS 11.0+ ARM64

amesa_inference-0.20.0-cp311-cp311-macosx_10_9_x86_64.whl (400.8 kB)

Uploaded: CPython 3.11, macOS 10.9+ x86-64

amesa_inference-0.20.0-cp311-cp311-macosx_10_9_universal2.whl (793.3 kB)

Uploaded: CPython 3.11, macOS 10.9+ universal2 (ARM64, x86-64)

amesa_inference-0.20.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.4 MB)

Uploaded: CPython 3.10, manylinux: glibc 2.17+ x86-64

amesa_inference-0.20.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (2.4 MB)

Uploaded: CPython 3.10, manylinux: glibc 2.17+ ARM64

amesa_inference-0.20.0-cp310-cp310-macosx_11_0_arm64.whl (398.2 kB)

Uploaded: CPython 3.10, macOS 11.0+ ARM64

amesa_inference-0.20.0-cp310-cp310-macosx_10_9_x86_64.whl (403.9 kB)

Uploaded: CPython 3.10, macOS 10.9+ x86-64

amesa_inference-0.20.0-cp310-cp310-macosx_10_9_universal2.whl (799.0 kB)

Uploaded: CPython 3.10, macOS 10.9+ universal2 (ARM64, x86-64)

File details

Details for the file amesa_inference-0.20.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for amesa_inference-0.20.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 5b4e369ae208cbde323e5ee03da31d2b1d5c2eba8115f39bedcfe4882b5155c2
MD5 6bfd53721f5bba8b98d61b3ed7005431
BLAKE2b-256 bbaf8c2ddb7839ff7df9f683e9d8f8d797b121f92388f3038c9e84ac9ba262d1

See more details on using hashes here.
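
For example, a published SHA256 digest can be checked locally with the standard library before installing a downloaded wheel. A small sketch (the wheel path in the comment is illustrative):

```python
import hashlib


def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 hex digest of a file, reading in chunks
    so that large wheels are not loaded into memory at once."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


# Compare against the digest published above before installing, e.g.:
# sha256_of("amesa_inference-0.20.0-cp311-...x86_64.whl") == "5b4e369a..."
```

If the computed digest does not match the published one, the download is corrupted or tampered with and should not be installed.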

Provenance

The following attestation bundles were made for amesa_inference-0.20.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:

Publisher: build-and-publish-package.yaml on Composabl/sdk.composabl.ai

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file amesa_inference-0.20.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File metadata

File hashes

Hashes for amesa_inference-0.20.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
Algorithm Hash digest
SHA256 7017a22064eace6d4c9abd88e127bc85b785b2853874d1839219b78d72c76929
MD5 71755537c4ac507deca67c102093ec93
BLAKE2b-256 d34f2525fcb248617af006c61316fcb070e09fa8de61af352cb95790b5f8a818

Provenance

The following attestation bundles were made for amesa_inference-0.20.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl:

Publisher: build-and-publish-package.yaml on Composabl/sdk.composabl.ai

File details

Details for the file amesa_inference-0.20.0-cp311-cp311-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for amesa_inference-0.20.0-cp311-cp311-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 c329f76731d71d1146420d354f88b884c86aa5fdbecf91a890246937d486ecb1
MD5 5063dfc9f0422cba29269a4e148d8d35
BLAKE2b-256 c6eac43c5d8934bc320636dc5159c1f94e2f96d97b02e38537858776e6cdadd1

Provenance

The following attestation bundles were made for amesa_inference-0.20.0-cp311-cp311-macosx_11_0_arm64.whl:

Publisher: build-and-publish-package.yaml on Composabl/sdk.composabl.ai

File details

Details for the file amesa_inference-0.20.0-cp311-cp311-macosx_10_9_x86_64.whl.

File metadata

File hashes

Hashes for amesa_inference-0.20.0-cp311-cp311-macosx_10_9_x86_64.whl
Algorithm Hash digest
SHA256 ab97f154568133d8a117ce3f8b74982fa516696aaea1ea2d1d415bf7443d1a69
MD5 c149dcb4c0fe0dce73618a57af148694
BLAKE2b-256 13753716293cc18b041e7d45365574385c69c7e36c613a133ecbcc6a72791d79

Provenance

The following attestation bundles were made for amesa_inference-0.20.0-cp311-cp311-macosx_10_9_x86_64.whl:

Publisher: build-and-publish-package.yaml on Composabl/sdk.composabl.ai

File details

Details for the file amesa_inference-0.20.0-cp311-cp311-macosx_10_9_universal2.whl.

File metadata

File hashes

Hashes for amesa_inference-0.20.0-cp311-cp311-macosx_10_9_universal2.whl
Algorithm Hash digest
SHA256 03efc14d84923454d4642047baf5ed2573637bb534a28aa1dee3962c12553322
MD5 06ab9422af26447ff896a1514ad9addc
BLAKE2b-256 54b464b86ce5a3700aa03015a9966e77a4dcb9d63734eb09c84619cc7b5cffc7

Provenance

The following attestation bundles were made for amesa_inference-0.20.0-cp311-cp311-macosx_10_9_universal2.whl:

Publisher: build-and-publish-package.yaml on Composabl/sdk.composabl.ai

File details

Details for the file amesa_inference-0.20.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for amesa_inference-0.20.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 008c5d2e2cae492f83bba31515e53f21880ed9af773cf8538452c9d4180c4742
MD5 c62a1ee8bafad7d27e73160127d827e5
BLAKE2b-256 f806f17dce2b22ba2c5b5c4d20272219a35cf9f2b3f086f3aa0e4e483bee679f

Provenance

The following attestation bundles were made for amesa_inference-0.20.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:

Publisher: build-and-publish-package.yaml on Composabl/sdk.composabl.ai

File details

Details for the file amesa_inference-0.20.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File metadata

File hashes

Hashes for amesa_inference-0.20.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
Algorithm Hash digest
SHA256 fd50b5b42cdf3a2e5ccf11c28e76ac5c8fe1ddc8226a3ad08357fb907673715d
MD5 25547030231bcf1766c67b6d90d7401b
BLAKE2b-256 2132d833eb68644a8630c5a3cf4b49f308a9d1cda6e2064ee8a3488e26a44d38

Provenance

The following attestation bundles were made for amesa_inference-0.20.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl:

Publisher: build-and-publish-package.yaml on Composabl/sdk.composabl.ai

File details

Details for the file amesa_inference-0.20.0-cp310-cp310-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for amesa_inference-0.20.0-cp310-cp310-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 49134a837aff933487a8d94350f60ca1831bc79a558f1d85001ece57170b7c99
MD5 cca3a704036ee4bcf071a4826bf8b644
BLAKE2b-256 4df6dac62928c7dc61de2e6be255790abf47df66853dce72b6dd2f2a657456b4

Provenance

The following attestation bundles were made for amesa_inference-0.20.0-cp310-cp310-macosx_11_0_arm64.whl:

Publisher: build-and-publish-package.yaml on Composabl/sdk.composabl.ai

File details

Details for the file amesa_inference-0.20.0-cp310-cp310-macosx_10_9_x86_64.whl.

File metadata

File hashes

Hashes for amesa_inference-0.20.0-cp310-cp310-macosx_10_9_x86_64.whl
Algorithm Hash digest
SHA256 04b380138b5a4e1b6f9db1c64ebbb9ef3211f90b6e601547bd3ee00f311a9bd1
MD5 fea071e45a3d624e78e177dbe1cbaac5
BLAKE2b-256 c91bc161e41cf5f32698240cb2b9af7df7348bec3ebe245e91244e9df83a6d9e

Provenance

The following attestation bundles were made for amesa_inference-0.20.0-cp310-cp310-macosx_10_9_x86_64.whl:

Publisher: build-and-publish-package.yaml on Composabl/sdk.composabl.ai

File details

Details for the file amesa_inference-0.20.0-cp310-cp310-macosx_10_9_universal2.whl.

File metadata

File hashes

Hashes for amesa_inference-0.20.0-cp310-cp310-macosx_10_9_universal2.whl
Algorithm Hash digest
SHA256 3a1dbafb181ef9d60ab767ae90d24fbc7d69c57069d598e6a47aeb3f2314d73e
MD5 ec3fcb2e504ac9a866c4924b38a57377
BLAKE2b-256 6b581478205f9ff26068c854518e76883b65cc7df9d6270104bfa1326b607110

Provenance

The following attestation bundles were made for amesa_inference-0.20.0-cp310-cp310-macosx_10_9_universal2.whl:

Publisher: build-and-publish-package.yaml on Composabl/sdk.composabl.ai
