A Python client for the Axionic API.

Mechanex

Mechanex is a runtime control layer for small models.

The core promise: improve model behavior at inference time through policies, without retraining.

Product Direction

Mechanex is built around a policy-first workflow:

  1. Choose a model (local, self-hosted, or hosted).
  2. Choose a task profile.
  3. Choose an objective.
  4. Apply runtime controls (sampling, steering, constraints, verifiers, optimization).
  5. Compare and evaluate policies.
  6. Deploy and iterate.

Core Concepts

Policy

A policy is the reusable runtime object. It defines:

  • Sampling method and search settings.
  • Steering settings.
  • Output constraints.
  • Verifier stack.
  • Optimization and fallback behavior.

Execution Modes

Mechanex supports hybrid execution:

  • auto: remote when authenticated, local when not authenticated.
  • remote: force hosted inference (account + credits required).
  • local: force local model execution.
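
The auto-mode rule described above can be sketched as a small helper. This is an illustration of the resolution logic, not part of the SDK; resolve_mode and the authenticated flag are hypothetical names:

```python
def resolve_mode(mode: str, authenticated: bool) -> str:
    """Resolve an execution mode as described above.

    'auto' picks remote when an API key is present, local otherwise;
    'remote' and 'local' are forced regardless of authentication.
    """
    if mode == "auto":
        return "remote" if authenticated else "local"
    if mode in ("remote", "local"):
        return mode
    raise ValueError(f"unknown execution mode: {mode}")

print(resolve_mode("auto", authenticated=False))  # local
```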

Hosted Remote Model Catalog

When using hosted execution (mx.set_execution_mode("remote")), you can select a model with:

import mechanex as mx
mx.set_model("qwen3-0.6b")

Supported hosted models:

  • Gemma 2: gemma-2-27b, gemma-2-2b, gemma-2-9b, gemma-2-9b-it, gemma-2b, gemma-2b-it
  • Gemma 3: gemma-3-12b-it, gemma-3-12b-pt, gemma-3-1b-it, gemma-3-1b-pt, gemma-3-270m, gemma-3-270m-it, gemma-3-27b-it, gemma-3-27b-pt, gemma-3-4b-it, gemma-3-4b-pt
  • Llama: llama-3.1-8b, llama-3.1-8b-instruct, llama-3.3-70b-instruct, meta-llama-3-8b-instruct
  • Qwen: qwen2.5-7b-instruct, qwen3-0.6b, qwen3-1.7b, qwen3-14b, qwen3-4b, qwen3-8b
  • Other: deepseek-r1-distill-llama-8b, gpt-oss-20b, gpt2-small, mistral-7b, pythia-70m-deduped

Installation

pip install mechanex

Quickstart

Remote Runtime Policy

import mechanex as mx

mx.set_key("your-api-key", persist=True)  # persist the key for future sessions
mx.set_execution_mode("remote")           # force hosted inference

schema = {
    "type": "object",
    "required": ["answer", "confidence"],
    "properties": {
        "answer": {"type": "string"},
        "confidence": {"type": "number"}
    }
}

out = mx.generation.generate(
    prompt="Return JSON: answer + confidence for 'Paris is in France'.",
    sampling_method="guided-generation",
    json_schema=schema,
    max_tokens=120,
)
print(out)
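
Independent of the SDK, a shallow check that the returned JSON satisfies the schema's required keys and primitive types can be written with the standard library. check_required is an illustrative helper, not a Mechanex API, and it only covers the string/number cases used above:

```python
import json

def check_required(payload: str, schema: dict) -> bool:
    """Parse JSON and verify required keys and their primitive types."""
    try:
        obj = json.loads(payload)
    except json.JSONDecodeError:
        return False
    if not isinstance(obj, dict):
        return False
    types = {"string": str, "number": (int, float)}
    for key in schema.get("required", []):
        if key not in obj:
            return False
        expected = schema["properties"][key]["type"]
        if expected in types and not isinstance(obj[key], types[expected]):
            return False
    return True

schema = {
    "type": "object",
    "required": ["answer", "confidence"],
    "properties": {
        "answer": {"type": "string"},
        "confidence": {"type": "number"},
    },
}
print(check_required('{"answer": "yes", "confidence": 0.9}', schema))  # True
```

For anything beyond required keys and flat types, a full validator such as the jsonschema package is the better tool.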

Local Runtime Policy

import mechanex as mx

mx.load("gpt2-small")           # load a local model
mx.set_execution_mode("local")  # force local execution

policy = mx.policy.strict_json_extraction(
    schema={
        "type": "object",
        "required": ["summary"],
        "properties": {"summary": {"type": "string"}},
    },
    name="strict_json_small_v1",
)

res = mx.policy.run(
    prompt="Summarize speculative decoding in one sentence.",
    policy=policy,
    include_trace=True,
)
print(res["output"])

Runtime Controls

Sampling Methods

Supported methods:

  • greedy
  • top-k
  • top-p
  • min-p
  • typical
  • ads (Adaptive Determinantal Sampling)
  • constrained-beam-search
  • speculative-decoding
  • ssd
  • guided-generation
  • ensemble-sampling
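
As an illustration of one listed method, top-p (nucleus) sampling keeps the smallest set of tokens, taken in descending probability order, whose cumulative mass reaches p, renormalizes that set, and samples from it. A pure-Python sketch, not the Mechanex implementation:

```python
import random

def top_p_filter(probs: dict[str, float], p: float) -> dict[str, float]:
    """Keep the smallest prefix of tokens (by descending probability)
    whose cumulative mass reaches p, then renormalize."""
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    kept, total = [], 0.0
    for token, prob in ranked:
        kept.append((token, prob))
        total += prob
        if total >= p:
            break
    return {token: prob / total for token, prob in kept}

def sample_top_p(probs: dict[str, float], p: float = 0.9) -> str:
    """Sample one token from the renormalized nucleus."""
    filtered = top_p_filter(probs, p)
    return random.choices(list(filtered), weights=list(filtered.values()), k=1)[0]

dist = {"the": 0.5, "a": 0.3, "dog": 0.15, "xyzzy": 0.05}
print(top_p_filter(dist, 0.9))  # keeps 'the', 'a', 'dog'; drops 'xyzzy'
```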

Steering Modes

Supported vector-generation modes:

  • caa
  • few-shot
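
CAA (contrastive activation addition) derives a steering vector as the difference between the mean activations of contrasting positive and negative prompts. A minimal sketch of that vector computation; the toy activation lists stand in for real hidden states, and this is not the SDK's internal code:

```python
def mean(vectors):
    """Element-wise mean of equal-length vectors."""
    n = len(vectors)
    return [sum(col) / n for col in zip(*vectors)]

def caa_vector(pos_acts, neg_acts):
    """Contrastive activation addition: mean(positive) - mean(negative)."""
    return [p - q for p, q in zip(mean(pos_acts), mean(neg_acts))]

# Toy 3-dimensional "activations" for two positive and two negative prompts.
pos = [[1.0, 0.0, 2.0], [3.0, 0.0, 0.0]]
neg = [[0.0, 1.0, 1.0], [0.0, 3.0, 1.0]]
print(caa_vector(pos, neg))  # [2.0, -2.0, 0.0]
```

At inference time, the resulting vector is added (optionally scaled) to the model's residual-stream activations to steer generation.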

Constraints and Verifiers

Mechanex policies support:

  • JSON mode + JSON schema constraints.
  • Regex and grammar constraints.
  • Required fields and forbidden terms.
  • Verifier pipelines including syntax/schema checks.
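
The constraint types above compose naturally into a check pipeline. A sketch of how required-fields, forbidden-terms, and regex checks might be chained; the verify function is illustrative, not the Mechanex verifier API:

```python
import json
import re

def verify(output: str, required=(), forbidden=(), pattern=None):
    """Run a small verifier stack; return (ok, list of failure reasons)."""
    try:
        obj = json.loads(output)
    except json.JSONDecodeError:
        return False, ["invalid JSON"]
    failures = []
    for field in required:
        if field not in obj:
            failures.append(f"missing required field: {field}")
    lowered = output.lower()
    for term in forbidden:
        if term.lower() in lowered:
            failures.append(f"forbidden term present: {term}")
    if pattern and not all(re.fullmatch(pattern, str(v)) for v in obj.values()):
        failures.append(f"value failed pattern: {pattern}")
    return not failures, failures

ok, why = verify('{"order_id": "A1", "status": "shipped"}',
                 required=("order_id", "status"),
                 forbidden=("lorem",))
print(ok, why)  # True []
```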

Local vs Remote Capability Notes

  • ADS is remote-only.
  • Steering perceptrons are remote-only.
  • All other runtime policy methods are available locally, with capability-aware fallback behavior where needed.

Policy API

Save, Run, Compare, Evaluate

import mechanex as mx

policy = mx.policy.fast_tool_router()
pid = mx.policy.save(policy)

single = mx.policy.run(
    prompt="Route this request to the correct tool and return JSON.",
    policy_id=pid,
    include_trace=True,
)

cmp = mx.policy.compare(
    prompt="Extract order_id and status from text.",
    policies=[mx.policy.fast_tool_router(), mx.policy.strict_json_extraction({
        "type": "object",
        "required": ["order_id", "status"],
        "properties": {"order_id": {"type": "string"}, "status": {"type": "string"}},
    })],
)

ev = mx.policy.evaluate(
    prompts=[
        "Extract {name, role} from: Alice is CTO.",
        "Extract {name, role} from: Bob is PM.",
    ],
    policy_id=pid,
)

OpenAI-Compatible Serving

Mechanex can run a local OpenAI-compatible server:

import mechanex as mx
mx.load("gpt2-small")
mx.set_execution_mode("local")
mx.serve(port=8001)

Then point any OpenAI-compatible client at http://localhost:8001/v1. The extra fields policy / policy_id, steering fields, and behavior fields can be passed in request bodies.
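
As a sketch of what such a request body might look like, here is a standard chat-completions payload extended with a policy_id field. The fields beyond the OpenAI-standard ones are illustrative; take the exact accepted names from the examples, not from this sketch:

```python
import json

# Chat-completions style request body with an extra policy_id field.
payload = {
    "model": "gpt2-small",
    "messages": [{"role": "user", "content": "Summarize speculative decoding."}],
    "max_tokens": 64,
    "policy_id": "strict_json_small_v1",  # illustrative extra field
}
body = json.dumps(payload)
print(body)
```

Any HTTP or OpenAI-compatible client can POST this body to the server's /v1/chat/completions endpoint.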

SDK Configuration

  • The default backend URL is the hosted Axionic backend.
  • Override it with either:
    • the constructor: Mechanex(base_url="...")
    • the env var: MECHANEX_BASE_URL
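
The usual precedence for such overrides is: explicit constructor argument, then environment variable, then the hosted default. A sketch of that resolution order; the default URL below is a placeholder, not the real endpoint:

```python
import os

DEFAULT_BASE_URL = "https://api.example.invalid/v1"  # placeholder, not the real endpoint

def resolve_base_url(explicit=None):
    """Constructor argument > MECHANEX_BASE_URL env var > hosted default."""
    if explicit:
        return explicit
    return os.environ.get("MECHANEX_BASE_URL", DEFAULT_BASE_URL)

print(resolve_base_url("http://localhost:9000"))  # http://localhost:9000
```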

CLI

Account and key lifecycle:

  • mechanex signup
  • mechanex login
  • mechanex whoami
  • mechanex create-api-key
  • mechanex list-api-keys
  • mechanex balance
  • mechanex topup
  • mechanex logout

Examples

See examples/README.md for runnable workflows, including:

  • local-first runtime control
  • remote runtime control
  • sampling strategy sweep
  • strict JSON policies
  • policy compare/evaluate
  • local steering vectors
  • OpenAI-compatible serving

Engineering Docs
