A Python client for the Axionic API.
Mechanex
Mechanex allows you to control and debug your LLMs. Learn more at axioniclabs.ai
Installation
```bash
pip install mechanex
```
Quick Start
1. Initialize the Client
You must have an API key to use mechanex. The CLI helps you manage this easily.
New Users: Run the signup command to create an account, automatically log in, and generate your first API key:
```bash
mechanex signup
```
Existing Users: If you already have an account, log in and generate a key manually:
```bash
mechanex login
mechanex create-api-key
```
Using the Key: The Python client automatically loads the key generated by the CLI.
```python
import mechanex as mx

# The client automatically loads the key from ~/.mechanex/config.json.
# You can alternatively set your key manually; if persist=True, the key
# is saved to the config file.
mx.set_key("your-api-key-here", persist=False)
```
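For reference, a client like this typically prefers an explicitly supplied key and falls back to the CLI's config file. The sketch below illustrates that lookup order; the `"api_key"` field name inside `config.json` is an assumption for illustration, not the documented schema.

```python
import json
from pathlib import Path

def resolve_api_key(override=None,
                    config_path=Path.home() / ".mechanex" / "config.json"):
    """Return an explicit key if given, else fall back to the CLI's config file."""
    if override:
        return override
    try:
        # Hypothetical schema: {"api_key": "..."}
        return json.loads(config_path.read_text()).get("api_key")
    except FileNotFoundError:
        return None
```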
2. Generation and Sampling
You can control generation using various sampling methods:
```python
output = mx.generation.generate(
    prompt="The future of AI is",
    max_tokens=64,
    # Primary sampling method
    sampling_method="top-p",  # Options: "greedy", "top-k", "top-p", "ads" (requires paid API key)
    # Sampling parameters
    top_p=0.9,
    top_k=50
)
print(output)
```
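As background on the `top_p` parameter: nucleus (top-p) sampling keeps the smallest set of tokens whose cumulative probability reaches `p` and samples only from that set. A minimal, library-independent sketch of the idea (not Mechanex's implementation):

```python
import random

def top_p_filter(probs, top_p):
    """Return the nucleus: the smallest set of tokens whose cumulative
    probability is at least top_p, in descending-probability order."""
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    nucleus, cumulative = [], 0.0
    for token, p in ranked:
        nucleus.append(token)
        cumulative += p
        if cumulative >= top_p:
            break
    return nucleus

def sample_top_p(probs, top_p, rng=random):
    """Renormalize over the nucleus and draw one token from it."""
    nucleus = top_p_filter(probs, top_p)
    weights = [probs[t] for t in nucleus]
    return rng.choices(nucleus, weights=weights, k=1)[0]
```

Lower `top_p` values shrink the nucleus toward greedy decoding; higher values admit more of the tail.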
Local Model Management
Mechanex can also load models locally for inspection and low-latency hooks. To maintain compatibility with the automated steering pipeline, only the following models are supported:
| Family | Models |
|---|---|
| Gemma 2 | gemma-2-27b, gemma-2-2b, gemma-2-9b, gemma-2-9b-it, gemma-2b, gemma-2b-it |
| Gemma 3 | gemma-3-12b-it, gemma-3-12b-pt, gemma-3-1b-it, gemma-3-1b-pt, gemma-3-270m, gemma-3-270m-it, gemma-3-27b-it, gemma-3-27b-pt, gemma-3-4b-it, gemma-3-4b-pt |
| Llama | llama-3.1-8b, llama-3.1-8b-instruct, llama-3.3-70b-instruct, meta-llama-3-8b-instruct |
| Qwen | qwen2.5-7b-instruct, qwen3-0.6b, qwen3-1.7b, qwen3-14b, qwen3-4b, qwen3-8b |
| Other | deepseek-r1-distill-llama-8b, gpt-oss-20b, gpt2-small, mistral-7b, pythia-70m-deduped |
Loading a Local Model
```python
import mechanex as mx

mx.set_key("your-api-key-here")  # Required even for local mode
mx.load("gpt2-small")  # Uses transformer-lens to load the model
```
Unloading a Model
To free up GPU memory and switch back to remote execution flow:
```python
mx.unload()
```
CLI Commands
The mechanex CLI provides utilities for managing your account and keys.
- `mechanex signup`: Register a new account.
- `mechanex login`: Authenticate and save your credentials.
- `mechanex whoami`: View your current session and profile.
- `mechanex list-api-keys`: View all your active API keys.
- `mechanex create-api-key`: Generate a new persistent API key.
- `mechanex logout`: Clear your local session credentials.
Steering Vectors
Steering vectors allow you to control the behavior of a model by injecting specific activation patterns.
Compute a Steering Vector
```python
# Create a vector from contrastive pairs
vector_id = mx.steering.generate_vectors(
    prompts=["I think that", "People say"],
    positive_answers=[" kindness is key", " helping is good"],
    negative_answers=[" hate is power", " hurting is fine"],
    method="few-shot"  # Options: "caa", "few-shot"
)

# Apply it during generation
steered_output = mx.generation.generate(
    prompt="What is your philosophy?",
    steering_vector=vector_id,
    steering_strength=1.5
)
```
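Conceptually, the `caa` method (contrastive activation addition) builds a steering vector as the difference between mean activations on positive and negative examples, and applies it by adding a scaled copy to the hidden state. A toy, framework-free sketch of that idea, not Mechanex's internals:

```python
def mean_vector(rows):
    """Element-wise mean of a list of equal-length activation vectors."""
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def caa_vector(positive_acts, negative_acts):
    """Steering vector = mean(positive activations) - mean(negative activations)."""
    pos, neg = mean_vector(positive_acts), mean_vector(negative_acts)
    return [p - q for p, q in zip(pos, neg)]

def apply_steering(hidden, vector, strength=1.0):
    """Add the scaled steering vector to a hidden state."""
    return [h + strength * v for h, v in zip(hidden, vector)]
```

The `steering_strength` parameter corresponds to the scale factor here: larger values push activations further along the contrastive direction.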
SAE (Sparse Autoencoder) Pipeline
The SAE pipeline provides advanced behavioral detection and automatic correction.
1. Create a Behavior
Define a behavior to monitor and potentially correct.
```python
behavior = mx.sae.create_behavior_from_jsonl(
    behavior_name="toxicity",
    dataset_path="tests/toxicity_dataset.jsonl",
    description="Reduces toxic output"
)
```
2. Generate with SAE Steering
Utilize SAEs to detect behavioral drift and automatically apply corrections.
```python
# Generate with auto-correction enabled
response = mx.sae.generate(
    prompt="Tell me a secret",
    auto_correct=True,
    behavior_names=["toxicity"]
)
print(response)
```
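At a high level, SAE-based monitoring encodes a hidden state into sparse features and flags a behavior when its associated feature fires above a threshold. A schematic sketch of that detection step; the encoder weights, biases, and threshold here are placeholders, not Mechanex's:

```python
def relu(x):
    """Rectifier used by the sparse encoder."""
    return x if x > 0.0 else 0.0

def sae_encode(hidden, encoder_rows, biases):
    """Project a hidden state onto sparse features: f_i = relu(w_i . h + b_i)."""
    return [
        relu(sum(w * h for w, h in zip(row, hidden)) + b)
        for row, b in zip(encoder_rows, biases)
    ]

def detect_behaviors(hidden, encoder_rows, biases, names, threshold=1.0):
    """Return the names of behaviors whose feature activation exceeds the threshold."""
    features = sae_encode(hidden, encoder_rows, biases)
    return [name for name, f in zip(names, features) if f > threshold]
```

With `auto_correct=True`, a pipeline like this would follow detection with a corrective steering step; the sketch covers only the detection half.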
Deployment & Serving
Local OpenAI-Compatible Server
Mechanex can host an OpenAI-compatible server that leverages your locally loaded model or remote API.
```python
import mechanex as mx

mx.load("gpt2-small")

# Serve with optional global behavior correction
mx.serve(
    port=8000,
    corrected_behaviors=["anger"]  # Enforces "anger" correction on all requests
)
```
You can then interact with it using any standard tool, such as the OpenAI Python client. Mechanex supports custom parameters via `extra_body` for mechanistic features:
```python
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="any")

# 1. Standard chat completion
completion = client.chat.completions.create(
    model="mechanex",
    messages=[{"role": "user", "content": "Hello!"}]
)

# 2. Steered completion (using extra_body)
steered_completion = client.chat.completions.create(
    model="mechanex",
    messages=[{"role": "user", "content": "Hello!"}],
    extra_body={
        "steering_vector": "your-vector-id",
        "steering_strength": 2.0
    }
)

# 3. SAE-monitored completion
sae_completion = client.chat.completions.create(
    model="mechanex",
    messages=[{"role": "user", "content": "How are you?"}],
    extra_body={
        "behavior_names": ["toxicity"],
        "auto_correct": True
    }
)
```
vLLM Integration
For high-performance serving, you can integrate with vLLM by passing `use_vllm=True` to `mx.serve()`.
File details
Details for the file mechanex-0.5.0.tar.gz.
File metadata
- Download URL: mechanex-0.5.0.tar.gz
- Upload date:
- Size: 40.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.7
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 37f30f804f7f19472c87694ac369d03e0c1d3404008b66886889889ea8cf451f |
| MD5 | d54738a429dd8f936f78a1c14daff29e |
| BLAKE2b-256 | c21f2167c8d1e6fc5773e99fed439702cc9a30f7bedbfde00b2e16dfd512e414 |
File details
Details for the file mechanex-0.5.0-py3-none-any.whl.
File metadata
- Download URL: mechanex-0.5.0-py3-none-any.whl
- Upload date:
- Size: 42.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.7
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | b693143c2b912b6e5cb948567b21623b435ac2d754711e1600da325787c31064 |
| MD5 | f62c815945e7b27dc1a39df1ef33089d |
| BLAKE2b-256 | 548ddca3b110b3b86fc6939789fd957e366848dd06494910011b8e6c0b98d9e8 |