Reusable mid-level building blocks for MLX — the missing layer between mlx.nn and full model implementations

mlx-arsenal

Low-level operations and reusable building blocks missing from MLX core — the toolbox you want when porting PyTorch models to Apple Silicon.

Install

pip install mlx-arsenal

Or directly from source:

pip install git+https://github.com/dgrauet/mlx-arsenal.git

Modules

| Module | Components | Replaces (PyTorch) |
|---|---|---|
| mlx_arsenal.spatial | interpolate_nearest, interpolate_3d, avg_pool1d, replicate_pad, upsample_nearest/bilinear, pixel_shuffle/unshuffle, patchify/unpatchify, PatchEmbed2d/3d | F.interpolate, F.avg_pool1d, F.pad(mode="replicate"), F.pixel_shuffle |
| mlx_arsenal.layout | to_channels_last/first, channels_last ctx manager, convert_conv_weights, load_safetensors | NCHW ↔ NHWC conversion, weight transposition |
| mlx_arsenal.conv | weight_norm, WeightNorm | nn.utils.weight_norm |
| mlx_arsenal.attention | causal_mask, sliding_window_mask | Attention mask creation |
| mlx_arsenal.norm | PixelNorm, ScaleNorm | Custom normalization layers |
| mlx_arsenal.encoding | FourierEmbedder | Sinusoidal positional encoding |
| mlx_arsenal.diffusion | get_timestep_embedding, TimestepEmbedding, get_sampling_sigmas, dynamic_shift_schedule, FlowMatchEulerDiscreteScheduler, euler_step, classifier_free_guidance | Flow-matching diffusion primitives |
| mlx_arsenal.moe | MoEGate, MoELayer | Top-k mixture-of-experts dispatch |
| mlx_arsenal.rasterize | rasterize_triangles, interpolate | Differentiable triangle rasterization with Metal z-buffer |
| mlx_arsenal.tiling | tiled_process, temporal_slice_process | Memory-efficient large tensor processing |
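For context on the layout module: MLX's convolution layers take channels-last activations (NHWC) and weights with the input-channel axis last, while PyTorch uses NCHW activations and OIHW weights. A minimal NumPy sketch of the transpositions involved — illustrative only, not the library's implementation, though convert_conv_weights presumably performs an equivalent permutation:

```python
import numpy as np

# Activations: PyTorch uses NCHW, MLX uses NHWC (channels-last).
x_nchw = np.random.randn(2, 3, 8, 8)          # (B, C, H, W)
x_nhwc = np.transpose(x_nchw, (0, 2, 3, 1))   # (B, H, W, C)

# Conv weights: PyTorch stores (O, I, kH, kW); MLX's Conv2d expects
# (O, kH, kW, I), so the input-channel axis moves last.
w_torch = np.random.randn(16, 3, 3, 3)        # (O, I, kH, kW)
w_mlx = np.transpose(w_torch, (0, 2, 3, 1))   # (O, kH, kW, I)

print(x_nhwc.shape)  # (2, 8, 8, 3)
print(w_mlx.shape)   # (16, 3, 3, 3)
```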

Quick start

from mlx_arsenal.spatial import interpolate_nearest, avg_pool1d, replicate_pad
from mlx_arsenal.layout import to_channels_last, convert_conv_weights
from mlx_arsenal.attention import causal_mask

# Resize a video tensor (B, D, H, W, C)
x_resized = interpolate_nearest(x, size=(8, 32, 32))

# Temporal pooling
pooled = avg_pool1d(temporal_features, kernel_size=2)

# Pad with edge replication (like F.pad mode="replicate")
padded = replicate_pad(x, [(0,0), (2,0), (1,1), (1,1), (0,0)])

# Convert PyTorch conv weights to MLX channels-last layout
mlx_weights = convert_conv_weights(pytorch_weights)

# Causal attention mask for autoregressive decoding
mask = causal_mask(seq_len=128, offset=kv_cache_len)
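For intuition on the last step, here is a NumPy sketch of an additive causal mask with a KV-cache offset, assuming the same seq_len/offset semantics as the call above (illustrative only; the library's actual return dtype and values may differ):

```python
import numpy as np

def causal_mask_np(seq_len: int, offset: int = 0) -> np.ndarray:
    """Additive causal mask: query i may attend to keys 0..offset+i.

    Allowed positions get 0.0; disallowed positions get -inf, so the
    mask can be added to attention scores before softmax.
    """
    q = np.arange(seq_len)[:, None]            # query positions
    k = np.arange(seq_len + offset)[None, :]   # key positions, incl. cache
    allowed = k <= q + offset
    return np.where(allowed, 0.0, -np.inf)

mask = causal_mask_np(seq_len=4, offset=2)
print(mask.shape)  # (4, 6): 4 queries attend over 2 cached + 4 new keys
```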

Requirements

  • Python >= 3.10
  • MLX >= 0.27.0
  • Apple Silicon Mac

Development

pip install -e ".[dev]"
pytest tests/

License

Apache 2.0
