
Project description

pydantic-ai-mlx

MLX local inference for Pydantic AI through LM Studio or mlx-lm directly.


Run MLX-compatible HuggingFace models on Apple silicon locally with Pydantic AI.

Two backends are provided:

  • LM Studio backend (an OpenAI-compatible server that can also use mlx-lm; the model runs in a separate background process)
  • mlx-lm backend (direct integration with Apple's library; the model runs within your application; experimental support)

STILL IN DEVELOPMENT, NOT RECOMMENDED FOR PRODUCTION USE YET.

Contributions are welcome!

Features

  • LM Studio backend (should be fully supported)
  • Streaming text support for mlx-lm backend
  • Tool calling support for mlx-lm backend

As of January 2025, Apple's MLX appears to outperform llama.cpp (which powers Ollama) on Apple silicon.

Installation

uv add pydantic-ai-mlx
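
Since the package is published on PyPI, plain pip works as well if you are not using uv:

pip install pydantic-ai-mlx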

Usage

LM Studio backend

from pydantic_ai import Agent
from pydantic_ai.messages import ModelMessage
from pydantic_ai_lm_studio import LMStudioModel

model = LMStudioModel(model_name="mlx-community/Qwen2.5-7B-Instruct-4bit")  # supports tool calling
agent = Agent(model, system_prompt="You are a chatbot.")

async def stream_response(user_prompt: str, message_history: list[ModelMessage]):
    async with agent.run_stream(user_prompt, message_history=message_history) as result:
        async for message in result.stream():
            yield message
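
A minimal sketch of a driver for the generator above (the asyncio runner is illustrative, not part of this package). Note that result.stream() yields progressively longer snapshots of the response text, so this sketch keeps only the latest one:

import asyncio

async def main():
    history: list[ModelMessage] = []
    reply = ""
    async for text in stream_response("Hello there!", history):
        reply = text  # each item is the accumulated response so far
    print(reply)

asyncio.run(main())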

mlx-lm backend

from pydantic_ai import Agent
from pydantic_ai.messages import ModelMessage
from pydantic_ai_mlx_lm import MLXModel

model = MLXModel(model_name="mlx-community/Llama-3.2-3B-Instruct-4bit")
# See https://github.com/ml-explore/mlx-examples/blob/main/llms/README.md#supported-models
# also https://huggingface.co/mlx-community

agent = Agent(model, system_prompt="You are a chatbot.")

async def stream_response(user_prompt: str, message_history: list[ModelMessage]):
    async with agent.run_stream(user_prompt, message_history=message_history) as result:
        async for message in result.stream():
            yield message
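
Both backends advertise tool calling, so tools can be registered with Pydantic AI's usual decorators. A minimal sketch assuming the standard @agent.tool_plain decorator; get_time is a hypothetical example tool, not part of this package:

from datetime import datetime

from pydantic_ai import Agent
from pydantic_ai_mlx_lm import MLXModel

model = MLXModel(model_name="mlx-community/Llama-3.2-3B-Instruct-4bit")
agent = Agent(model, system_prompt="You are a chatbot.")

@agent.tool_plain
def get_time() -> str:
    """Return the current local time as an ISO 8601 string."""
    return datetime.now().isoformat()  # hypothetical example tool

result = agent.run_sync("What time is it?")
print(result.data)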

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pydantic_ai_mlx-0.2.5.tar.gz (47.3 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

pydantic_ai_mlx-0.2.5-py3-none-any.whl (9.5 kB)

Uploaded Python 3

File details

Details for the file pydantic_ai_mlx-0.2.5.tar.gz.

File metadata

  • Download URL: pydantic_ai_mlx-0.2.5.tar.gz
  • Upload date:
  • Size: 47.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.6.5

File hashes

Hashes for pydantic_ai_mlx-0.2.5.tar.gz:

  • SHA256: a8c401da5b8ef3a9878799ddfe86de949480f3e26eb71cb076b75bf727251fb4
  • MD5: 36572ba32442bfee47c98e13fc325a54
  • BLAKE2b-256: d6e7ffbfbcee447bbf6ec33e180280703fe8246b2667a074060614c222473f6a


File details

Details for the file pydantic_ai_mlx-0.2.5-py3-none-any.whl.

File metadata

File hashes

Hashes for pydantic_ai_mlx-0.2.5-py3-none-any.whl:

  • SHA256: 06a810722c4f6bfa12ecf80575147be8e2596b1695bc9ebdcf4693a6099e5e2e
  • MD5: 4adcafc882852c1c8ce6d25feb1e9ebe
  • BLAKE2b-256: b14410e499d9a5ef599d8248a316591997ab55414dfd79971492cb984d00e688

