MLX integration for the Agent Framework

Agent Framework MLX

This library provides an Apple MLX LM backend for the Agent Framework. It allows you to run language models locally on macOS with Apple Silicon using the mlx-lm library.

Features

  • Local Inference: Run models directly on your Mac using Apple Silicon.
  • Tool Calling: Native support for agentic tool invocation.
  • Observability: Built-in OpenTelemetry tracing, metrics, and token usage tracking.
  • Streaming Support: Full support for streaming responses.
  • Configurable Generation: Fine-tune generation parameters like temperature, top-p, repetition penalty, and more.
  • Message Preprocessing: Hook into the pipeline to modify messages before they are converted to prompts.
  • Agent Framework Integration: Seamlessly plugs into the Agent Framework's BaseChatClient interface.

Installation

Ensure you have Python 3.9+ and are running on macOS with Apple Silicon.

pip install agent-framework-mlx

Or install from source:

git clone https://github.com/filipw/agent-framework-mlx.git
cd agent-framework-mlx
pip install -e .

Usage

Basic Example

import asyncio
from agent_framework import ChatMessage, Role
from agent_framework_mlx import MLXChatClient, MLXGenerationConfig

# Initialize the client
client = MLXChatClient(
    model_path="mlx-community/Phi-4-mini-instruct-4bit",
    generation_config=MLXGenerationConfig(
        temp=0.7,
        max_tokens=500
    )
)

# Create messages
messages = [
    ChatMessage(role=Role.SYSTEM, text="You are a helpful assistant."),
    ChatMessage(role=Role.USER, text="Why is the sky blue?")
]

# Get the response (get_response is a coroutine, so run it inside an event loop)
async def main():
    response = await client.get_response(messages=messages)
    print(response.text)

asyncio.run(main())

Agent with Tools

You can easily create agents capable of calling tools (Python functions) using the client:

def calculate_bmi(weight: float, height: float) -> str:
    """Calculates BMI from weight in kilograms and height in meters."""
    return f"{weight / (height ** 2):.2f}"

# Create an agent with the tool
agent = client.as_agent(
    name="HealthAssistant",
    instructions="You are a helpful assistant.",
    tools=[calculate_bmi]
)

# As in the basic example, agent.run is a coroutine and must be awaited
# inside an async function
response = await agent.run("Calculate BMI for 70kg and 1.75m")
print(response)

Workflow Integration

You can use the client as the backbone for Agent Framework agents when building agentic workflows:

from agent_framework import ChatAgent, WorkflowBuilder

# notice the client constructed in the previous example now backs the local agent
local_agent = client.as_agent(
    name="Local_MLX",
    instructions="You are a helpful assistant."
)

# azure_client is any Agent Framework chat client backed by a hosted model
# (for example, an Azure OpenAI chat client), constructed elsewhere
remote_agent = ChatAgent(
    name="Cloud_LLM",
    instructions="You are a fallback expert. The previous assistant was unsure. Provide a complete answer.",
    chat_client=azure_client
)

builder = WorkflowBuilder()
builder.set_start_executor(local_agent)

# should_fallback_to_cloud decides when to hand off to the cloud agent
# (see the sketch after this example)
builder.add_edge(
    source=local_agent,
    target=remote_agent,
    condition=should_fallback_to_cloud
)

workflow = builder.build()
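The snippet above assumes should_fallback_to_cloud is defined elsewhere. As a rough sketch (not part of the library's API; the exact payload passed to edge conditions depends on your Agent Framework version), such a condition might inspect the local agent's output for signs of uncertainty:

def should_fallback_to_cloud(message) -> bool:
    # Hypothetical condition: hand off to the cloud agent when the local
    # model signals uncertainty. The attribute access assumes the payload
    # exposes the generated text; adapt it to the message type your
    # Agent Framework version actually passes to edge conditions.
    text = getattr(message, "text", None) or str(message)
    return "not sure" in text.lower() or "cannot answer" in text.lower()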

Streaming

# Inside an async function, print each update as it is generated
async for update in client.get_streaming_response(messages=messages):
    print(update.text, end="", flush=True)
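If you also need the full text once streaming completes, you can accumulate the chunks as they arrive (a small variation on the same call):

# Collect the streamed chunks and join them into the complete response text
chunks = []
async for update in client.get_streaming_response(messages=messages):
    if update.text:
        print(update.text, end="", flush=True)
        chunks.append(update.text)

full_text = "".join(chunks)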

Configuration

You can configure the client using environment variables or a .env file. Using environment variables like MLX_MODEL_PATH allows you to omit arguments in code.

export MLX_MODEL_PATH="mlx-community/Phi-4-mini-instruct-4bit"
# No arguments needed if env vars are set
client = MLXChatClient()
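The same variable can also be set from Python before constructing the client (a minimal sketch; MLX_MODEL_PATH is the variable name shown in the export above):

import os
from agent_framework_mlx import MLXChatClient

# Equivalent to the shell export above
os.environ["MLX_MODEL_PATH"] = "mlx-community/Phi-4-mini-instruct-4bit"

client = MLXChatClient()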

Advanced Configuration

You can configure generation parameters globally via MLXGenerationConfig or per-request via MLXChatOptions.

config = MLXGenerationConfig(
    temp=0.7,
    top_p=0.9,
    repetition_penalty=1.1,
    seed=42
)
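As in the basic example, passing the config object to the client applies these settings to all requests (per-request overrides go through MLXChatOptions, whose fields are not shown here):

# Apply the generation settings globally when constructing the client
client = MLXChatClient(
    model_path="mlx-community/Phi-4-mini-instruct-4bit",
    generation_config=config
)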

Message Preprocessing

You can intercept and modify messages before they are sent to the model. This is useful for injecting instructions or formatting content.

def inject_instruction(messages):
    # At this point messages are in the chat-template (dict) format,
    # just before they are converted to the model prompt
    if messages:
        messages[-1]["content"] += "\nIMPORTANT: Be concise."
    return messages

client = MLXChatClient(
    model_path="...",
    message_preprocessor=inject_instruction
)

Requirements

  • macOS
  • Apple Silicon
  • Python 3.9+

License

This project is licensed under the MIT License. See the LICENSE file for details.

