# fm-rs - Python bindings for Apple FoundationModels

Python bindings for the fm-rs Rust library, enabling on-device AI via Apple's FoundationModels.framework and Apple Intelligence.
## Requirements

- macOS 26.0+ (Tahoe) on Apple Silicon (ARM64)
- Apple Intelligence enabled in System Settings
- Python 3.10+
## Installation

```shell
pip install fm-rs
```

### From Source

```shell
# Requires the Rust toolchain
cd bindings/python
uv sync
uv run maturin develop
```
## Quick Start

```python
import fm

# Create the default system language model
model = fm.SystemLanguageModel()

# Check availability
if not model.is_available:
    print("Apple Intelligence is not available")
    exit(1)

# Create a session
session = fm.Session(model, instructions="You are a helpful assistant.")

# Send a prompt
response = session.respond("What is the capital of France?")
print(response.content)
```
## Streaming

```python
import fm

model = fm.SystemLanguageModel()
session = fm.Session(model)

# Stream the response, printing each chunk as it arrives
session.stream_response(
    "Tell me a short story",
    lambda chunk: print(chunk, end="", flush=True)
)
print()  # newline at end
```
## Structured Generation

```python
import fm

model = fm.SystemLanguageModel()
session = fm.Session(model)

# Using a dict schema
schema = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "age": {"type": "integer"}
    },
    "required": ["name", "age"]
}
person = session.respond_structured("Generate a fictional person", schema)
print(f"Name: {person['name']}, Age: {person['age']}")

# Using the Schema builder
schema = (fm.Schema.object()
    .property("name", fm.Schema.string(), required=True)
    .property("age", fm.Schema.integer().minimum(0), required=True))
person = session.respond_structured("Generate a fictional person", schema.to_dict())
```
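Because the structured result is a plain Python dict, you can sanity-check it against your schema with ordinary code before using it. A minimal sketch, where the hard-coded `person` stands in for a real `respond_structured` result:

```python
# Hypothetical structured result; in real code this would come from
# session.respond_structured(...). Hard-coded here for illustration.
person = {"name": "Ada Example", "age": 36}

schema = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "age": {"type": "integer"},
    },
    "required": ["name", "age"],
}

# Check that required fields exist and carry the declared JSON types.
type_map = {"string": str, "integer": int}
for field in schema["required"]:
    assert field in person, f"missing required field {field!r}"
    expected = type_map[schema["properties"][field]["type"]]
    assert isinstance(person[field], expected), f"{field} has wrong type"

print("result matches schema")
```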
## Tool Calling

Tools allow the model to call external functions during generation.

```python
import fm

class WeatherTool:
    name = "get_weather"
    description = "Gets the current weather for a location"
    arguments_schema = {
        "type": "object",
        "properties": {
            "city": {"type": "string", "description": "The city name"}
        },
        "required": ["city"]
    }

    def call(self, args):
        city = args.get("city", "Unknown")
        return f"Sunny, 72°F in {city}"

model = fm.SystemLanguageModel()
session = fm.Session(model, tools=[WeatherTool()])

response = session.respond("What's the weather in Paris?")
print(response.content)
```
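For intuition, the dispatch a session performs can be pictured as matching the tool name the model requests against the registered tools, then invoking `call` with the parsed arguments. This is only an illustrative sketch: `dispatch_tool_call` and the request dict shape are assumptions, not part of the fm API.

```python
# Illustrative sketch of tool dispatch -- not fm's actual implementation.
class WeatherTool:
    name = "get_weather"
    description = "Gets the current weather for a location"

    def call(self, args):
        city = args.get("city", "Unknown")
        return f"Sunny, 72°F in {city}"

def dispatch_tool_call(tools, request):
    """Find the registered tool matching the request and invoke it.

    `request` is assumed to look like {"name": ..., "arguments": {...}}.
    """
    for tool in tools:
        if tool.name == request["name"]:
            return tool.call(request["arguments"])
    raise KeyError(f"no tool named {request['name']!r}")

result = dispatch_tool_call(
    [WeatherTool()],
    {"name": "get_weather", "arguments": {"city": "Paris"}},
)
print(result)  # Sunny, 72°F in Paris
```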
## Context Management

```python
import fm

model = fm.SystemLanguageModel()
session = fm.Session(model)

# After some conversation...
limit = fm.ContextLimit.default_on_device()
usage = session.context_usage(limit)
print(f"Tokens used: {usage.estimated_tokens}/{usage.max_tokens}")
print(f"Utilization: {usage.utilization:.1%}")

if usage.over_limit:
    # Compact the conversation into a summary
    transcript = session.transcript_json
    summary = fm.compact_transcript(model, transcript)
    print(f"Summary: {summary}")
```
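The reported numbers relate by simple arithmetic: utilization is estimated tokens over the limit, and the estimate comes from a characters-per-token heuristic (the module-level `estimate_tokens` defaults to 4 chars per token). A pure-Python sketch of that arithmetic, with a made-up 4096-token limit; fm computes all of this for you, and `estimate_tokens_sketch` only mirrors the documented signature, not fm's exact implementation:

```python
# Sketch of the arithmetic behind ContextUsage; the limit and transcript
# below are hypothetical.
def estimate_tokens_sketch(text, chars_per_token=4):
    # Rough character-count heuristic, floor-divided with a minimum of 1.
    return max(1, len(text) // chars_per_token)

max_tokens = 4096                      # hypothetical on-device limit
transcript_text = "hello world " * 50  # 600 characters of conversation

estimated = estimate_tokens_sketch(transcript_text)  # 600 // 4 == 150
utilization = estimated / max_tokens
over_limit = estimated > max_tokens

print(f"Tokens used: {estimated}/{max_tokens}")  # Tokens used: 150/4096
print(f"Utilization: {utilization:.1%}")         # Utilization: 3.7%
```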
## Error Handling

```python
import fm

try:
    model = fm.SystemLanguageModel()
    model.ensure_available()
except fm.DeviceNotEligibleError:
    print("This device doesn't support Apple Intelligence")
except fm.AppleIntelligenceNotEnabledError:
    print("Please enable Apple Intelligence in Settings")
except fm.ModelNotReadyError:
    print("Model is still downloading, try again later")
except fm.ModelNotAvailableError:
    print("Model not available for unknown reason")
```
## API Reference

### Classes

- `SystemLanguageModel` - Entry point for on-device AI
- `Session` - Maintains conversation context
- `GenerationOptions` - Controls generation (temperature, max_tokens, etc.)
- `Response` - Model output
- `ToolOutput` - Tool invocation result
- `ContextLimit` - Context window configuration
- `ContextUsage` - Estimated token usage
- `Schema` - JSON Schema builder

### Enums

- `Sampling` - `Greedy` or `Random`
- `ModelAvailability` - `Available`, `DeviceNotEligible`, `AppleIntelligenceNotEnabled`, `ModelNotReady`, `Unknown`

### Functions

- `estimate_tokens(text, chars_per_token=4)` - Estimate token count
- `context_usage_from_transcript(json, limit)` - Get context usage
- `transcript_to_text(json)` - Extract text from transcript
- `compact_transcript(model, json)` - Summarize conversation
### Exceptions

- `FmError` - Base exception
- `ModelNotAvailableError`
- `DeviceNotEligibleError`
- `AppleIntelligenceNotEnabledError`
- `ModelNotReadyError`
- `GenerationError`
- `ToolCallError`
- `JsonError`
## Notes

- **Apple Silicon only**: Wheels are built for macOS ARM64 only (Apple Silicon Macs)
- **Tool callbacks**: May be invoked from non-main threads; avoid UI work in callbacks
- **Blocking calls**: All calls block until completion; use streaming for long responses
- **GIL**: Callbacks run under the GIL; keep them short
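Because callbacks may arrive on non-main threads while holding the GIL, a common way to keep them short is to only enqueue data inside the callback and do the real work on a consumer thread. This is a generic Python pattern, not an fm API; the loop at the bottom simulates what `stream_response` would do when it invokes the callback per chunk:

```python
import queue
import threading

chunks = queue.Queue()
collected = []

def on_chunk(chunk):
    # Safe to call from any thread; does no real work.
    chunks.put(chunk)

def consumer():
    while True:
        chunk = chunks.get()
        if chunk is None:                 # sentinel: stream finished
            break
        collected.append(chunk.upper())   # stand-in for "heavy" processing

t = threading.Thread(target=consumer)
t.start()

# Simulate a streamed response invoking the callback piece by piece.
for piece in ["Once ", "upon ", "a time"]:
    on_chunk(piece)
on_chunk(None)
t.join()

print("".join(collected))  # ONCE UPON A TIME
```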
## Development

```shell
cd bindings/python
uv sync
uv run maturin develop
uv run pytest tests/
```
## License

MIT
## File details

Details for the file `fm_rs-0.1.4.tar.gz`.

### File metadata

- Download URL: fm_rs-0.1.4.tar.gz
- Size: 82.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `2d3be93180f9f7cd9f6ca04a1f05fa3d33c6a7570005dc24b6c51450c63ff4d1` |
| MD5 | `2bbcee12b27643d89c024242352852eb` |
| BLAKE2b-256 | `6d401d85ca16e2315fa0a291cec4c7bd2ea7cdb9293820b678092726c8871bad` |
### Provenance

The following attestation bundles were made for `fm_rs-0.1.4.tar.gz`:

Publisher: `python-publish.yml` on blacktop/fm-rs

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: fm_rs-0.1.4.tar.gz
- Subject digest: `2d3be93180f9f7cd9f6ca04a1f05fa3d33c6a7570005dc24b6c51450c63ff4d1`
- Sigstore transparency entry: 844256330
- Permalink: blacktop/fm-rs@df51c7435524467ed9f01a24ae0ad7a4d5ee914a
- Branch / Tag: refs/tags/python-v0.1.4
- Owner: https://github.com/blacktop
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: python-publish.yml@df51c7435524467ed9f01a24ae0ad7a4d5ee914a
- Trigger Event: push
## File details

Details for the file `fm_rs-0.1.4-cp310-abi3-macosx_11_0_arm64.whl`.

### File metadata

- Download URL: fm_rs-0.1.4-cp310-abi3-macosx_11_0_arm64.whl
- Size: 485.3 kB
- Tags: CPython 3.10+, macOS 11.0+ ARM64
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `3f01b89ddb20fe20c202410b6325d3d0284ec3fba352dbc26f2f2e4b0ccbf407` |
| MD5 | `3564f0ff32467d142389227e958cc9ff` |
| BLAKE2b-256 | `1d37ac42a63821e7d85f8f6bec193b6c620f02bb85a18536423290b25e5c983a` |

### Provenance

The following attestation bundles were made for `fm_rs-0.1.4-cp310-abi3-macosx_11_0_arm64.whl`:

Publisher: `python-publish.yml` on blacktop/fm-rs

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: fm_rs-0.1.4-cp310-abi3-macosx_11_0_arm64.whl
- Subject digest: `3f01b89ddb20fe20c202410b6325d3d0284ec3fba352dbc26f2f2e4b0ccbf407`
- Sigstore transparency entry: 844256333
- Permalink: blacktop/fm-rs@df51c7435524467ed9f01a24ae0ad7a4d5ee914a
- Branch / Tag: refs/tags/python-v0.1.4
- Owner: https://github.com/blacktop
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: python-publish.yml@df51c7435524467ed9f01a24ae0ad7a4d5ee914a
- Trigger Event: push