Claude Code for local devices: chat with self-hosted models, run agent tools, and manage context safely, all without an internet connection.
Open-Jet
GitHub | X / Twitter | Discord
An AI coding agent that runs entirely on your machine.
This is Claude Code for local LLMs. OpenJet handles the model, runtime, and setup for you, so you can run a terminal coding agent on your own machine without fighting configuration. It reads files, edits code, runs commands, and keeps your work out of the cloud.
open-jet ships the full OpenJet package:
- the CLI and chat TUI
- the Python SDK
- the benchmarking entrypoints
Install it with:

```shell
pip install open-jet
```
This is not an SDK-only wheel: it installs the full OpenJet package, with `openjet.sdk` exposed as a supported import surface.
Product Surfaces
OpenJet has three primary surfaces in one package:
- CLI + chat TUI for interactive local agent work
- Python SDK for embedded sessions, hardware profiling, and auto-configuration
- Benchmarking for `llama-bench` runs and sweep comparisons
Typical entrypoints:

```shell
open-jet
open-jet benchmark --sweep
```

```python
from openjet.sdk import OpenJetSession, recommend_hardware_config
```
What You Get
- Read and edit local code from a terminal agent
- Run shell commands through the same session flow
- Resume sessions instead of losing context when the terminal closes
- Work on constrained hardware with hardware-aware model selection
- Use the Python SDK to embed the same runtime in your own app
- Run benchmarks against your current model/runtime profile
SDK Import Path
Use:

```python
from openjet.sdk import recommend_hardware_config
```

or:

```python
from openjet.sdk import OpenJetSession, create_agent
```
SDK Surface
The supported SDK surface includes:
```python
from openjet.sdk import (
    HardwareRecommendation,
    HardwareRecommendationInput,
    OpenJetSession,
    RecommendedLlamaConfig,
    RecommendedModel,
    SDKEvent,
    SDKEventKind,
    SDKResponse,
    ToolResult,
    create_agent,
    recommend_hardware_config,
)
```
That covers two main use cases:

- hardware/model recommendation for local `llama.cpp` setups
- embedded session/chat usage from your own Python application
Hardware Recommendation API
recommend_hardware_config() takes hardware input and returns:
- a recommended model
- recommended llama device settings
- recommended GPU layer count
- recommended context window size
- a token generation estimate for the recommended setup
Example:

```python
from openjet.sdk import recommend_hardware_config

result = recommend_hardware_config(
    {
        "total_ram_gb": 16,
        "gpu": "cuda",
        "vram_mb": 24576,
        "label": "RTX 4090 box",
    }
)

print(result.model.label)
print(result.model.target_path)
print(result.llama.device)
print(result.llama.gpu_layers)
print(result.llama.context_window_tokens)
```
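The fields printed above line up with `llama.cpp`'s runtime flags (`--n-gpu-layers`, `--ctx-size`), with `result.model.target_path` supplying the model path. A hypothetical glue sketch, using a stub dataclass in place of the real recommendation object (the stub and helper below are illustrations, not part of the OpenJet API):

```python
from dataclasses import dataclass

# Stub mirroring the fields used above; in real code these
# come from recommend_hardware_config().llama.
@dataclass
class LlamaStub:
    device: str
    gpu_layers: int
    context_window_tokens: int

def to_llama_server_args(model_path: str, llama: LlamaStub) -> list[str]:
    """Translate a recommendation into llama.cpp-style CLI arguments."""
    return [
        "-m", model_path,
        "--n-gpu-layers", str(llama.gpu_layers),
        "--ctx-size", str(llama.context_window_tokens),
    ]

# "models/example.gguf" is a placeholder path
print(to_llama_server_args("models/example.gguf", LlamaStub("cuda", 99, 16384)))
```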
Typed input also works:

```python
from openjet.sdk import HardwareRecommendationInput, recommend_hardware_config

result = recommend_hardware_config(
    HardwareRecommendationInput(
        total_ram_gb=8.0,
        gpu="cpu",
        hardware_profile="other",
        hardware_override="desktop_8",
    )
)
```
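Either way, a recommender like this has to trade available VRAM against layer offload and context size. A toy, self-contained heuristic illustrating the shape of that decision (the thresholds and values are made up; this is not OpenJet's actual algorithm):

```python
from dataclasses import dataclass

@dataclass
class LlamaConfig:
    device: str
    gpu_layers: int
    context_window_tokens: int

def sketch_recommend(total_ram_gb: float, gpu: str, vram_mb: int = 0) -> LlamaConfig:
    """Toy heuristic: offload more layers and grow the context as VRAM allows."""
    if gpu == "cpu" or vram_mb == 0:
        # CPU-only: no offloaded layers, keep the context small
        return LlamaConfig("cpu", 0, 4096)
    if vram_mb >= 20000:
        # Plenty of VRAM: offload everything and use a large context
        return LlamaConfig(gpu, 99, 16384)
    if vram_mb >= 8000:
        return LlamaConfig(gpu, 24, 8192)
    return LlamaConfig(gpu, 8, 4096)

cfg = sketch_recommend(total_ram_gb=16, gpu="cuda", vram_mb=24576)
print(cfg.device, cfg.gpu_layers, cfg.context_window_tokens)
```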
Session API
Use OpenJetSession when you want to embed OpenJet into another Python service, worker, or app.
Basic example:

```python
import asyncio

from openjet.sdk import OpenJetSession

async def main() -> None:
    session = await OpenJetSession.create()
    try:
        result = await session.run("Summarize the current README")
        print(result.text)
    finally:
        await session.close()

asyncio.run(main())
```
The session API includes:

- `OpenJetSession.create(...)`
- `session.stream(...)`
- `session.run(...)`
- `session.set_airgapped(...)`
- `session.add_turn_context(...)`
- `session.clear_turn_context(...)`
- `create_agent(...)`
The event/response types exposed for integrations are:

- `SDKEvent`
- `SDKEventKind`
- `SDKResponse`
- `ToolResult`
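A typical consumption pattern for `session.stream(...)` is an async iteration over events, accumulating text and handling tool results as they arrive. A runnable sketch with a stub standing in for `OpenJetSession` (the event shape and kind names here are assumptions, not the real `SDKEvent`/`SDKEventKind`):

```python
import asyncio
from dataclasses import dataclass

# Assumed event shape for illustration only
@dataclass
class StubEvent:
    kind: str   # hypothetical kinds: "text", "tool_result"
    data: str

class StubSession:
    """Stand-in for OpenJetSession: yields a fixed event stream."""
    async def stream(self, prompt: str):
        for ev in (
            StubEvent("text", "Hello, "),
            StubEvent("text", "world"),
            StubEvent("tool_result", "ls: README.md"),
        ):
            yield ev

async def main() -> None:
    session = StubSession()
    chunks: list[str] = []
    async for event in session.stream("Summarize the current README"):
        if event.kind == "text":
            chunks.append(event.data)   # accumulate streamed text
        elif event.kind == "tool_result":
            print("tool:", event.data)
    print("".join(chunks))

asyncio.run(main())
```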
CLI
This package also installs the CLI:

```shell
openjet
```

or:

```shell
open-jet
```
The CLI and the SDK share the same underlying package and runtime code.
Package Contents
This wheel currently includes:

- `openjet.sdk` for Python integrations
- CLI entrypoints: `openjet` and `open-jet`
- benchmark entrypoints via `open-jet benchmark`
- the local/session runtime used by both the SDK and the CLI
Even if you only need a single SDK feature, installing this distribution pulls in its full declared dependency set.
Repository
- Repository: github.com/l-forster/open-jet
- Issues: github.com/l-forster/open-jet/issues
License
open-jet core is licensed under Apache-2.0.
This package covers the permissive core SDK and CLI. Any future hosted, team, or enterprise offerings may be licensed separately.
External contributions are accepted under the contributor terms in
the repository's CONTRIBUTING.md and CLA.md.
File details
Details for the file open_jet-0.4.1-py3-none-any.whl.
File metadata
- Download URL: open_jet-0.4.1-py3-none-any.whl
- Upload date:
- Size: 218.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.13
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `fb833bccd489913fadc400708d2a7d395276d780d49882cce6667d1f74509d30` |
| MD5 | `0479469d99c76a6be2e212401e921615` |
| BLAKE2b-256 | `44bad19715e990f12cc97a57c956a088a746b0cc030b99b43fbc6700807e9232` |