
Offline local-LLM terminal app for Jetson and edge Linux: chat with on-device models, run agent tools, and manage context safely.


open-jet

open-jet is an offline-first terminal app for running local LLM workflows on edge Linux devices (including Jetson-class hardware).

It provides:

  • local chat with your on-device model
  • safe file-context loading with token/memory guards
  • slash commands for session control
  • first-run setup for model/runtime configuration
  • optional session logging and resume

Requirements

Before running open-jet, make sure:

  • llama-server from llama.cpp is installed and available on PATH
  • you have a local .gguf model file, or Ollama installed so a model can be downloaded

Install

pip install open-jet

Start

open-jet

To reopen the setup screen at launch:

open-jet --setup

First-Run Setup

On first run, open-jet guides you through:

  1. hardware detection/profile
  2. model source selection
  3. model path or download choice
  4. context window size
  5. GPU offload configuration

It then saves your configuration and starts the runtime.
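The setup choices map onto llama.cpp's own server flags. As a rough sketch (the paths and numbers are placeholders; `-m`, `-c`, and `-ngl` are llama-server's model, context-size, and GPU-offload flags), the runtime open-jet starts corresponds to something like:

```shell
MODEL="$HOME/models/model.gguf"  # model path chosen in setup (placeholder)
CTX=4096                         # context window size chosen in setup
NGL=99                           # layers to offload to GPU (0 = CPU only)

# open-jet launches and manages this process for you; the command is only
# echoed here to illustrate what the setup choices control
echo llama-server -m "$MODEL" -c "$CTX" -ngl "$NGL"
```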

Basic Use

  • Type normally and press Enter to chat
  • Use @file or @[path with spaces] to add file content to context
  • Type / to open slash-command suggestions
  • Tab/Enter can autocomplete slash commands and file mentions
  • Ctrl+C or /exit quits
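An illustrative session (the file name, model output, and status formatting below are made up; the actual display may differ):

```
> @notes.txt summarize this file
[context] loaded notes.txt
model: Here is a short summary of notes.txt: ...
> /status
context: 3.4k / 8k tokens · RAM: 2.1 GB used
> /exit
```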

Slash Commands

  • /help - show available commands
  • /exit - quit the app
  • /clear - clear the chat and restart llama-server
  • /clear-chat - clear the chat only
  • /status - show context and RAM status
  • /condense - condense older context
  • /load <path> - load a file into context
  • /resume - load the previous saved session
  • /setup - reopen the setup wizard

Configuration

Main settings are stored in config.yaml, including:

  • context window size
  • memory guard limits
  • logging settings
  • session state/resume settings
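The generated file might look roughly like this (the key names below are illustrative, not the actual schema; check the config.yaml that setup writes for the real structure):

```yaml
# Illustrative sketch only -- consult your generated config.yaml
# for the real key names and defaults.
context_window: 4096      # tokens available to the model
memory_guard:
  max_ram_mb: 6144        # refuse context loads beyond this
logging:
  enabled: true
  dir: session_logs
session:
  save_state: true        # enables /resume
```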

Logging and Session State

When enabled:

  • session events are written to session_logs/*.events.jsonl
  • system metrics are written to session_logs/*.metrics.jsonl
  • conversation state is saved to session_state.json
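Because the logs are JSON Lines (one JSON object per line), they are easy to inspect with standard tools. A minimal sketch (the field names in the sample line are made up, not open-jet's actual event schema):

```shell
mkdir -p session_logs

# Write one sample event in JSON Lines form: one JSON object per line
printf '{"ts":"2025-01-01T00:00:00Z","event":"chat","tokens":42}\n' \
  >> session_logs/demo.events.jsonl

# The newest events are simply the last lines of the file
tail -n 1 session_logs/demo.events.jsonl
```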

Project details

Built distribution: open_jet-0.1.2-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.whl (5.3 MB), built for CPython 3.10 on manylinux (glibc 2.17+) ARM64. No source distribution is available for this release.

File hashes:

  • SHA256: e759a41ad7f089d79338f5e972ba74719b5a9b10017871948737941314a6d318
  • MD5: d4a2976d4785f29900c0ea751ac46128
  • BLAKE2b-256: 799f36753aac2c136e2566939b9048ff47ed8d2bfae32e15d7b623b0bc48dca5