
OpenJarvis — modular AI assistant backend with composable intelligence primitives


OpenJarvis

Personal AI, On Personal Devices.



Documentation · Project Site · Leaderboard · Roadmap

Why OpenJarvis?

Personal AI agents are exploding in popularity, but nearly all of them still route intelligence through cloud APIs. Your "personal" AI continues to depend on someone else's server. At the same time, our Intelligence Per Watt research showed that local language models already handle 88.7% of single-turn chat and reasoning queries, with intelligence efficiency improving 5.3× from 2023 to 2025. The models and hardware are increasingly ready. What has been missing is the software stack to make local-first personal AI practical.

OpenJarvis is that stack. It is an opinionated framework for local-first personal AI, built around three core ideas: shared primitives for building on-device agents; evaluations that treat energy, FLOPs, latency, and dollar cost as first-class constraints alongside accuracy; and a learning loop that improves models using local trace data. The goal is simple: make it possible to build personal AI agents that run locally by default, calling the cloud only when truly necessary. OpenJarvis aims to be both a research platform and a production foundation for local AI, in the spirit of PyTorch.

Installation

Prerequisites

| Tool | Install |
| --- | --- |
| Python 3.10+ | python.org |
| uv (Python package manager) | `curl -LsSf https://astral.sh/uv/install.sh \| sh`, or `brew install uv` on macOS |
| Rust | `curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs \| sh` |
| Git | git-scm.com, or `brew install git` on macOS |

macOS users: see the full macOS Installation Guide for a step-by-step walkthrough including Homebrew setup.
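Before running setup, it can be worth confirming the prerequisite tools are actually on your PATH. This is an informal sketch, not part of OpenJarvis; the tool names checked (`python3`, `uv`, `cargo`, `git`) are the standard binaries the table above installs.

```python
# Quick sanity check that the prerequisite tools are installed and on PATH.
import shutil


def check_prerequisites(tools=("python3", "uv", "cargo", "git")):
    """Return a dict mapping each tool name to its resolved path, or None if missing."""
    return {tool: shutil.which(tool) for tool in tools}


if __name__ == "__main__":
    for tool, path in check_prerequisites().items():
        print(f"{tool:8s} {path if path else 'MISSING'}")
```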

Setup

git clone https://github.com/open-jarvis/OpenJarvis.git
cd OpenJarvis
uv sync                           # core framework
uv sync --extra server             # + FastAPI server

# Build the Rust extension
uv run maturin develop -m rust/crates/openjarvis-python/Cargo.toml

Python 3.14+: set PYO3_USE_ABI3_FORWARD_COMPATIBILITY=1 before the maturin command.

You also need a local inference backend: Ollama, vLLM, SGLang, or llama.cpp. Alternatively, use the cloud engine with OpenAI, Anthropic, Google Gemini, OpenRouter, or MiniMax by setting the corresponding API key environment variable.
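As a rough illustration of the env-var-driven cloud fallback, the sketch below picks a provider based on which API key variable is set. The variable names are each provider's conventional ones, and the first-match precedence is an assumption of this sketch, not confirmed OpenJarvis behavior.

```python
# Hypothetical sketch: choose a cloud engine based on which API key
# environment variable is set. Variable names and precedence are assumed.
import os

PROVIDER_KEYS = {
    "OPENAI_API_KEY": "openai",
    "ANTHROPIC_API_KEY": "anthropic",
    "GEMINI_API_KEY": "google-gemini",
    "OPENROUTER_API_KEY": "openrouter",
    "MINIMAX_API_KEY": "minimax",
}


def pick_cloud_provider(env=None):
    """Return the first provider whose API key is present, else None."""
    env = os.environ if env is None else env
    for var, provider in PROVIDER_KEYS.items():
        if env.get(var):
            return provider
    return None
```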

Quick Start

# 1. Install and detect hardware
git clone https://github.com/open-jarvis/OpenJarvis.git
cd OpenJarvis
uv sync
uv run jarvis init

# 2. Start Ollama and pull a model
curl -fsSL https://ollama.com/install.sh | sh
ollama serve &
ollama pull qwen3:8b

# 3. Ask a question
uv run jarvis ask "What is the capital of France?"

jarvis init auto-detects your hardware and recommends the best engine. Run uv run jarvis doctor at any time to diagnose issues.

Starter Configs

Install any preset with one command:

jarvis init --preset morning-digest-mac   # or any preset below
| Preset | Use Case | What it does |
| --- | --- | --- |
| morning-digest-mac | Daily Briefing (Mac) | Spoken briefing from email, calendar, health, and news with the Jarvis voice |
| morning-digest-linux | Daily Briefing (Linux) | Same, with vLLM support for GPU servers |
| morning-digest-minimal | Daily Briefing (minimal) | Just Gmail + Calendar; runs on any machine |
| deep-research | Research Assistant | Multi-hop research across indexed docs, with citations |
| code-assistant | Code Companion | Agent with code execution, file I/O, and shell access |
| scheduled-monitor | Persistent Monitor | Stateful agent that runs on a schedule with memory |
| chat-simple | Simple Chat | Lightweight conversation, no tools needed |
# Example: Morning Digest on Mac
jarvis init --preset morning-digest-mac
jarvis connect gdrive          # one OAuth flow covers Gmail, Calendar, Tasks
jarvis digest --fresh           # generate and play your first briefing

# Example: Deep Research
jarvis init --preset deep-research
jarvis memory index ./docs/    # index your documents
jarvis ask "Summarize all emails about Project X"

Skills

Skills teach agents how to use tools more effectively and improve their reasoning. Every skill is itself a tool: agents discover skills from a catalog and invoke them on demand.
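The skill-as-tool model above can be sketched in a few lines. The class and method names here (`Skill`, `SkillCatalog`, `install`, `invoke`) are illustrative only, not the actual OpenJarvis API:

```python
# Hypothetical sketch of "every skill is a tool": skills live in a catalog,
# and an agent discovers and invokes them by name at runtime.
from dataclasses import dataclass, field
from typing import Callable, Dict


@dataclass
class Skill:
    name: str
    description: str
    run: Callable[[str], str]  # the callable an agent invokes on demand


@dataclass
class SkillCatalog:
    skills: Dict[str, Skill] = field(default_factory=dict)

    def install(self, skill: Skill) -> None:
        self.skills[skill.name] = skill

    def invoke(self, name: str, query: str) -> str:
        return self.skills[name].run(query)


catalog = SkillCatalog()
catalog.install(Skill(
    name="code-explainer",
    description="Explain a snippet of Python code.",
    run=lambda code: f"This snippet runs: {code}",
))
```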

# Install skills from public sources
jarvis skill install hermes:arxiv
jarvis skill sync hermes --category research

# Use skills with any agent
jarvis ask "Use the code-explainer skill to explain this Python code: for i in range(5): print(i*2)"

# Optimize skills from your trace history
jarvis optimize skills --policy dspy

# Benchmark the impact
jarvis bench skills --max-samples 5 --seeds 42

Import from Hermes Agent (~150 skills), OpenClaw (~13,700 community skills), or any GitHub repo. Skills follow the agentskills.io open standard.

See the Skills User Guide and Skills Tutorial for details.

Built-in Agents

| Agent | Type | What it does |
| --- | --- | --- |
| morning_digest | Scheduled | Daily briefing from email, calendar, health, and news, with TTS audio |
| deep_research | On-demand | Multi-hop research with citations across web and local docs |
| monitor_operative | Continuous | Long-horizon monitoring with memory, compression, and retrieval |
| orchestrator | On-demand | Multi-turn reasoning with automatic tool selection |
| native_react | On-demand | ReAct (Thought-Action-Observation) loop agent |
| operative | Continuous | Persistent autonomous agent with state management |
| native_openhands | On-demand | CodeAct: generates and executes Python code |
| simple | On-demand | Single-turn chat, no tools |
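The Thought-Action-Observation loop behind an agent like native_react can be sketched with a stubbed model and one fake tool. Everything named here is illustrative; a real agent would generate thoughts and actions with a local LLM:

```python
# Minimal ReAct (Thought-Action-Observation) loop with a stubbed model.
def fake_tool(query: str) -> str:
    """Stand-in for a real tool (search, calendar, etc.)."""
    return "Paris" if "France" in query else "unknown"


def stub_model(history):
    """Stand-in for an LLM: emit (thought, action, argument) given the trace."""
    if not any(step.startswith("Observation:") for step in history):
        return ("Thought: I should look this up.", "lookup", "capital of France")
    return ("Thought: I have the answer.", "finish", "Paris")


def react_loop(question, model=stub_model, tools=None, max_steps=5):
    tools = {"lookup": fake_tool} if tools is None else tools
    history = [f"Question: {question}"]
    for _ in range(max_steps):
        thought, action, arg = model(history)
        history.append(thought)
        if action == "finish":
            return arg, history  # the agent decided it is done
        history.append(f"Observation: {tools[action](arg)}")
    return None, history  # step budget exhausted


answer, trace = react_loop("What is the capital of France?")
```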

See the User Guide and Tutorials for detailed setup instructions.

Full documentation — including Docker deployment, cloud engines, development setup, and tutorials — at open-jarvis.github.io/OpenJarvis.

Contributing

We welcome contributions! See the Contributing Guide for incentives, contribution types, and the PR process.

Quick start for contributors:

git clone https://github.com/open-jarvis/OpenJarvis.git
cd OpenJarvis
uv sync --extra dev
uv run pre-commit install
uv run pytest tests/ -v

Browse the Roadmap for areas where help is needed. Comment "take" on any issue to get auto-assigned.

About

OpenJarvis is part of Intelligence Per Watt, a research initiative studying the efficiency of on-device AI systems. The project is developed at Hazy Research and the Scaling Intelligence Lab at Stanford SAIL.

Sponsors

Laude Institute · Stanford Marlowe · Google Cloud Platform · Lambda Labs · Ollama · IBM Research · Stanford HAI

Citation

@misc{saadfalcon2026openjarvis,
  title={OpenJarvis: Personal AI, On Personal Devices},
  author={Jon Saad-Falcon and Avanika Narayan and Herumb Shandilya and Hakki Orhun Akengin and Robby Manihani and Gabriel Bo and John Hennessy and Christopher R\'{e} and Azalia Mirhoseini},
  year={2026},
  howpublished={\url{https://scalingintelligence.stanford.edu/blogs/openjarvis/}},
}

License

Apache 2.0
