# OpenJarvis — modular AI assistant backend with composable intelligence primitives
## Why OpenJarvis?
Personal AI agents are exploding in popularity, but nearly all of them still route intelligence through cloud APIs. Your "personal" AI continues to depend on someone else's server. At the same time, our Intelligence Per Watt research showed that local language models already handle 88.7% of single-turn chat and reasoning queries, with intelligence efficiency improving 5.3× from 2023 to 2025. The models and hardware are increasingly ready. What has been missing is the software stack to make local-first personal AI practical.
OpenJarvis is that stack. It is an opinionated framework for local-first personal AI, built around three core ideas: shared primitives for building on-device agents; evaluations that treat energy, FLOPs, latency, and dollar cost as first-class constraints alongside accuracy; and a learning loop that improves models using local trace data. The goal is simple: make it possible to build personal AI agents that run locally by default, calling the cloud only when truly necessary. OpenJarvis aims to be both a research platform and a production foundation for local AI, in the spirit of PyTorch.
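To make the second idea concrete, here is a hypothetical sketch of what treating energy, FLOPs, latency, and dollar cost as first-class metrics alongside accuracy could look like. The class and field names are illustrative, not OpenJarvis's actual API.

```python
# Illustrative only: a cost-aware evaluation record and a Pareto-dominance
# check, sketching what "first-class constraints alongside accuracy" means.
from dataclasses import dataclass


@dataclass(frozen=True)
class EvalResult:
    accuracy: float   # task score, higher is better
    energy_j: float   # joules consumed, lower is better
    flops: float      # floating-point operations, lower is better
    latency_s: float  # wall-clock seconds, lower is better
    cost_usd: float   # dollars spent, lower is better


def dominates(a: EvalResult, b: EvalResult) -> bool:
    """True if `a` is at least as good as `b` on every axis and strictly
    better on at least one."""
    costs = ("energy_j", "flops", "latency_s", "cost_usd")
    no_worse = a.accuracy >= b.accuracy and all(
        getattr(a, c) <= getattr(b, c) for c in costs
    )
    strictly_better = a.accuracy > b.accuracy or any(
        getattr(a, c) < getattr(b, c) for c in costs
    )
    return no_worse and strictly_better
```

Under this framing, a local model that matches cloud accuracy while spending less energy and money dominates the cloud call outright.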
## Installation

### Prerequisites
| Tool | Install |
|---|---|
| Python 3.10+ | python.org |
| uv (Python package manager) | `curl -LsSf https://astral.sh/uv/install.sh \| sh` — or `brew install uv` on macOS |
| Rust | `curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs \| sh` |
| Git | git-scm.com — or `brew install git` on macOS |
macOS users: see the full macOS Installation Guide for a step-by-step walkthrough including Homebrew setup.
### Setup
```bash
git clone https://github.com/open-jarvis/OpenJarvis.git
cd OpenJarvis
uv sync                 # core framework
uv sync --extra server  # + FastAPI server

# Build the Rust extension
uv run maturin develop -m rust/crates/openjarvis-python/Cargo.toml
```

Python 3.14+: set `PYO3_USE_ABI3_FORWARD_COMPATIBILITY=1` before running the `maturin` command.
You also need a local inference backend: Ollama, vLLM, SGLang, or llama.cpp. Alternatively, use the cloud engine with OpenAI, Anthropic, Google Gemini, OpenRouter, or MiniMax by setting the corresponding API key environment variable.
## Quick Start
```bash
# 1. Install and detect hardware
git clone https://github.com/open-jarvis/OpenJarvis.git
cd OpenJarvis
uv sync
uv run jarvis init

# 2. Start Ollama and pull a model
curl -fsSL https://ollama.com/install.sh | sh
ollama serve &
ollama pull qwen3:8b

# 3. Ask a question
uv run jarvis ask "What is the capital of France?"
```
`jarvis init` auto-detects your hardware and recommends the best engine. Run `uv run jarvis doctor` at any time to diagnose issues.
## Starter Configs
Install any preset with one command:

```bash
jarvis init --preset morning-digest-mac  # or any preset below
```
| Preset | Use Case | What it does |
|---|---|---|
| `morning-digest-mac` | Daily Briefing (Mac) | Spoken briefing from email, calendar, health, news with Jarvis voice |
| `morning-digest-linux` | Daily Briefing (Linux) | Same, with vLLM support for GPU servers |
| `morning-digest-minimal` | Daily Briefing (minimal) | Just Gmail + Calendar, runs on any machine |
| `deep-research` | Research Assistant | Multi-hop research across indexed docs with citations |
| `code-assistant` | Code Companion | Agent with code execution, file I/O, and shell access |
| `scheduled-monitor` | Persistent Monitor | Stateful agent that runs on a schedule with memory |
| `chat-simple` | Simple Chat | Lightweight conversation, no tools needed |
```bash
# Example: Morning Digest on Mac
jarvis init --preset morning-digest-mac
jarvis connect gdrive  # one OAuth flow covers Gmail, Calendar, Tasks
jarvis digest --fresh  # generate and play your first briefing

# Example: Deep Research
jarvis init --preset deep-research
jarvis memory index ./docs/  # index your documents
jarvis ask "Summarize all emails about Project X"
```
## Skills
Skills teach agents how to better use tools and improve their reasoning. Every skill is a tool — agents discover them from a catalog and invoke them on demand.
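The "every skill is a tool" model can be sketched as a catalog that agents list and invoke by name. The class and method names below are purely illustrative, not OpenJarvis's actual skill API.

```python
# Conceptual sketch of a skill catalog: skills register like tools,
# agents discover them by listing the catalog and invoke them on demand.
from typing import Callable


class SkillCatalog:
    def __init__(self) -> None:
        self._skills: dict[str, Callable[[str], str]] = {}

    def register(self, name: str, fn: Callable[[str], str]) -> None:
        """Add a skill to the catalog; agents find it via list()."""
        self._skills[name] = fn

    def list(self) -> list[str]:
        """Names an agent sees when it discovers available skills."""
        return sorted(self._skills)

    def invoke(self, name: str, payload: str) -> str:
        """Invoke a skill on demand, exactly like any other tool call."""
        return self._skills[name](payload)


catalog = SkillCatalog()
# A toy "code-explainer" skill standing in for a real installed skill.
catalog.register("code-explainer", lambda code: f"This code runs: {code!r}")
```

An agent holding this catalog would see `code-explainer` among its tools and call `catalog.invoke("code-explainer", ...)` when the task warrants it.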
```bash
# Install skills from public sources
jarvis skill install hermes:arxiv
jarvis skill sync hermes --category research

# Use skills with any agent
jarvis ask "Use the code-explainer skill to explain this Python code: for i in range(5): print(i*2)"

# Optimize skills from your trace history
jarvis optimize skills --policy dspy

# Benchmark the impact
jarvis bench skills --max-samples 5 --seeds 42
```
Import from Hermes Agent (~150 skills), OpenClaw (~13,700 community skills), or any GitHub repo. Skills follow the agentskills.io open standard.
See the Skills User Guide and Skills Tutorial for details.
## Built-in Agents
| Agent | Type | What it does |
|---|---|---|
| `morning_digest` | Scheduled | Daily briefing from email, calendar, health, news — with TTS audio |
| `deep_research` | On-demand | Multi-hop research with citations across web and local docs |
| `monitor_operative` | Continuous | Long-horizon monitoring with memory, compression, and retrieval |
| `orchestrator` | On-demand | Multi-turn reasoning with automatic tool selection |
| `native_react` | On-demand | ReAct (Thought-Action-Observation) loop agent |
| `operative` | Continuous | Persistent autonomous agent with state management |
| `native_openhands` | On-demand | CodeAct — generates and executes Python code |
| `simple` | On-demand | Single-turn chat, no tools |
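The Thought-Action-Observation pattern behind `native_react` can be sketched in a few lines. This is a minimal illustration of the general ReAct loop, not OpenJarvis's implementation; the policy and tool signatures are hypothetical.

```python
# Minimal ReAct loop sketch: a policy proposes a thought plus an action,
# tools produce observations, and the loop ends on a "finish" action.
from typing import Callable


def react_loop(
    policy: Callable[[list[str]], dict],
    tools: dict[str, Callable[[str], str]],
    max_steps: int = 5,
) -> str:
    """Run Thought-Action-Observation turns until the policy finishes."""
    trace: list[str] = []
    for _ in range(max_steps):
        step = policy(trace)                  # model proposes the next step
        trace.append(f"Thought: {step['thought']}")
        if step["action"] == "finish":        # terminal action: final answer
            return step["input"]
        observation = tools[step["action"]](step["input"])
        trace.append(f"Action: {step['action']}({step['input']})")
        trace.append(f"Observation: {observation}")
    return "max steps reached"
```

In a real agent the `policy` is a language-model call conditioned on the growing trace; here it could be any function with that shape, which is what makes the loop easy to test and swap backends under.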
See the User Guide and Tutorials for detailed setup instructions.
Full documentation — including Docker deployment, cloud engines, development setup, and tutorials — at open-jarvis.github.io/OpenJarvis.
## Contributing
We welcome contributions! See the Contributing Guide for incentives, contribution types, and the PR process.
Quick start for contributors:

```bash
git clone https://github.com/open-jarvis/OpenJarvis.git
cd OpenJarvis
uv sync --extra dev
uv run pre-commit install
uv run pytest tests/ -v
```
Browse the Roadmap for areas where help is needed. Comment "take" on any issue to get auto-assigned.
## About
OpenJarvis is part of Intelligence Per Watt, a research initiative studying the efficiency of on-device AI systems. The project is developed at Hazy Research and the Scaling Intelligence Lab at Stanford SAIL.
## Sponsors
Laude Institute • Stanford Marlowe • Google Cloud Platform • Lambda Labs • Ollama • IBM Research • Stanford HAI
## Citation

```bibtex
@misc{saadfalcon2026openjarvis,
  title={OpenJarvis: Personal AI, On Personal Devices},
  author={Jon Saad-Falcon and Avanika Narayan and Herumb Shandilya and Hakki Orhun Akengin and Robby Manihani and Gabriel Bo and John Hennessy and Christopher R\'{e} and Azalia Mirhoseini},
  year={2026},
  howpublished={\url{https://scalingintelligence.stanford.edu/blogs/openjarvis/}},
}
```
## License