harmony

OpenAI Harmony

OpenAI's response format for its open-weight model series gpt-oss

Try gpt-oss | Learn more | Model card


The gpt-oss models were trained on the harmony response format, which defines conversation structure, reasoning output, and function calls. If you are not running gpt-oss directly but through an API or a provider like HuggingFace, Ollama, or vLLM, you do not need to worry about this, as your inference solution handles the formatting for you. If you are building your own inference solution, this guide will walk you through the prompt format. The format is designed to mimic the OpenAI Responses API, so if you have used that API before, it should feel familiar. Note that gpt-oss will not work correctly unless it is used with the harmony format.

The format enables the model to write to multiple channels, separating chain of thought and tool-calling preambles from regular responses. It also supports tool namespaces, structured outputs, and a clear instruction hierarchy. Check out the guide to learn more about the format itself.

<|start|>system<|message|>You are ChatGPT, a large language model trained by OpenAI.
Knowledge cutoff: 2024-06
Current date: 2025-06-28

Reasoning: high

# Valid channels: analysis, commentary, final. Channel must be included for every message.
Calls to these tools must go to the commentary channel: 'functions'.<|end|>

<|start|>developer<|message|># Instructions

Always respond in riddles

# Tools

## functions

namespace functions {

// Gets the location of the user.
type get_location = () => any;

// Gets the current weather in the provided location.
type get_current_weather = (_: {
// The city and state, e.g. San Francisco, CA
location: string,
format?: "celsius" | "fahrenheit", // default: celsius
}) => any;

} // namespace functions<|end|><|start|>user<|message|>What is the weather like in SF?<|end|><|start|>assistant
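Given a prompt like the one above, a completion might look like the following sketch. This is illustrative only, assembled from the format guide rather than taken from real model output: the model reasons on the analysis channel, then issues a function call on the commentary channel.

```
<|channel|>analysis<|message|>User asks about the weather in SF. I should find their exact location first.<|end|><|start|>assistant<|channel|>commentary to=functions.get_location <|constrain|>json<|message|>{}<|call|>
```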

We recommend using this library when working with models that use the harmony response format:

  • Consistent formatting – a shared implementation for rendering and parsing keeps token sequences loss-free.
  • Blazing fast – the heavy lifting happens in Rust.
  • First-class Python support – install with pip, typed stubs included, 100 % test parity with the Rust suite.
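As a toy illustration of the channel structure that rendering and parsing deal with, the snippet below extracts per-channel text from a harmony-formatted completion string with a regex. This is for intuition only: the library itself operates on token sequences, not raw strings, and the sample completion here is hypothetical.

```python
import re

# A hypothetical completion string in the harmony format (illustrative only;
# the real library renders and parses token sequences, not raw strings).
completion = (
    "<|channel|>analysis<|message|>User greets us; reply briefly.<|end|>"
    "<|start|>assistant<|channel|>final<|message|>Hello! How can I help?<|return|>"
)

# Map each channel name to its message text using the special-token markers.
channels = dict(
    re.findall(
        r"<\|channel\|>(\w+)<\|message\|>(.*?)(?:<\|end\|>|<\|return\|>)",
        completion,
    )
)
print(channels["final"])     # user-visible answer
print(channels["analysis"])  # chain of thought, not intended for end users
```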

Using Harmony

Python

Check out the full documentation

Installation

Install the package from PyPI by running

pip install openai-harmony
# or if you are using uv
uv pip install openai-harmony

Example

from openai_harmony import (
    load_harmony_encoding,
    HarmonyEncodingName,
    Role,
    Message,
    Conversation,
    DeveloperContent,
    SystemContent,
)
enc = load_harmony_encoding(HarmonyEncodingName.HARMONY_GPT_OSS)
convo = Conversation.from_messages([
    Message.from_role_and_content(
        Role.SYSTEM,
        SystemContent.new(),
    ),
    Message.from_role_and_content(
        Role.DEVELOPER,
        DeveloperContent.new().with_instructions("Talk like a pirate!")
    ),
    Message.from_role_and_content(Role.USER, "Arrr, how be you?"),
])
tokens = enc.render_conversation_for_completion(convo, Role.ASSISTANT)
print(tokens)
# Later, after the model responded …
parsed = enc.parse_messages_from_completion_tokens(tokens, role=Role.ASSISTANT)
print(parsed)

Rust

Check out the full documentation

Installation

Add the dependency to your Cargo.toml

[dependencies]
openai-harmony = { git = "https://github.com/openai/harmony" }

Example

use openai_harmony::chat::{Message, Role, Conversation};
use openai_harmony::{HarmonyEncodingName, load_harmony_encoding};

fn main() -> anyhow::Result<()> {
    let enc = load_harmony_encoding(HarmonyEncodingName::HarmonyGptOss)?;
    let convo =
        Conversation::from_messages([Message::from_role_and_content(Role::User, "Hello there!")]);
    let tokens = enc.render_conversation_for_completion(&convo, Role::Assistant, None)?;
    println!("{:?}", tokens);
    Ok(())
}

Contributing

Most of the rendering and parsing is implemented in Rust for performance and exposed to Python through thin pyo3 bindings.

┌──────────────────┐      ┌───────────────────────────┐
│  Python code     │      │  Rust core (this repo)    │
│  (dataclasses,   │────► │  • chat / encoding logic  │
│   convenience)   │      │  • tokeniser (tiktoken)   │
└──────────────────┘  FFI └───────────────────────────┘

Repository layout

.
├── src/                  # Rust crate
│   ├── chat.rs           # High-level data-structures (Role, Message, …)
│   ├── encoding.rs       # Rendering & parsing implementation
│   ├── registry.rs       # Built-in encodings
│   ├── tests.rs          # Canonical Rust test-suite
│   └── py_module.rs      # PyO3 bindings ⇒ compiled as openai_harmony.*.so
│
├── python/openai_harmony/ # Pure-Python wrapper around the binding
│   └── __init__.py       # Dataclasses + helper API mirroring chat.rs
│
├── tests/                # Python test-suite (1-to-1 port of tests.rs)
├── Cargo.toml            # Rust package manifest
├── pyproject.toml        # Python build configuration for maturin
└── README.md             # You are here 🖖

Developing locally

Prerequisites

  • Rust tool-chain (stable) – https://rustup.rs
  • Python ≥ 3.8 + virtualenv/venv
  • maturin – build tool for PyO3 projects

1. Clone & bootstrap

git clone https://github.com/openai/harmony.git
cd harmony
# Create & activate a virtualenv
python -m venv .venv
source .venv/bin/activate
# Install maturin and test dependencies
pip install maturin pytest mypy ruff  # tailor to your workflow
# Compile the Rust crate *and* install the Python package in editable mode
maturin develop --release

maturin develop builds harmony with Cargo, produces a native extension (openai_harmony.&lt;abi&gt;.so) and places it in your virtualenv next to the pure-Python wrapper – similar to pip install -e . for pure-Python projects.

2. Running the test-suites

Rust:

cargo test          # runs src/tests.rs

Python:

pytest              # executes tests/ (mirrors the Rust suite)

Run both in one go to ensure parity:

pytest && cargo test

3. Type-checking & formatting (optional)

mypy harmony        # static type analysis
ruff check .        # linting
cargo fmt --all     # Rust formatter
