
Interactive Composition Explorer 🧊

ICE is a Python library and trace visualizer for language model programs.

Screenshot

Execution trace visualized in ICE

Features

  • Run language model recipes in different modes: humans, human+LM, LM
  • Inspect the execution traces in your browser for debugging
  • Define and use new language model agents, e.g. chain-of-thought agents
  • Run recipes quickly by parallelizing language model calls
  • Reuse component recipes such as question-answering, ranking, and verification
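To illustrate the parallelization point above, here is a toy sketch in plain asyncio. This is not the ICE API; `fake_lm_call` is a hypothetical stand-in for a real language model call, but the concurrency pattern is the same idea: issue all calls at once rather than awaiting them one by one.

```python
import asyncio

# Toy sketch (not the ICE API): running several model calls concurrently,
# as ICE does when it parallelizes language model calls in a recipe.

async def fake_lm_call(prompt: str) -> str:
    """Hypothetical stand-in for a network call to a language model."""
    await asyncio.sleep(0.01)  # simulate network latency
    return f"completion for: {prompt}"

async def main() -> list:
    prompts = ["rank these", "answer this", "verify that"]
    # gather starts all calls at once and returns results in input order
    return await asyncio.gather(*(fake_lm_call(p) for p in prompts))

results = asyncio.run(main())
print(results)
```

With real API calls, the wall-clock saving scales with the number of independent subtasks in the recipe.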

ICE is pre-1.0

:warning: The ICE API is under active development and may change at any point, including removing functionality, renaming methods, splitting ICE into multiple projects, and other similarly disruptive changes. Use at your own risk.

Requirements

ICE requires Python 3.9, 3.10, or 3.11. If you don't have a supported version of Python installed, we recommend using pyenv to install a supported Python version and manage multiple Python versions.

If you use Windows, you'll need to run ICE inside of WSL.

Getting started

  1. As part of general good Python practice, consider first creating and activating a virtual environment to avoid installing ICE 'globally'. For example:

    python -m venv venv
    source venv/bin/activate
    
  2. Install ICE:

    pip install ought-ice
    
  3. Run the Hello World recipe in the Primer to see the trace rendered.

  4. Optionally, set secrets (like your OpenAI API key) in ~/.ought-ice/.env. See .env.example for the format. If these are not set, you'll be prompted for them when you run recipes that need them.
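As a rough sketch of what that file might contain (the variable name below is an assumption based on the standard OpenAI convention; `.env.example` in the repository is the authoritative reference):

```
# ~/.ought-ice/.env (hypothetical sketch; see .env.example for the real format)
OPENAI_API_KEY=sk-your-key-here
```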

Developing ICE

  1. If you want to make changes to ICE itself, clone the repository, then install it in editable mode:

    python -m venv venv
    source venv/bin/activate
    pip install --upgrade pip
    pip install -e '.[dev]' --config-settings editable_mode=compat
    pre-commit install
    npm --prefix ui ci
    npm --prefix ui run dev
    
  2. If you're working on the backend, you might find it helpful to remove the cache of language model calls:

    rm -r ~/.ought-ice/cache
    
  3. pre-commit complains if your code doesn't pass certain checks. It runs when you commit and may reject the commit, in which case you'll need to fix the problem(s) and commit again (you can reuse the same commit message).

Note that you don't technically need to run pre-commit install, but skipping it may cause your commits to fail CI, which can be noisy; for example, CI may generate extra commits to fix formatting.

Storybook

We use Storybook for UI tests. You can run them locally:

npm --prefix ui run storybook

Note that build-storybook is only for CI and shouldn't be run locally.

Terminology

  • Recipes are decompositions of a task into subtasks.

    The meaning of a recipe is: If a human executed these steps and did a good job at each workspace in isolation, the overall answer would be good. This decomposition may be informed by what we think ML can do at this point, but the recipe itself (as an abstraction) doesn’t know about specific agents.

  • Agents perform atomic subtasks of predefined shapes, like completion, scoring, or classification.

    Agents don't know which recipe is calling them. Agents don’t maintain state between subtasks. Agents generally try to complete all subtasks they're asked to complete (however badly), but some will not have implementations for certain task types.

  • The mode in which a recipe runs is a global setting that can affect every agent call, for instance whether subtasks go to humans or to automated agents. Recipes can also run with RecipeSettings that map a task type to a specific agent_name, overriding which agent is used for that type of task.
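The terminology above can be made concrete with a toy sketch. This is NOT the real ICE API; all names here (`AGENTS`, `run_subtask`, `qa_recipe`) are hypothetical, and it only illustrates the relationships: agents are stateless and don't know which recipe calls them, the mode is a global default, and per-recipe settings can map a task type to a specific agent name.

```python
# Toy sketch, NOT the real ICE API: illustrates recipe/agent/mode terminology.

# "Agents" perform atomic subtasks and keep no state between calls.
AGENTS = {
    "human": lambda prompt: f"(a person would answer: {prompt})",
    "lm": lambda prompt: f"LM completion for: {prompt}",
}

def run_subtask(task_type, prompt, mode="lm", settings=None):
    # Settings override the global mode for specific task types.
    agent_name = (settings or {}).get(task_type, mode)
    return AGENTS[agent_name](prompt)

def qa_recipe(question, **kwargs):
    # A recipe decomposes a task into subtasks of predefined shapes;
    # it doesn't know which concrete agent will run them.
    evidence = run_subtask("generate", f"list evidence about: {question}", **kwargs)
    answer = run_subtask("answer", question, **kwargs)
    return f"{evidence}\n{answer}"

# Route "generate" subtasks to a human while everything else uses the LM.
print(qa_recipe("Does creatine improve cognition?",
                settings={"generate": "human"}))
```

The point of the separation is that the recipe (the decomposition) stays fixed while the routing of subtasks to humans or models can vary per run.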

Additional resources

  1. Join the ICE Slack channel to collaborate with other people composing language model tasks. You can also use it to ask questions about using ICE.

  2. Watch the recording of Ought's Lab Meeting to understand the high-level goals for ICE, how it interacts with Ought's other work, and how it contributes to alignment research.

  3. Read the ICE announcement post for another introduction.

Contributions

ICE is an open-source project by Ought. We're an applied ML lab building the AI research assistant Elicit.

We welcome community contributions:

  • If you're a developer, you can dive into the codebase and help us fix bugs, improve code quality and performance, or add new features.
  • If you're a language model researcher, you can help us add new agents or improve existing ones, and refine or create new recipes and recipe components.

For larger contributions, open an issue for discussion before submitting a PR.

And for even larger contributions, join us - we're hiring!

How to cite

If you use ICE, please cite:

Iterated Decomposition: Improving Science Q&A by Supervising Reasoning Processes. Justin Reppert, Ben Rachbach, Charlie George, Luke Stebbing, Jungwon Byun, Maggie Appleton, Andreas Stuhlmüller (2023). Ought Technical Report. arXiv:2301.01751 [cs.CL]

Bibtex:

@article{reppert2023iterated,
  author = {Justin Reppert and Ben Rachbach and Charlie George and Luke Stebbing and Jungwon Byun and Maggie Appleton and Andreas Stuhlm\"{u}ller},
  archivePrefix = {arXiv},
  eprint = {2301.01751},
  primaryClass = {cs.CL},
  title = {Iterated Decomposition: Improving Science Q\&A by Supervising Reasoning Processes},
  year = 2023,
  keywords = {language models, decomposition, workflow, debugging},
  url = {https://arxiv.org/abs/2301.01751}
}

