
Updated and improved implementation of the self-instruct system.

Project description

airoboros: using large language models to fine-tune large language models

This is my take on implementing the Self-Instruct paper. The approach is quite heavily modified, and does not use any human-generated seeds.

This updated implementation supports either the /v1/completions endpoint or /v1/chat/completions, which is particularly useful because it enables gpt-4 and gpt-3.5-turbo (roughly one tenth the cost of text-davinci-003).
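The two endpoints take differently shaped request bodies: chat models expect a list of messages, while legacy completion models take a plain prompt string. A minimal sketch of how a client might route between them (the helper name and routing rule here are illustrative, not airoboros internals):

```python
OPENAI_BASE = "https://api.openai.com"  # standard OpenAI API base URL

def build_payload(prompt: str, model: str = "gpt-3.5-turbo"):
    """Return (endpoint_path, json_payload) for the given model.

    Chat models (gpt-3.5-turbo, gpt-4) use /v1/chat/completions and take
    a list of messages; older completion models (e.g. text-davinci-003)
    use /v1/completions and take a plain prompt string.
    """
    if model.startswith(("gpt-3.5", "gpt-4")):
        return "/v1/chat/completions", {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }
    return "/v1/completions", {"model": model, "prompt": prompt}
```

The payload would then be POSTed to `OPENAI_BASE` plus the returned path with an `Authorization: Bearer <API key>` header.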

Key differences

  • support for either /v1/completions or /v1/chat/completions APIs (which allows gpt-3.5-turbo instead of text-davinci-003, as well as gpt-4 if you have access)
  • support for custom topics list, custom topic generation prompt, or completely random topics
  • in-memory vector db (Chroma) for similarity comparison, which is much faster than calculating rouge score for each generated instruction
  • (seemingly) better prompts, which include the injection of random topics to relate the instructions to, producing much more diverse synthetic instructions
  • asyncio producers with configurable batch size
  • several "instructors", each targeting specific use-cases, such as Orca style reasoning/math, role playing, etc.
  • tries to ensure the context, if provided, is relevant to the topic and contains all the information that would be necessary to respond to the instruction, not just a link to an article, etc.
  • generally speaking, this implementation tries to reduce some of the noise
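The similarity check can be illustrated without Chroma itself: the idea is to embed each candidate instruction and reject it if it is too close to anything already kept. A hedged sketch of that idea (the bag-of-words embedding and the 0.9 threshold below are stand-ins for illustration, not airoboros code, which uses Chroma with learned embeddings):

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; a real pipeline uses learned embeddings.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def dedupe(instructions, threshold=0.9):
    """Keep an instruction only if no previously kept one is too similar."""
    kept, vectors = [], []
    for text in instructions:
        vec = embed(text)
        if all(cosine(vec, v) < threshold for v in vectors):
            kept.append(text)
            vectors.append(vec)
    return kept
```

An in-memory vector store like Chroma performs the same nearest-neighbor lookup far faster than pairwise ROUGE scoring as the kept set grows.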

Generating instructions

NEW - 2023-07-18

To better accommodate the plethora of options, the configuration has been moved to a YAML config file.

Please create a copy of example-config.yaml and configure as desired.

Once you have the desired configuration, run:

airoboros generate-instructions --config-path /path/to/config.yaml

Generating topics

NEW - 2023-07-18

Again, this is now driven entirely by the YAML configuration! Please create a customized version of the YAML config file, then run:

airoboros generate-topics --config-path /path/to/config.yaml

You can override the topic_prompt string in the configuration to use a different topic generation prompt.
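For example, the override might look like this in the config file (the key name `topic_prompt` is from the docs above; the prompt text itself is purely illustrative):

```yaml
# Custom topic generation prompt, replacing the default.
topic_prompt: "Give me a numbered list of 20 completely random topics."
```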

Support the work

https://bmc.link/jondurbin

Models (research use only):

gpt-4 versions

gpt-3.5-turbo versions

Datasets (subject to OpenAI license):

Coming soon

Scripts for fine-tuning various models using the self-instruct (and human-generated) prompts.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

airoboros-2.0.12.tar.gz (39.5 kB)


Built Distribution

airoboros-2.0.12-py3-none-any.whl (55.7 kB)


File details

Details for the file airoboros-2.0.12.tar.gz.

File metadata

  • Download URL: airoboros-2.0.12.tar.gz
  • Size: 39.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.17

File hashes

Hashes for airoboros-2.0.12.tar.gz:

  • SHA256: 8536852d826981c11cf1f06687f8b0655409786af2d246d220e6175fa7618b09
  • MD5: 1e630c154c555e3c20d632a46dcb4070
  • BLAKE2b-256: 2406c45077e2cc1675343c436f4332b27a46a8757ffee9d5c093e9bdd1aebc62


File details

Details for the file airoboros-2.0.12-py3-none-any.whl.

File metadata

  • Download URL: airoboros-2.0.12-py3-none-any.whl
  • Size: 55.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.17

File hashes

Hashes for airoboros-2.0.12-py3-none-any.whl:

  • SHA256: 71e9ad9a40bc90bec8e15e845a64b4c8d3091a4781c8196308f6349c76898680
  • MD5: 4ff4e23991be10f497a0702ee6ba130f
  • BLAKE2b-256: 3a8e3a5249d6e29894d8ac153aafebde8e6dda8a93b7e741e49ede5f3dae2bfb

