A library for distilling models from prompts.
Prompt2Model - Generate Deployable Models from Instructions
Prompt2Model is a system that takes a natural language task description (like the prompts used for LLMs such as ChatGPT) and trains a small special-purpose model that is suitable for deployment.
Quick Start
Notebook
You can run our demo of Prompt2Model through a notebook.
Command Line
You can also run Prompt2Model through the command line. First, install the package:
pip install prompt2model
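After installing, you can run a quick sanity check that the package is importable. This is just a generic Python import test, not part of prompt2model's documented workflow:

import prompt2model  # the package installed by the command above
print("prompt2model is installed and importable")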
Prompt2Model supports various model providers, such as OpenAI, Anthropic, and Huggingface, through LiteLLM.
If you are using OpenAI models (such as the default gpt-3.5-turbo), please obtain an OpenAI API key from their website, then set the environment variable OPENAI_API_KEY to your API key by running the following command in your terminal:
export OPENAI_API_KEY=<your key>
List of all supported providers
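Before launching the demo, you may want to verify that your API key and provider are working. The sketch below is illustrative: it calls the provider directly through LiteLLM (the routing library mentioned above) rather than through prompt2model itself, and the model name and test message are placeholders you can change.

import os
import litellm  # provider-routing library that prompt2model builds on

# Fail early if the key is missing, rather than partway through dataset generation.
assert os.environ.get("OPENAI_API_KEY"), "Set OPENAI_API_KEY before running the demo."

# One cheap round-trip to confirm the key and provider work.
response = litellm.completion(
    model="gpt-3.5-turbo",  # default model mentioned above; any supported provider works
    messages=[{"role": "user", "content": "Reply with the single word: ok"}],
)
print(response.choices[0].message.content)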
You can then run
python prompt2model_demo.py
to create a small model from a prompt, as shown in the demo video below. This script must be run on a device with an internet connection to access the OpenAI API. For best results, run this script on a device with a GPU for training your model.
Demo
https://github.com/neulab/prompt2model/assets/2577384/8d73394b-3028-4a0b-bdc3-c127082868f2
Tips and Examples to Write a Good Prompt
You can find tips and examples for writing a good prompt in prompt_examples.
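For illustration only (the authoritative guidance lives in prompt_examples), a good prompt usually pairs a clear instruction with a few demonstrations, along these lines:

# Illustrative task description in an instruction-plus-examples style; see
# prompt_examples for the exact formatting conventions the system expects.
prompt = """Instruction: Given a country name, answer with its capital city.

Examples:
Japan -> Tokyo
Kenya -> Nairobi
Peru -> Lima
"""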
Components
The prompt2model package is composed of several components, each designed for a specific purpose. To learn how to use a component effectively, consult the readme.md file in that component's directory, found at ./prompt2model/<component>/readme.md. These files provide detailed instructions on customizing and getting the most out of each component within the package.
Contribution
If you're interested in contributing to the prompt2model project, please
- refer to CONTRIBUTING.md
- open an issue or submit a PR
- join us on Discord
- or reach out to @vijaytarian and @Chenan3_Zhao on Twitter
Cite
We have written a paper describing Prompt2Model in detail.
If you use Prompt2Model in your research, please cite our paper:
@misc{prompt2model,
title={Prompt2Model: Generating Deployable Models from Natural Language Instructions},
author={Vijay Viswanathan and Chenyang Zhao and Amanda Bertsch and Tongshuang Wu and Graham Neubig},
year={2023},
eprint={2308.12261},
archivePrefix={arXiv},
primaryClass={cs.CL}
}