EasyLLM
EasyLLM is an open source project that provides helpful tools and methods for working with large language models (LLMs), both open source and closed source. Get started immediately or check out the documentation.
EasyLLM implements clients that are compatible with OpenAI's Completion API. This means you can easily replace openai.ChatCompletion, openai.Completion, or openai.Embedding with, for example, huggingface.ChatCompletion, huggingface.Completion, or huggingface.Embedding by changing one line of code.
Supported Clients
- huggingface - HuggingFace models
  - huggingface.ChatCompletion - Chat with LLMs
  - huggingface.Completion - Text completion with LLMs
  - huggingface.Embedding - Create embeddings with LLMs
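As a quick sketch of the Completion and Embedding clients (the parameter names assume the OpenAI-style interface described above; the model names are just placeholders, not recommendations):

from easyllm.clients import huggingface

# text completion: send a raw prompt instead of chat messages
completion = huggingface.Completion.create(
    model="meta-llama/Llama-2-70b-chat-hf",
    prompt="The sun is",
    max_tokens=64,
)

# embeddings: turn input text into a vector
embedding = huggingface.Embedding.create(
    model="sentence-transformers/all-MiniLM-L6-v2",
    input="What is the sun?",
)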
Check out the Examples to get started.
🚀 Getting Started
Install EasyLLM via pip:
pip install easyllm
Then import and start using the clients:
from easyllm.clients import huggingface

# helper to build llama2 prompt
huggingface.prompt_builder = "llama2"

response = huggingface.ChatCompletion.create(
    model="meta-llama/Llama-2-70b-chat-hf",
    messages=[
        {"role": "system", "content": "\nYou are a helpful assistant speaking like a pirate. argh!"},
        {"role": "user", "content": "What is the sun?"},
    ],
    temperature=0.9,
    top_p=0.6,
    max_tokens=256,
)
print(response)
The result will look like:
{
  "id": "hf-lVC2iTMkFJ",
  "object": "chat.completion",
  "created": 1690661144,
  "model": "meta-llama/Llama-2-70b-chat-hf",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": " Arrrr, the sun be a big ol' ball o' fire in the sky, me hearty! It be the source o' light and warmth for our fair planet, and it be a mighty powerful force, savvy? Without the sun, we'd be sailin' through the darkness, lost and cold, so let's give a hearty \"Yarrr!\" for the sun, me hearties! Arrrr!"
      },
      "finish_reason": null
    }
  ],
  "usage": {
    "prompt_tokens": 111,
    "completion_tokens": 299,
    "total_tokens": 410
  }
}
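Since the response is a plain dict in the OpenAI format shown above, the assistant's reply can be pulled out the usual way:

# extract just the generated text from the first choice
print(response["choices"][0]["message"]["content"])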
Check out other examples:
- Detailed ChatCompletion Example
- Example how to stream chat requests
- Example how to stream text requests
- Detailed Completion Example
- Create Embeddings
See the documentation for more detailed usage and examples.
💪🏻 Migration from OpenAI to HuggingFace
Migrating from OpenAI to HuggingFace is easy: just change the import statement and the client you want to use, and optionally the prompt builder.
- import openai
+ from easyllm.clients import huggingface
+ huggingface.prompt_builder = "llama2"

- response = openai.ChatCompletion.create(
+ response = huggingface.ChatCompletion.create(
-     model="gpt-3.5-turbo",
+     model="meta-llama/Llama-2-70b-chat-hf",
      messages=[
          {"role": "system", "content": "You are a helpful assistant."},
          {"role": "user", "content": "Knock knock."},
      ],
  )
Make sure when you switch your client that your hyperparameters are still valid. For example, the temperature of GPT-3 may behave differently than the temperature of Llama 2.
☑️ Key Features
🤝 Compatible Clients
- Implementation of clients compatible with the OpenAI API format of openai.ChatCompletion, openai.Completion, and openai.Embedding.
- Easily switch between different LLMs like openai.ChatCompletion and huggingface.ChatCompletion by changing one line of code.
- Support for streaming of completions; check out the example How to stream completions and the sketch below.
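As a rough sketch of what streaming looks like, assuming the client accepts the OpenAI-style stream=True flag and yields OpenAI-style delta chunks (see the linked example for the exact usage):

from easyllm.clients import huggingface

huggingface.prompt_builder = "llama2"

# with stream=True, create() yields chunks instead of a single response
for chunk in huggingface.ChatCompletion.create(
    model="meta-llama/Llama-2-70b-chat-hf",
    messages=[{"role": "user", "content": "Count to ten."}],
    stream=True,
):
    # each chunk carries a small piece of the generated text
    print(chunk["choices"][0]["delta"].get("content", ""), end="")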
⚙️ Helper Modules
- evol_instruct (work in progress) - Use evolutionary algorithms to create instructions for LLMs.
- prompt_utils - Helper methods to easily convert between prompt formats, e.g. from OpenAI Messages to prompts for open source models like Llama 2; see the sketch after this list.
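For example, a minimal sketch of prompt_utils for Llama 2, assuming a build_llama2_prompt helper that mirrors what prompt_builder = "llama2" does under the hood:

from easyllm.prompt_utils import build_llama2_prompt

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is the sun?"},
]

# convert OpenAI-style messages into a single Llama 2 prompt string
prompt = build_llama2_prompt(messages)
print(prompt)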
🙏 Contributing
EasyLLM is an open source project and welcomes contributions of all kinds.
The project uses hatch for development. To get started, fork the repository and clone it to your local machine.
- Confirm hatch is installed (pipx is great to make it available globally on your machine).
- Once in the project directory, run hatch env create to create a default virtual environment for development.
- Activate the virtual environment with hatch shell.
- Start developing! 🤩
📔 Citation & Acknowledgements
If you use EasyLLM, please share it with me on social media or email. I would love to hear about it! You can also cite the project using the following BibTeX:
@software{Philipp_Schmid_EasyLLM_2023,
  author = {Philipp Schmid},
  license = {Apache-2.0},
  month = jul,
  title = {EasyLLM: Streamlined Tools for LLMs},
  url = {https://github.com/philschmid/easyllm},
  year = {2023}
}
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution: easyllm-0.4.0.tar.gz
Built Distribution: easyllm-0.4.0-py3-none-any.whl
File details
Details for the file easyllm-0.4.0.tar.gz.
File metadata
- Download URL: easyllm-0.4.0.tar.gz
- Upload date:
- Size: 47.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.11.4
File hashes
Algorithm | Hash digest
---|---
SHA256 | 5d68461f6247267cc51505f982788889cec87b3beac74dfd5fc662f57f5e19c9
MD5 | f66b0f41ed9fec931f6f517cba118b8b
BLAKE2b-256 | f6e04be805d380831168aacfa8272c2955f2c41a50a58a347b29fc30f608e3d7
File details
Details for the file easyllm-0.4.0-py3-none-any.whl.
File metadata
- Download URL: easyllm-0.4.0-py3-none-any.whl
- Upload date:
- Size: 23.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.11.4
File hashes
Algorithm | Hash digest
---|---
SHA256 | 99f433daff0d38f6c2b40fb60bc9ba7502c5406184913eac0d7eb5b37e3a1d15
MD5 | d2731ddaaad4a4c1cdd71014e7385eae
BLAKE2b-256 | c55540d8f7270efde507440225c10c42cbd0b1fb2a956d5370de3a0e72cb19aa
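To verify a downloaded file against the digests above, a minimal sketch in Python (assuming the sdist sits in the working directory):

import hashlib

# compute the local file's SHA256 and compare it to the published digest
with open("easyllm-0.4.0.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

assert digest == "5d68461f6247267cc51505f982788889cec87b3beac74dfd5fc662f57f5e19c9"
print("SHA256 OK")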