# magic-lamp
Easily integrate LLM calls into Python code. By default, uses a local LLM.
## Quickstart
```shell
pip install magic-lamp
```
Define a function with a natural-language description and a set of input/output examples.
```python
import magic_lamp

format_name = magic_lamp.Function(
    'Format this surname in a way that it would be written out.',
    examples=[
        ("PERALTA", "Peralta"),
        ("OCONNELL", "O'Connel"),
        ("MCDONALD", "McDonald"),
    ],
)

print(format_name("MCDOWELL"))
```
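Conceptually, a `Function` like this combines the description and examples into a few-shot prompt for the LLM. The following is a minimal sketch of that pattern, not magic-lamp's actual internals; the prompt format and helper name are illustrative assumptions:

```python
def build_few_shot_prompt(description, examples, value):
    """Assemble a few-shot prompt (hypothetical format) from a task
    description, (input, output) example pairs, and the new input."""
    lines = [description, ""]
    for given, expected in examples:
        lines.append(f"Input: {given}")
        lines.append(f"Output: {expected}")
    # The model completes the final "Output:" line.
    lines.append(f"Input: {value}")
    lines.append("Output:")
    return "\n".join(lines)

prompt = build_few_shot_prompt(
    'Format this surname in a way that it would be written out.',
    [("PERALTA", "Peralta"), ("MCDONALD", "McDonald")],
    "MCDOWELL",
)
print(prompt)
```

The library then sends a prompt along these lines to the configured model and returns the completion as the function's result.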
## Configuring the LLM
By default, `magic-lamp` downloads and runs a local LLM from Hugging Face. For more complex tasks, OpenAI models will perform better.
### Using OpenAI
`OPENAI_API_KEY` must be set in the environment. Pass the name of a `gpt-*` model to the `Function` constructor.
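For example, in a POSIX shell (the key value below is a placeholder, not a real credential):

```shell
# Export the key so magic-lamp can read it from the environment
export OPENAI_API_KEY="sk-..."
```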
```python
import magic_lamp

format_number = magic_lamp.Function(
    'Write this number out in words.',
    examples=[
        ("1", "one"),
        ("35", "thirty-five"),
        ("15690", "fifteen thousand, six hundred ninety"),
    ],
    model="gpt-4o-mini",
)

print(format_number("328745226793"))
```
## Links
- https://github.com/jackmpcollins/magentic - A similar concept but using decorators.
- https://github.com/abetlen/llama-cpp-python - Used by this library to run the local LLM.
- https://ai.meta.com/blog/meta-llama-3-1/ - Llama 3.1 8b is the default model.
- https://huggingface.co/bullerwins/Meta-Llama-3.1-8B-Instruct-GGUF - Uses these GGUFs by default.