A functional programming interface for building AI systems
λprompt - Turn prompts into functions
pip install lambdaprompt
lambdaprompt is a Python package, ...
- minimalistic API
- functional helpers
- create complex and emergent behavior
To use OpenAI, set up API keys as environment variables, or set them after importing (it's also easy to just make a `.env` file, since this uses the python-dotenv package):
OPENAI_API_KEY=...
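For example, a minimal sketch of setting the key in code before importing lambdaprompt (the value shown is a placeholder, not a real key):

```python
import os

# Set the key before importing lambdaprompt; alternatively, put
# OPENAI_API_KEY=... in a .env file and python-dotenv will pick it up.
# The value below is a placeholder.
os.environ["OPENAI_API_KEY"] = "sk-..."
```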
Creating a prompt
Prompts use Jinja templating to create a string, and the string is passed to the LLM for completion.
```python
from lambdaprompt import GPT3Prompt

first = GPT3Prompt("Sally had {{ number }} of {{ thing }}. Sally sold ")
# then use it as a function
first(number=12, thing="apples")
```
You can also turn any function into a prompt (useful for composing prompts, or creating programs out of prompts).
```python
from lambdaprompt import prompt

@prompt
def standard_function(text_input):
    # is_a_question and answer_the_question are other prompts, defined elsewhere
    res = is_a_question(text_input)
    if res.lower().startswith('yes'):
        return answer_the_question(text_input)
    else:
        return "That was not a question, please try again"
```
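The helper prompts `is_a_question` and `answer_the_question` aren't defined in the snippet above; a sketch of how the composition works, using plain stand-in functions in place of real `GPT3Prompt` instances (so no API key is needed to try it):

```python
# Stand-ins for prompts such as
# GPT3Prompt("Is the following a question? Answer yes or no: {{ text }}")
def is_a_question(text):
    return "yes" if text.strip().endswith("?") else "no"

def answer_the_question(text):
    return f"(an answer to {text!r})"

def standard_function(text_input):
    res = is_a_question(text_input)
    if res.lower().startswith("yes"):
        return answer_the_question(text_input)
    return "That was not a question, please try again"

print(standard_function("What time is it?"))
print(standard_function("Hello there"))
```

In real use, both helpers would themselves be prompts, so `standard_function` becomes a small program stitched together out of LLM calls.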
Using a prompt -- just call it like a function:
```python
first(number=12, thing="apples")
```
Some examples:
```python
>>> from lambdaprompt.gpt3 import GPT3Prompt, GPT3Edit, AsyncGPT3Edit, AsyncGPT3Prompt
>>> first = GPT3Prompt("Sally had {{ number }} of {{ thing }}. Sally sold ")
>>> first(number=12, thing="apples")
' 8 of the apples.\n\nSally now has 4 apples.'
>>> wow = AsyncGPT3Edit("Turn this into a {{ joke_style }} joke")
>>> await wow(joke_style="american western", input="Sally ate a lot of food")
'Sally ate a lot of food.\nShe was a cowgirl.\n'
```
Some special properties
- For prompts with only a single variable, you can call them directly with the variable as a positional argument (no need to pass it as a kwarg)
```python
basic_qa = AsyncGPT3Prompt("""What is the answer to the question [{{ question }}]?""")
await basic_qa("Is it safe to eat pizza with chopsticks?")
```
- You can use functional primitives to create more complex prompts
```python
print(*map(basic_qa, ["Is it safe to eat pizza with chopsticks?", "What is the capital of France?"]))
```
IN PROGRESS (I think this doesn't work as expected yet...)
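One likely reason: each `AsyncGPT3Prompt` call returns a coroutine, so `map` alone produces un-awaited coroutines. A sketch of running the mapped calls concurrently with `asyncio.gather`, using a stand-in coroutine in place of a real prompt:

```python
import asyncio

async def basic_qa(question):
    # stand-in for an AsyncGPT3Prompt; a real prompt would call the API here
    return f"answer to: {question}"

async def main():
    questions = ["Is it safe to eat pizza with chopsticks?",
                 "What is the capital of France?"]
    # gather awaits all the mapped coroutines concurrently
    return await asyncio.gather(*map(basic_qa, questions))

answers = asyncio.run(main())
print(*answers, sep="\n")
```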
- You can apply these to pandas DataFrames to do analytics quickly using LLMs
```python
import pandas as pd
from lambdaprompt import GPT3Prompt

df = pd.DataFrame({'country': ["France", "Japan", "USA"]})
df['capitals'] = df.apply(GPT3Prompt("""What is the capital of {{ country }}?"""), axis=1)
```
Advanced usage
Pre- and post-call hooks (tracing and logging)
```python
lambdaprompt.register_callback(lambda *x: print(x))
```
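For example, a callback that records each event with a timestamp before printing it (the exact payload lambdaprompt passes to callbacks is an assumption here, so a sample event is simulated below):

```python
import datetime

trace_log = []

def trace(*event):
    # record every callback event with a wall-clock timestamp
    trace_log.append((datetime.datetime.now().isoformat(), event))
    print(*event)

# lambdaprompt.register_callback(trace)
# simulate one event to show the record shape:
trace("basic_qa", {"question": "What is 2+2?"})
```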
Design Patterns
- Response Optimization
- Summarization and Aggregations
- Meta-Prompting
Contributions are welcome.