# λprompt - Turn prompts into functions

A functional programming interface for building AI systems.

**UNDER CONSTRUCTION**
```bash
pip install lambdaprompt
```

lambdaprompt is a Python package offering:

- a minimalistic API
- functional helpers
- the ability to create complex and emergent behavior
For using OpenAI, set up API keys as environment variables, or set them after importing (it is also easy to just make a `.env` file, since this uses the `dotenv` package):

```bash
OPENAI_API_KEY=...
```

or

```python
import lambdaprompt

lambdaprompt.setup(openai_api_key='...')
```
## Creating a prompt

Prompts use Jinja templating to create a string; the string is passed to the LLM for completion.
```python
from lambdaprompt import asyncGPT3Prompt

deeper_website_choice = asyncGPT3Prompt(
    "deeper_website_choice",
    """
For the question [{{ question }}], the search results are
{{ search_results }}
In order to answer the question, which three page indices (0-9 from above) should be further investigated? (eg. [2, 7, 9])
[""",
    stop="]",
)
```
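To see what the LLM actually receives, here is a minimal stand-in for the templating step (a simple regex substitution rather than real Jinja, purely for illustration), plus how a `stop` sequence like `"]"` truncates the raw completion:

```python
import re

# Toy stand-in for Jinja rendering: replace {{ var }} with values.
# (lambdaprompt uses real Jinja templates; this only illustrates the idea.)
def render(template, **variables):
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(variables[m.group(1)]),
        template,
    )

template = "For the question [{{ question }}], the search results are\n{{ search_results }}\n["
prompt_text = render(
    template,
    question="What is the capital of France?",
    search_results="0: result 0, 1: france",
)
print(prompt_text)

# A stop sequence like stop="]" cuts the completion at the first "]",
# so a raw model output of "2, 7, 9] and more text" yields "2, 7, 9".
completion = "2, 7, 9] and more text"
answer = completion.split("]")[0]
print(answer)  # 2, 7, 9
```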
You can also turn any function into a prompt (useful for composing prompts, or creating programs out of prompts):
```python
from lambdaprompt import prompt

@prompt
def standard_function(text_input):
    res = is_a_question(text_input)
    if res.lower().startswith('yes'):
        return answer_the_question(text_input)
    else:
        return "That was not a question, please try again"
```
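The decorator pattern above can be sketched with a toy stand-in: a wrapper that records each call before delegating, which is roughly the hook point used for tracing (the real `prompt` decorator does more; this is illustrative only, with naive stand-ins for the LLM calls):

```python
import functools

call_log = []

# Toy stand-in for lambdaprompt's @prompt decorator (illustrative only):
# wrap a plain function so every call is traced before delegating.
def toy_prompt(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        call_log.append((fn.__name__, args, kwargs))
        return fn(*args, **kwargs)
    return wrapper

@toy_prompt
def is_a_question(text):
    # Stand-in for an LLM call: a naive heuristic.
    return "yes" if text.strip().endswith("?") else "no"

@toy_prompt
def standard_function(text_input):
    if is_a_question(text_input).startswith("yes"):
        return "answering: " + text_input
    return "That was not a question, please try again"

print(standard_function("What is 2+2?"))  # answering: What is 2+2?
print(len(call_log))                      # 2 (standard_function, then is_a_question)
```

Because decorated functions stay ordinary callables, they compose freely: `standard_function` calls `is_a_question`, and both calls are traced.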
## Using a prompt

```python
await deeper_website_choice(question="What is the capital of France?", search_results="0: result 0, 1: france, 2: another thing, ...")
```
## Some special properties

- For prompts with only a single variable, you can call the prompt directly with the variable as a positional argument (no need to pass it as a kwarg):
```python
basic_qa = asyncGPT3Prompt("basic_qa", """What is the answer to the question [{{ question }}]?""")

await basic_qa("Is it safe to eat pizza with chopsticks?")
```
- You can use functional primitives to create more complex prompts (since `basic_qa` is async, the mapped coroutines need to be awaited, e.g. with `asyncio.gather`):

```python
import asyncio

print(*await asyncio.gather(*map(basic_qa, ["Is it safe to eat pizza with chopsticks?", "What is the capital of France?"])))
```
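A runnable sketch of this concurrent mapping, with a stub coroutine in place of a real `asyncGPT3Prompt` (no API key needed; `fake_qa` is a hypothetical stand-in):

```python
import asyncio

# Stub async "prompt" standing in for an asyncGPT3Prompt (no API call).
async def fake_qa(question):
    await asyncio.sleep(0)  # yield control, as a real network call would
    return f"answer to: {question}"

async def main():
    questions = [
        "Is it safe to eat pizza with chopsticks?",
        "What is the capital of France?",
    ]
    # gather runs all coroutines concurrently and preserves input order
    return await asyncio.gather(*map(fake_qa, questions))

results = asyncio.run(main())
print(results)
```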
- You can apply these to pandas DataFrames to do analytics quickly using LLMs:

```python
import pandas as pd
from lambdaprompt import GPT3Prompt

df = pd.DataFrame({'country': ["France", "Japan", "USA"]})
df['capitals'] = df.apply(GPT3Prompt("basic_qa", """What is the capital of {{ country }}?"""), axis=1)
```
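The `df.apply(..., axis=1)` call works because a `GPT3Prompt` is itself a callable that receives each row's fields as template variables. The pattern can be sketched without pandas or an API key, using plain dict rows and a stub callable (`fake_country_prompt` is a hypothetical stand-in that returns the rendered prompt string rather than a model answer):

```python
# Rows as plain dicts, standing in for DataFrame rows.
rows = [{"country": "France"}, {"country": "Japan"}, {"country": "USA"}]

def fake_country_prompt(row):
    # A real GPT3Prompt would render "What is the capital of {{ country }}?"
    # and send it to the model; here we only produce the rendered string.
    return f"What is the capital of {row['country']}?"

# Equivalent of: df['capitals'] = df.apply(prompt, axis=1)
question_column = [fake_country_prompt(row) for row in rows]
print(question_column)
```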
## Bonus

There is also a `GPT3Edit` class that can be used to edit text.
## Advanced usage

### Pre- and post-call hooks (tracing and logging)

This is just on all the time right now... it makes a sqlite db.

```python
lambdaprompt.register(pre=print, post=print)
```

### Lightweight web server for calling prompts (useful for attaching to JS webapps)

```bash
uvicorn lambdaprompt.promptapi:app --reload
```
## Design Patterns
- Response Optimization
- Summarization and Aggregations
- Meta-Prompting
Contributions are welcome