λprompt - A functional programming interface for building AI systems

lambdaprompt is a Python package that provides a functional programming interface for building AI systems.
Getting started
Install:

pip install lambdaprompt
To use OpenAI models, set your API key as an environment variable, or set it after importing:

OPENAI_API_KEY=...

import lambdaprompt; lambdaprompt.setup(openai_api_key='...')
Try it out in colab [link]
Library demos and examples
[[ See Here ]]
How to
Map
prompt.map(["yes", "no"])
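As a minimal sketch of the idea (a plain function stands in for a real LLM prompt; `classify` and `prompt_map` are illustrative names, not the library's API), mapping applies one prompt to each item of a list:

```python
# Sketch: map a "prompt" over a list of inputs.
def classify(text: str) -> str:
    # A real prompt would call the model; here we fake a response.
    return "positive" if "yes" in text.lower() else "negative"

def prompt_map(prompt, inputs):
    # Apply the prompt to each input, collecting one response per input.
    return [prompt(x) for x in inputs]

results = prompt_map(classify, ["yes", "no"])
print(results)  # prints ['positive', 'negative']
```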
Reduce
prompt.reduce(...)
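A hedged sketch of the reduce pattern (again with a plain function in place of an LLM call; `combine` and `prompt_reduce` are hypothetical): a combining prompt folds a list of results into one.

```python
from functools import reduce

# Sketch: reduce a list of texts into one via a combining "prompt".
def combine(a: str, b: str) -> str:
    # A real reduce prompt might ask the model to merge two summaries.
    return a + " " + b

def prompt_reduce(prompt, inputs):
    # Fold the inputs pairwise with the combining prompt.
    return reduce(prompt, inputs)

print(prompt_reduce(combine, ["first", "second", "third"]))
# prints "first second third"
```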
Composition
…
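Composition means feeding one prompt's output into another. A minimal sketch under assumed names (`extract_topic`, `write_headline`, and `pipeline` are illustrative stand-ins for LLM prompts):

```python
# Sketch: compose prompts by piping one output into the next.
def extract_topic(text: str) -> str:
    # Stand-in for an extraction prompt.
    return text.split()[0]

def write_headline(topic: str) -> str:
    # Stand-in for a generation prompt.
    return f"Breaking news about {topic}!"

def pipeline(text: str) -> str:
    # The output of the first prompt becomes the input of the second.
    return write_headline(extract_topic(text))

print(pipeline("Python tops the language rankings again"))
# prints "Breaking news about Python!"
```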
For Pandas Users (Useful for data processing)
df.prompt.apply(...)
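One way such a `df.prompt.apply(...)` interface can be wired up is through pandas' public accessor-registration API. This is a sketch, not the library's implementation; the accessor name `demo_prompt` and its `apply` signature are assumptions, and a plain function stands in for the LLM call:

```python
import pandas as pd

# Sketch: register a custom DataFrame accessor so df.demo_prompt.apply(...)
# can route a column through a "prompt" function.
@pd.api.extensions.register_dataframe_accessor("demo_prompt")
class PromptAccessor:
    def __init__(self, pandas_obj):
        self._obj = pandas_obj

    def apply(self, fn, column):
        # Apply the prompt function to one column, returning a Series.
        return self._obj[column].apply(fn)

df = pd.DataFrame({"text": ["hello", "world"]})
print(df.demo_prompt.apply(str.upper, "text").tolist())
# prints ['HELLO', 'WORLD']
```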
Making Prompts
LLM JINJA templates
prompt = LLM("goal of prompt", """
{{ template }}
...
""")
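Before the model is called, the Jinja-style placeholders get filled from the call's arguments. A stdlib-only sketch of that rendering step (the `render` helper below is illustrative, standing in for real Jinja2 rendering):

```python
import re

# Sketch: fill "{{ name }}" placeholders in a template before
# sending the resulting string to the model.
def render(template: str, **context) -> str:
    def sub(match):
        # Look up the placeholder name in the provided context.
        return str(context[match.group(1)])
    return re.sub(r"\{\{\s*(\w+)\s*\}\}", sub, template)

template = "Translate to French: {{ text }}"
print(render(template, text="good morning"))
# prints "Translate to French: good morning"
```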
Decorator
@promptify
def exclamation(arg):
    return arg + "!" * 10
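A hedged sketch of what such a decorator could do (this is not the library's implementation): wrap a plain function so the tooling can discover and trace it as a prompt.

```python
import functools

# Hypothetical sketch of a promptify-style decorator.
def promptify(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        # A real implementation might log the call or route through an LLM.
        return fn(*args, **kwargs)
    wrapper.is_prompt = True  # mark it so tooling can discover prompts
    return wrapper

@promptify
def exclamation(arg):
    return arg + "!" * 10

print(exclamation("wow"))  # prints "wow!!!!!!!!!!"
```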
Advanced usage
Pre- and post-call hooks (tracing and logging) [see example]
lambdaprompt.register(pre=print, post=print)
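The hook pattern can be sketched as follows (the `register` and `call_prompt` names here are illustrative, not the library's internals): registered callables run before and after every prompt invocation, which is enough for tracing or logging.

```python
# Sketch: pre/post call hooks around each prompt invocation.
_hooks = {"pre": [], "post": []}

def register(pre=None, post=None):
    # Register tracing/logging callables.
    if pre:
        _hooks["pre"].append(pre)
    if post:
        _hooks["post"].append(post)

def call_prompt(prompt, *args):
    for hook in _hooks["pre"]:
        hook(("calling", prompt.__name__, args))
    result = prompt(*args)
    for hook in _hooks["post"]:
        hook(("returned", prompt.__name__, result))
    return result

log = []
register(pre=log.append, post=log.append)
print(call_prompt(str.upper, "trace me"))  # prints "TRACE ME"
```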
Design Patterns
- Response Optimization
- Summarization and Aggregations
- Meta-Prompting
Contributions are welcome
To add: An issue template
To add: A pull request template
TODO: Verify, via a signature check, that all dependent prompts in the library are correct. This ensures that when someone changes an upstream prompt, they must at least see all dependent prompts that they should update.