
A functional programming interface for building AI systems


λprompt - Turn prompts into functions

pip install lambdaprompt

lambdaprompt is a Python package that offers:

  • a minimalistic API
  • functional helpers for composing prompts
  • a way to create complex and emergent behavior
  • a webserver mode, for easily hosting prompts as HTTP endpoints

To use OpenAI, set up your API key as an environment variable, or set it after importing (a .env file also works, since this package uses dotenv):

OPENAI_API_KEY=...

Creating a prompt

Prompts use Jinja templating to build a string; that string is then passed to the LLM for completion.

from lambdaprompt import GPT3Prompt

first = GPT3Prompt("Sally had {{ number }} of {{ thing }}. Sally sold ")
# then use it as a function
first(number=12, thing="apples")
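The templating step can be previewed without calling any model. This sketch uses the jinja2 package directly to render the same string lambdaprompt would build internally (an illustration, not how you'd normally use the library):

```python
from jinja2 import Template

# Render the prompt template the same way lambdaprompt would,
# without sending anything to an LLM.
template = Template("Sally had {{ number }} of {{ thing }}. Sally sold ")
rendered = template.render(number=12, thing="apples")
print(repr(rendered))  # 'Sally had 12 of apples. Sally sold '
```

This rendered string is what gets sent to the completion endpoint.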

You can also turn any function into a prompt (useful for composing prompts, or creating programs out of prompts):

from lambdaprompt import prompt

@prompt
def standard_function(text_input):
    # is_a_question and answer_the_question are assumed to be
    # prompts defined elsewhere
    res = is_a_question(text_input)
    if res.lower().startswith('yes'):
        return answer_the_question(text_input)
    else:
        return "That was not a question, please try again"
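Conceptually, a decorator like @prompt wraps a plain function so its calls can be traced and composed with other prompts. Here is a toy stand-in (not lambdaprompt's actual implementation) that shows the wrapping pattern:

```python
import functools

def toy_prompt(func):
    """Toy stand-in for @prompt: wraps a function and records its calls."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        wrapper.calls.append((args, kwargs))  # record call metadata
        return func(*args, **kwargs)          # run the wrapped function
    wrapper.calls = []
    return wrapper

@toy_prompt
def shout(text):
    return text.upper() + "!"

print(shout("hello"))  # HELLO!
print(shout.calls)     # [(('hello',), {})]
```

The real decorator does more (async support, callbacks, registration for the webserver), but the shape is the same: the decorated function stays callable as before.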

Using a prompt -- just call it like a function

first(number=12, thing="apples")

Some examples

>>> from lambdaprompt.gpt3 import GPT3Prompt, GPT3Edit, AsyncGPT3Edit, AsyncGPT3Prompt
>>> first = GPT3Prompt("Sally had {{ number }} of {{ thing }}. Sally sold ")
>>> first(number=12, thing="apples")
' 8 of the apples.\n\nSally now has 4 apples.'
>>> wow = AsyncGPT3Edit("Turn this into a {{ joke_style }} joke")
>>> await wow(joke_style="american western", input="Sally ate a lot of food")
'Sally ate a lot of food.\nShe was a cowgirl.\n'

Some special properties

  1. For prompts with only a single variable, you can call the prompt with the variable as a positional argument (no need to pass it as a kwarg)
basic_qa = AsyncGPT3Prompt("""What is the answer to the question [{{ question }}]?""", name="basic_qa")

await basic_qa("Is it safe to eat pizza with chopsticks?")
  2. You can use functional primitives to create more complex prompts
print(*map(basic_qa, ["Is it safe to eat pizza with chopsticks?", "What is the capital of France?"]))
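With async prompts, the map() pattern also combines naturally with asyncio.gather to run several prompt calls concurrently. A sketch with a stub coroutine standing in for a real AsyncGPT3Prompt (no API key needed):

```python
import asyncio

async def stub_qa(question):
    # Stand-in for an AsyncGPT3Prompt call; a real prompt would hit the API.
    return f"answer to: {question}"

async def ask_all(questions):
    # Like map(), but the coroutine calls run concurrently.
    return await asyncio.gather(*map(stub_qa, questions))

answers = asyncio.run(ask_all([
    "Is it safe to eat pizza with chopsticks?",
    "What is the capital of France?",
]))
print(answers)
```

Swapping stub_qa for a real async prompt keeps the same structure while the network calls overlap.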

Using lambdaprompt as a webservice

make a file

app.py

from lambdaprompt import AsyncGPT3Prompt, prompt
from lambdaprompt.server.main import app

AsyncGPT3Prompt(
    """Rewrite the following as a {{ target_author }}. 
```
{{ source_text }}
```
Output:
```
""",
    name="rewrite_as",
    stop="```",
)

Then run

uvicorn app:app --reload

Browse to http://localhost:8000/docs to see the Swagger docs generated for the prompts!

IN PROGRESS (I think this doesn't work as expected yet...)

  1. You can apply these to pandas dataframes to do analytics quickly using LLMs
import pandas as pd
from lambdaprompt import GPT3Prompt

df = pd.DataFrame({'country': ["France", "Japan", "USA"]})
df['capitals'] = df.apply(GPT3Prompt("""What is the capital of {{ country }}?"""), axis=1)
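While that integration is in progress, the intended row-wise apply pattern can be exercised with a stand-in callable in place of the prompt. The lookup table here is illustrative, not model output:

```python
import pandas as pd

def stub_capital(row):
    # Stand-in for GPT3Prompt("What is the capital of {{ country }}?").
    # df.apply(..., axis=1) passes each row as a Series, so the callable
    # pulls fields out of the row rather than receiving kwargs.
    lookup = {"France": "Paris", "Japan": "Tokyo", "USA": "Washington, D.C."}
    return lookup.get(row["country"], "unknown")

df = pd.DataFrame({"country": ["France", "Japan", "USA"]})
df["capitals"] = df.apply(stub_capital, axis=1)
print(df)
```

Note the shape mismatch this exposes: axis=1 hands the callable a whole row, while a prompt expects named template variables, which is likely why the direct df.apply(GPT3Prompt(...)) form doesn't work as expected yet.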

Advanced usage

Pre-and-post call hooks (tracing and logging)

lambdaprompt.register_callback(lambda *x: print(x))
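The hook mechanism can be pictured as a simple callback registry. This toy version (not lambdaprompt's actual internals) shows how registered callbacks fire around prompt calls:

```python
_callbacks = []

def register_callback(fn):
    # Toy version of lambdaprompt.register_callback: store a hook
    # to be fired whenever a prompt runs.
    _callbacks.append(fn)

def fire_callbacks(name, result):
    # Invoke every registered hook with the call's metadata.
    for cb in _callbacks:
        cb(name, result)

seen = []
register_callback(lambda name, result: seen.append((name, result)))
fire_callbacks("basic_qa", "42")
print(seen)  # [('basic_qa', '42')]
```

A print-based callback like the one above gives quick-and-dirty tracing; a logging or metrics client slots in the same way.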

Running lambdaprompt webserver (example, for dev)

Built on FastAPI; a simple Dockerfile is included here too.

docker build -t lambdaprompt:latest . --build-arg mode=dev
docker run -it -v $(pwd):/code -p 4412:80 lambdaprompt:latest

For prod build

docker build -t lambdaprompt:latest .

Design Patterns

Contributing

Contributions are welcome!
