A functional programming interface for building AI systems

λprompt - Turn prompts into functions

UNDER CONSTRUCTION: λprompt is a work in progress.

pip install lambdaprompt

lambdaprompt is a Python package with:

  • minimalistic API
  • functional helpers
  • building blocks for complex and emergent behavior

To use OpenAI, set your API key as an environment variable, or set it after importing (you can also put it in a .env file, since lambdaprompt uses the dotenv package):

OPENAI_API_KEY=...

import lambdaprompt; lambdaprompt.setup(openai_api_key='...')

Creating a prompt

Prompts use Jinja templating to build a string; the string is then passed to the LLM for completion.

from lambdaprompt import asyncGPT3Prompt

deeper_website_choice = asyncGPT3Prompt(
    "deeper_website_choice",
    """
For the question [{{ question }}], the search results are
{{ search_results }}
In order to answer the question, which three page indices (0-9 from above) should be further investigated? (eg. [2, 7, 9])
[""",
    stop="]",
)
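
As a rough illustration of what the templating step produces, here is a minimal stand-in for Jinja's `{{ var }}` substitution (lambdaprompt itself renders templates with the real Jinja engine; this is just a sketch of the idea):

```python
import re

# Minimal stand-in for Jinja's {{ var }} substitution, for illustration only;
# lambdaprompt renders templates with the real Jinja engine.
def render(template, **variables):
    return re.sub(r"\{\{\s*(\w+)\s*\}\}",
                  lambda m: str(variables[m.group(1)]),
                  template)

prompt_text = render(
    "For the question [{{ question }}], the search results are\n{{ search_results }}",
    question="What is the capital of France?",
    search_results="0: result 0, 1: france",
)
print(prompt_text)
# → For the question [What is the capital of France?], the search results are
#   0: result 0, 1: france
```

The fully rendered string is what the LLM actually sees, which is why the template above ends with `[` and uses `stop="]"`: the model's completion is cut off at the closing bracket.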

You can also turn any function into a prompt (useful for composing prompts, or for creating programs out of prompts).

from lambdaprompt import prompt

@prompt
def standard_function(text_input):
    # is_a_question and answer_the_question are other prompts defined elsewhere
    res = is_a_question(text_input)
    if res.lower().startswith('yes'):
        return answer_the_question(text_input)
    else:
        return "That was not a question, please try again"
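
To make the control flow concrete, here is a self-contained sketch of the same pattern with hypothetical stand-ins for the two sub-prompts (canned logic in place of real LLM calls):

```python
# Hypothetical stand-ins for the is_a_question and answer_the_question
# prompts; a real version would call the LLM instead of using canned logic.
def is_a_question(text):
    return "Yes" if text.rstrip().endswith("?") else "No"

def answer_the_question(text):
    return f"(an answer to: {text})"

def standard_function(text_input):
    res = is_a_question(text_input)
    if res.lower().startswith("yes"):
        return answer_the_question(text_input)
    return "That was not a question, please try again"

print(standard_function("What is the capital of France?"))
print(standard_function("hello"))
```

The decorated function stays ordinary Python: branching, loops, and early returns all work, with prompts called like any other function.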

Using a prompt

await deeper_website_choice(question="What is the capital of France?", search_results="0: result 0, 1: france, 2: another thing, ...")

Some special properties

  1. For prompts with only a single variable, you can call the prompt directly with the variable as a positional argument (no need to pass it as a kwarg):
basic_qa = asyncGPT3Prompt("basic_qa", """What is the answer to the question [{{ question }}]?""")

await basic_qa("Is it safe to eat pizza with chopsticks?")
  2. You can use functional primitives to create more complex prompts:
print(*map(basic_qa, ["Is it safe to eat pizza with chopsticks?", "What is the capital of France?"]))
  3. You can apply these to pandas DataFrames to quickly run LLM-powered analytics:
import pandas as pd
from lambdaprompt import GPT3Prompt

df = pd.DataFrame({'country': ["France", "Japan", "USA"]})
df['capitals'] = df.apply(GPT3Prompt("basic_qa", """What is the capital of {{ country }}?"""), axis=1)
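
Note that mapping over an async prompt yields coroutines, so a common pattern is to await them concurrently with `asyncio.gather`. Here is a self-contained sketch using a hypothetical stand-in for an async prompt (no LLM calls):

```python
import asyncio

# Hypothetical stand-in for an async prompt such as basic_qa; a real prompt
# would await an LLM completion instead of returning a canned string.
async def fake_qa(question):
    return f"answer to: {question}"

async def main():
    questions = [
        "Is it safe to eat pizza with chopsticks?",
        "What is the capital of France?",
    ]
    # map() over an async callable yields coroutines;
    # gather awaits them all concurrently
    return await asyncio.gather(*map(fake_qa, questions))

results = asyncio.run(main())
print(results)
```

This keeps the functional style of `map` while running the underlying prompt calls in parallel rather than one at a time.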

Bonus

There is also a GPT3Edit class that can be used to edit text.

Advanced usage

Pre- and post-call hooks (for tracing and logging). These are currently always on, and calls are recorded to a SQLite database.

lambdaprompt.register(pre=print, post=print)
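
The general shape of the pre/post hook idea can be sketched in plain Python (hypothetical names; lambdaprompt's own hooks are wired up via `lambdaprompt.register` and logged to SQLite, as above):

```python
import functools

# Generic sketch of the pre/post hook pattern with hypothetical names;
# lambdaprompt registers its own hooks via lambdaprompt.register and
# records calls to a SQLite database.
def with_hooks(pre, post):
    def decorate(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            pre(fn.__name__, args, kwargs)   # runs before the call
            result = fn(*args, **kwargs)
            post(fn.__name__, result)        # runs after, with the result
            return result
        return wrapper
    return decorate

events = []

@with_hooks(pre=lambda name, a, k: events.append(("pre", name)),
            post=lambda name, r: events.append(("post", name)))
def greet(who):
    return f"hello {who}"

print(greet("world"))
print(events)
```

Because every prompt call flows through the same wrapper, this one mechanism covers tracing, logging, and caching without touching the prompts themselves.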

Lightweight web-server for calling prompts (useful for attaching to JS webapps)

uvicorn lambdaprompt.promptapi:app --reload

Design Patterns

Contributing

Contributions are welcome.
