A simple and lightweight library for creating LLM-powered applications.
Installation
pip install llmflow
What is LLMFlow?
A simple and lightweight library for creating LLM-powered applications.
Philosophy
Simple
We created LLMFlow with simplicity in mind while keeping it highly usable. It provides a minimal set of classes that lets users build powerful LLM-powered apps without compromising on capabilities.
Explicit
We allow users to easily create complex LLM flows, but we don't rely on implicit behavior to do it: dependencies between steps are declared explicitly in your code.
Transparent
We don't have classes with hidden prompts - prompts are for you to decide.
Example
Here is a basic example of using an LLM with a PromptTemplate:
from llmflow.llms.openai import OpenAI
from llmflow.prompts.prompt_template import PromptTemplate
prompt_template = PromptTemplate(
prompt="Generate a title for a 90s hip-hop song about {topic}."
)
llm_prompt = prompt_template.get_prompt(topic="friendship")
print(llm_prompt)
llm = OpenAI()
song_title = llm.generate(llm_prompt)
print(song_title)
While this shows how easy it is to use LLMFlow, real-world applications are more complex and have dependencies between prompts and LLM outputs. Let's take a look at such an example. With LLMFlow it's quite easy to reproduce such a flow by utilizing the Flow and FlowStep classes. LLMFlow figures out the dependencies for you and makes sure each flowstep is executed only after the flowsteps it depends on are complete:
from llmflow.flows.flow import Flow
from llmflow.flows.flowstep import FlowStep
from llmflow.llms.openai import OpenAI
from llmflow.prompts.prompt_template import PromptTemplate
# Create LLM
open_ai_llm = OpenAI()
# Create prompt templates
title_template = PromptTemplate("What is a good title of a movie about {topic}?")
song_template = PromptTemplate(
"What is a good song title of a soundtrack for a movie called {movie_title}?"
)
characters_template = PromptTemplate(
"What are two main characters for a movie called {movie_title}?"
)
lyrics_template = PromptTemplate(
"Write lyrics of a movie song called {song_title}. The main characters are"
" {main_characters}"
)
# Create flowsteps
flowstep1 = FlowStep(
name="Flowstep 1",
llm=open_ai_llm,
prompt_template=title_template,
output_key="movie_title",
)
flowstep2 = FlowStep(
name="Flowstep 2",
llm=open_ai_llm,
prompt_template=song_template,
output_key="song_title",
)
flowstep3 = FlowStep(
name="Flowstep 3",
llm=open_ai_llm,
prompt_template=characters_template,
output_key="main_characters",
)
flowstep4 = FlowStep(
name="Flowstep 4",
llm=open_ai_llm,
prompt_template=lyrics_template,
output_key="song_lyrics",
)
# Connect flowsteps
flowstep1.connect(flowstep2, flowstep3, flowstep4)
flowstep2.connect(flowstep4)
flowstep3.connect(flowstep4)
# Create and run Flow
soundtrack_flow = Flow(flowstep1)
soundtrack_flow.execute(topic="friendship")
In fact, LLMFlow provides async and threaded classes, so any complex DAG can be executed in parallel. For more examples, such as how to create Q&A apps and web applications with Flask and FastAPI, check our documentation.
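For illustration, here is a minimal sketch of what parallel execution could look like. The async class names (AsyncOpenAI, AsyncFlowStep, AsyncFlow) and their import paths below are assumptions that simply mirror the synchronous API from the example above, not confirmed names; check the documentation for the exact classes:
import asyncio
# Hypothetical async counterparts of the classes above; real names and paths may differ.
from llmflow.flows.async_flow import AsyncFlow
from llmflow.flows.async_flowstep import AsyncFlowStep
from llmflow.llms.openai import AsyncOpenAI
from llmflow.prompts.prompt_template import PromptTemplate
async_llm = AsyncOpenAI()
title_step = AsyncFlowStep(
    name="Title",
    llm=async_llm,
    prompt_template=PromptTemplate("What is a good title of a movie about {topic}?"),
    output_key="movie_title",
)
song_step = AsyncFlowStep(
    name="Song",
    llm=async_llm,
    prompt_template=PromptTemplate(
        "What is a good song title of a soundtrack for a movie called {movie_title}?"
    ),
    output_key="song_title",
)
characters_step = AsyncFlowStep(
    name="Characters",
    llm=async_llm,
    prompt_template=PromptTemplate(
        "What are two main characters for a movie called {movie_title}?"
    ),
    output_key="main_characters",
)
# Both steps depend only on the title, so an async flow can run them concurrently
# as soon as the title is available.
title_step.connect(song_step, characters_step)
async def main():
    flow = AsyncFlow(title_step)
    await flow.execute(topic="friendship")
asyncio.run(main())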
Usage patterns
- Use LLMs-only: use the LLM classes and prompt templates directly, as in the first example above.
- Create Flows: use the Flow and FlowStep classes to build flows with dependencies between prompts and LLM outputs, as in the soundtrack example.
- Build agents: use the LLM and Agents classes and tools to build free-roaming agents.
- Build web apps: execute agents inside flowsteps where you can build conversational experiences with different stages and requirements (see the FastAPI sketch after this list).
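As a taste of the web app pattern, here is a minimal sketch of serving a flow with FastAPI. The endpoint path is made up for illustration, and since the shape of the value returned by flow.execute() is not documented here, the response simply passes it through; see our documentation for complete Flask and FastAPI examples:
from fastapi import FastAPI
from llmflow.flows.flow import Flow
from llmflow.flows.flowstep import FlowStep
from llmflow.llms.openai import OpenAI
from llmflow.prompts.prompt_template import PromptTemplate
app = FastAPI()
# A single-step flow that turns a topic into a movie title, as in the examples above.
title_step = FlowStep(
    name="Title",
    llm=OpenAI(),
    prompt_template=PromptTemplate("What is a good title of a movie about {topic}?"),
    output_key="movie_title",
)
title_flow = Flow(title_step)
@app.get("/movie-title")  # hypothetical endpoint, not part of LLMFlow
def movie_title(topic: str):
    # Run the flow for the requested topic and return whatever it produces.
    result = title_flow.execute(topic=topic)
    return {"result": result}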
FAQ
How is this different from LangChain?
LangChain is an amazing library, and LLMFlow has certainly been inspired by it. However, our philosophy is a bit different. LangChain has a "chain for everything" philosophy, and while this is great for beginners and people who are not in the field of machine learning and language models, it can be a bit overwhelming: there are a lot of classes, hidden prompts in chains, and many ways to do the same thing. In contrast, we focus on providing as few building blocks as possible and an easy-to-understand API, while matching (and in some cases exceeding) the capabilities of LangChain.
You only have OpenAI wrappers, but I want to use ACMELLM
We decided to release the library initially supporting only OpenAI LLMs, but we have a roadmap and will slowly add wrappers around the most popular models. If you are willing to spend some time on this, we are looking for contributors and maintainers.
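Until then, a possible stopgap is a thin wrapper of your own that mimics the generate(prompt) interface used by the OpenAI wrapper in the examples above, so it can be passed to flowsteps via the llm argument. Everything below (the AcmeLLMWrapper class, the acme_sdk package, its Client and complete method) is a hypothetical placeholder, and whether LLMFlow requires a specific base class for custom wrappers is not covered here:
class AcmeLLMWrapper:
    """Hypothetical wrapper around a third-party 'ACME' model SDK."""
    def __init__(self, model_name: str = "acme-large"):
        import acme_sdk  # placeholder SDK, not a real package
        self.client = acme_sdk.Client(model=model_name)
    def generate(self, prompt: str) -> str:
        # Mirror llm.generate(llm_prompt) as used in the examples above,
        # returning plain text so it can feed the next flowstep.
        return self.client.complete(prompt)
# It could then be used anywhere the examples pass OpenAI(), e.g.:
# FlowStep(name="Flowstep 1", llm=AcmeLLMWrapper(), prompt_template=title_template, output_key="movie_title")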
You only support Pinecone and Redis vector DBs. Do you have plans to extend the list?
Yes! We will add Elasticsearch and other popular solutions over time. If you want to help us out, check out our contribution section.
Why can't I find any info related to document loaders?
For the time being, we have decided not to implement document loaders for two reasons:
- There are plenty of highly capable libraries like LlamaIndex and LangChain that have tons of loaders.
- We think it is awkward to mix document loaders with LLM flow and prompt management libraries, since document loading usually happens in separate pipelines and is not part of the LLM-powered app.
While we are not going to invest time into document loaders, we might change direction if we see significant interest and contributions.
What about agents?
We believe agents are the future of LLM-powered apps, and we have a few basic examples in the repo. However, we are working on an agent-focused library built on top of llmflow.
Contributing