
An easy way to create structured AI agents


Rootflo

Composable AI Agentic Workflow

Rootflo is an alternative to LangGraph and CrewAI. It lets you easily build composable agentic workflows of any size from simple components, unlocking the full potential of LLMs.



Check out the docs »

Github · Website · Roadmap


Flo AI 🌊

Build production-ready AI agents and teams with minimal code

Flo AI is a Python framework that makes building production-ready AI agents and teams as easy as writing YAML. Think "Kubernetes for AI Agents" - compose complex AI architectures using pre-built components while maintaining the flexibility to create your own.

✨ Features

  • 🔌 Truly Composable: Build complex AI systems by combining smaller, reusable components
  • 🏗️ Production-Ready: Built-in best practices and optimizations for production deployments
  • 📝 YAML-First: Define your entire agent architecture in simple YAML
  • 🔧 Flexible: Use pre-built components or create your own
  • 🤝 Team-Oriented: Create and manage teams of AI agents working together
  • 📚 RAG Support: Built-in support for Retrieval-Augmented Generation
  • 🔄 Langchain Compatible: Works with all your favorite Langchain tools

🚀 Quick Start

FloAI follows an agent-team architecture: agents are the basic building blocks, teams can contain multiple agents, and teams themselves can be part of bigger teams.
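The hierarchy described above (agents nested inside teams, teams nested inside bigger teams) can be pictured with a plain-Python sketch. This is only an illustration of the composition model, not Flo AI's internal representation:

```python
from dataclasses import dataclass, field
from typing import List, Union

@dataclass
class Agent:
    name: str

@dataclass
class Team:
    name: str
    members: List[Union[Agent, "Team"]] = field(default_factory=list)

    def all_agents(self) -> List[str]:
        # Recursively collect agent names from this team and any nested teams
        names: List[str] = []
        for member in self.members:
            if isinstance(member, Agent):
                names.append(member.name)
            else:
                names.extend(member.all_agents())
        return names

# A team of teams: a research team nested inside a marketing org
research = Team("Research", [Agent("Researcher"), Agent("Analyst")])
org = Team("Marketing", [research, Agent("Writer")])
print(org.all_agents())  # ['Researcher', 'Analyst', 'Writer']
```

The same recursive shape is what lets a `FloRoutedTeam` route work down through arbitrarily nested teams.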

Building a working agent or team involves 3 steps:

  1. Create a session using FloSession, and register your tools and models
  2. Define your agent/team (or team of teams) using YAML or code
  3. Build and run using Flo

Installation

pip install flo-ai
# or using poetry
poetry add flo-ai

Create Your First AI Agent in 30 Seconds

from flo_ai import Flo, FloSession
from langchain_openai import ChatOpenAI
from langchain_community.tools.tavily_search.tool import TavilySearchResults

# init your LLM
llm = ChatOpenAI(temperature=0)

# create a session and register your tools
session = FloSession(llm).register_tool(name="TavilySearchResults", tool=TavilySearchResults())

# define your agent yaml
simple_weather_checking_agent = """
apiVersion: flo/alpha-v1
kind: FloAgent
name: weather-assistant
agent:
    name: WeatherAssistant
    job: >
      Given a city name, you can answer what the weather is like at this time of year by searching the internet
    tools:
      - name: TavilySearchResults
"""
flo = Flo.build(session, yaml=simple_weather_checking_agent)

# Start streaming results
for response in flo.stream("What's the weather in New Delhi, India?"):
    print(response)

Let's create the same agent using code

from flo_ai import FloAgent

session = FloSession(llm)

weather_agent = FloAgent.create(
    session=session,
    name="WeatherAssistant",
    job="Given a city name, you can answer what the weather is like at this time of year by searching the internet",
    tools=[TavilySearchResults()]
)

agent_flo: Flo = Flo.create(session, weather_agent)
result = agent_flo.invoke("What's the weather in New Delhi, India?")

Create Your First AI Team in 30 Seconds

from flo_ai import Flo, FloSession
from langchain_openai import ChatOpenAI
from langchain_community.tools.tavily_search.tool import TavilySearchResults


# Define your team in YAML
yaml_config = """
apiVersion: flo/alpha-v1
kind: FloRoutedTeam
name: research-team
team:
    name: ResearchTeam
    router:
        name: TeamLead
        kind: supervisor
    agents:
      - name: Researcher
        role: Research Specialist
        job: Research latest information on given topics
        tools:
          - name: TavilySearchResults
      - name: Writer
        role: Content Creator
        job: Create engaging content from research
"""

# Set up and run
llm = ChatOpenAI(temperature=0)
session = FloSession(llm).register_tool(name="TavilySearchResults", tool=TavilySearchResults())
flo = Flo.build(session, yaml=yaml_config)

# Start streaming results
for response in flo.stream("Write about recent AI developments"):
    print(response)

Note: You can assign a different model to each of the above agents, including the router, giving you the flexibility to combine the power of different LLMs. To learn more, see multi-model integration in the detailed documentation.
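The idea behind per-agent models can be sketched in plain Python: route routing/supervision to a cheap, fast model and reserve stronger models for the heavy lifting. The model names and the mapping below are hypothetical, not Flo AI's configuration syntax:

```python
# Hypothetical illustration of per-agent model assignment (not Flo AI's API):
# the router runs on a cheap model, specialists on stronger ones.
agent_models = {
    "TeamLead": "gpt-4o-mini",       # router: cheap and fast
    "Researcher": "gpt-4o",          # needs tool use and reasoning
    "Writer": "claude-3-5-sonnet",   # strong long-form writing
}

def model_for(agent_name: str, default: str = "gpt-4o-mini") -> str:
    # Fall back to a default model for agents without an explicit assignment
    return agent_models.get(agent_name, default)
```

Consult the multi-model integration docs for the actual syntax Flo AI uses to express this mapping.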

Let's create an AI team using code

from flo_ai import Flo, FloSupervisor, FloAgent, FloSession, FloTeam, FloLinear
from langchain_openai import ChatOpenAI
from langchain_community.tools.tavily_search.tool import TavilySearchResults

llm = ChatOpenAI(temperature=0, model_name='gpt-4o')
session = FloSession(llm).register_tool(
    name="TavilySearchResults",
    tool=TavilySearchResults()
)

researcher = FloAgent.create(
    session,
    name="Researcher", 
    role="Internet Researcher", # optional
    job="Do research on the internet and find articles relevant to the topic asked by the user", 
    tools=[TavilySearchResults()]
)

blogger = FloAgent.create(
    session, 
    name="BlogWriter", 
    role="Thought Leader", # optional
    job="Able to write a blog using the information provided", 
    tools=[TavilySearchResults()]
)

marketing_team = FloTeam.create(session, "Marketing", [researcher, blogger])
head_of_marketing = FloSupervisor.create(session, "Head-of-Marketing", marketing_team)
marketing_flo = Flo.create(session, routed_team=head_of_marketing)

Tools

FloAI supports all the tools built and available in the langchain_community package. To learn more about these tools, go here.

Along with that, FloAI provides the @flotool decorator, which turns any function into a tool.

Creating a simple tool using @flotool:

import asyncio
from typing import List

from flo_ai import flotool
from pydantic import BaseModel, Field

# define argument schema
class AdditionToolInput(BaseModel):
    numbers: List[int] = Field(..., description='List of numbers to add')

@flotool(name='AdditionTool', description='Tool to add numbers')
async def addition_tool(numbers: List[int]) -> str:
    result = sum(numbers)
    await asyncio.sleep(1)
    return f'The sum is {result}'

# tools can be async; when using an async tool, invoke the flo asynchronously
@flotool(
    name='MultiplicationTool',
    description='Tool to multiply numbers to get product of numbers',
)
async def mul_tool(numbers: List[int]) -> str:
    result = 1
    for n in numbers:  # compute the product, not the sum
        result *= n
    await asyncio.sleep(1)
    return f'The product is {result}'

# register your tool or use directly in code impl
session.register_tool(name='Adder', tool=addition_tool)

Note: @flotool comes with built-in error handling and will retry if an exception is thrown. Pass unsafe=True to disable this error handling.
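A minimal sketch of what retry-on-exception behaviour like this could look like under the hood. This is an illustration only, not flo_ai's actual implementation, and the `retries` parameter is hypothetical:

```python
import functools

def with_retries(retries: int = 3, unsafe: bool = False):
    """Retry a tool function on exception; unsafe=True disables the safety net."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            if unsafe:
                return fn(*args, **kwargs)  # let exceptions propagate
            last_err = None
            for _ in range(retries):
                try:
                    return fn(*args, **kwargs)
                except Exception as err:
                    last_err = err
            # After exhausting retries, surface the failure as a tool result
            return f"Tool failed after {retries} attempts: {last_err}"
        return wrapper
    return decorator

calls = {"n": 0}

@with_retries(retries=3)
def flaky_tool(x: int) -> str:
    calls["n"] += 1
    if calls["n"] < 2:
        raise RuntimeError("transient failure")
    return f"result: {x}"

print(flaky_tool(7))  # succeeds on the second attempt -> "result: 7"
```

With `unsafe=True` the wrapper steps aside entirely, matching the note above: exceptions propagate to the caller instead of being retried.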

📖 Documentation

Visit our comprehensive documentation for:

  • Detailed tutorials
  • Architecture deep-dives
  • API reference
    • Logging
    • Error handling
    • Observers
    • Dynamic model switching
  • Best practices
  • Advanced examples

🌟 Why Flo AI?

For AI Engineers

  • Faster Development: Build complex AI systems in minutes, not days
  • Production Focus: Built-in optimizations and best practices
  • Flexibility: Use our components or build your own

For Teams

  • Maintainable: YAML-first approach makes systems easy to understand and modify
  • Scalable: From single agents to complex team hierarchies
  • Testable: Each component can be tested independently

🎯 Use Cases

  • 🤖 Customer Service Automation
  • 📊 Data Analysis Pipelines
  • 📝 Content Generation
  • 🔍 Research Automation
  • 🎯 Task-Specific AI Teams

🤝 Contributing

We love your input! Check out our Contributing Guide to get started. Ways to contribute:

  • 🐛 Report bugs
  • 💡 Propose new features
  • 📝 Improve documentation
  • 🔧 Submit PRs

📜 License

Flo AI is MIT Licensed.



Built with ❤️ by the rootflo team
