
An agent-based orchestration framework for generative AI workflows


Genaitor Logo

GenAItor

A platform for generating AI agents and AI-agent products.

Overview

GenAItor is a cutting-edge platform designed to generate AI agents and related products that help automate complex tasks and processes. It leverages state-of-the-art machine learning libraries and tools to deliver flexible and scalable AI solutions.

Installation

To install the required dependencies, follow these steps:

  1. Clone the repository:

    git clone https://github.com/enterpriselm/genaitor.git
    cd genaitor
    
  2. Create a virtual environment (optional but recommended):

    python -m venv venv
    source venv/bin/activate  # On Windows use `venv\Scripts\activate`
    
  3. Important:
    This project is best run with Python 3.12 to ensure compatibility and avoid potential errors.

  4. Install the required packages:

    pip install -e .
    
  5. Add an API_KEY for the LLM (Gemini is the default provider, but you can also use Anthropic, OpenAI, DeepSeek, Grok, Ollama, or a custom LLM):

    echo "API_KEY=your_gemini_api_key" >> .env
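At runtime the framework reads this key from the `.env` file. As a rough illustration of what that involves (the framework may well use a library such as python-dotenv internally; `load_env` below is a hypothetical stdlib-only helper, shown with a temporary file so the sketch is self-contained):

```python
import os
import tempfile

def load_env(path: str) -> dict:
    """Minimal .env parser: KEY=value lines, skipping blanks and comments."""
    env = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip().strip('"')
    return env

# Demonstrate with a throwaway .env file
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write("API_KEY=your_gemini_api_key\n")
    path = f.name

env = load_env(path)
os.remove(path)
os.environ.update(env)
print(env["API_KEY"])  # your_gemini_api_key
```

In a real project you would keep `.env` out of version control and read the key with `os.environ["API_KEY"]` when constructing the provider config.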
    

General Framework Architecture

General Diagram

Features

  • Generate AI agents for a variety of use cases.
  • Modular architecture with components such as core, llm, utils, and presets.
  • Support for multiple data processing and communication protocols.
  • Integration with popular libraries like Transformers, Langchain, and more.

Usage

Basic Example

Here’s a simple example of how to create an agent that answers questions using a generative model:

from genaitor.genaitor.core import Agent, Task
from genaitor.genaitor.llm import GeminiProvider, GeminiConfig

# Define a custom task
class QuestionAnsweringTask(Task):
    def __init__(self, description: str, goal: str, output_format: str, llm_provider):
        super().__init__(description, goal, output_format)
        self.llm = llm_provider

    def execute(self, input_data: str):
        prompt = f"""
Task: {self.description}
Goal: {self.goal}
Question: {input_data}
Please provide a response following the format:
{self.output_format}
"""
        return self.llm.generate(prompt)

# Configure the LLM provider
llm_provider = GeminiProvider(GeminiConfig(api_key="your_api_key"))

# Create an agent
agent = Agent(name="QA Agent", task=QuestionAnsweringTask("Answering questions", "Provide accurate answers", "Text format", llm_provider))

# Execute a task
result = agent.task.execute("What is AI?")
print(result)

Multi-Agent Example

Here’s a simple example of how to create a flow using multiple agents:

import asyncio
from genaitor.genaitor.core import (
    Agent, Task, Orchestrator, Flow,
    ExecutionMode, AgentRole, TaskResult
)
from genaitor.genaitor.llm import GeminiProvider, GeminiConfig

# Define a base task (you could use different tasks for each agent)
class LLMTask(Task):
    def __init__(self, description: str, goal: str, output_format: str, llm_provider):
        super().__init__(description, goal, output_format)
        self.llm = llm_provider

    def execute(self, input_data: str) -> TaskResult:
        prompt = f"""
Task: {self.description}
Goal: {self.goal}

Input: {input_data}

Please provide a response following the format:
{self.output_format}
"""
        try:
            response = self.llm.generate(prompt)
            return TaskResult(
                success=True,
                content=response,
                metadata={"task_type": self.description}
            )
        except Exception as e:
            return TaskResult(
                success=False,
                content=None,
                error=str(e)
            )

# Configure the LLM provider
llm_provider = GeminiProvider(GeminiConfig(api_key="your_api_key"))

# Generating two specific tasks
qa_task = LLMTask(
    description="Question Answering",
    goal="Provide clear and accurate responses",
    output_format="Concise and informative",
    llm_provider=llm_provider
)

summarization_task = LLMTask(
    description="Text Summarization",
    goal="Summarize lengthy content into key points",
    output_format="Bullet points or short paragraph",
    llm_provider=llm_provider
)

# Create agents
qa_agent = Agent(
    role=AgentRole.SPECIALIST,
    tasks=[qa_task],
    llm_provider=llm_provider
)
summarization_agent = Agent(
    role=AgentRole.SUMMARIZER,
    tasks=[summarization_task],
    llm_provider=llm_provider
)

orchestrator = Orchestrator(
    agents={"qa_agent": qa_agent, "summarization_agent": summarization_agent},
    flows={"default_flow": Flow(agents=["qa_agent", "summarization_agent"], context_pass=[True, True])},
    mode=ExecutionMode.SEQUENTIAL
)

result_process = orchestrator.process_request('What is the impact of AI on modern healthcare?', flow_name='default_flow')
result = asyncio.run(result_process)
print(result)

Example usage

Here is a quick guide to running the examples:

Streamlit APPs

streamlit run genaitor/apps/pinneaple.py

General examples

python genaitor/examples/autism_assistant.py

Demo Videos

Here are some demo videos showcasing Genaitor in action:

FAQ

Why should I use this framework over others like LangChain, LangGraph, CrewAI, or LlamaIndex?

While popular frameworks like LangChain, LangGraph, and LlamaIndex are powerful, they are primarily designed as general-purpose agentic frameworks. Our framework is specifically optimized for Scientific Machine Learning (SciML) applications and offers the following key advantages:

  • Specific focus on Scientific Machine Learning:

Unlike generalist frameworks, we prioritize workflows tailored for scientific and physics-based AI tasks, where agent behavior often requires structured reasoning and domain-specific knowledge handling.

  • Greater control and transparency: Our design provides developers with direct access to agent modeling and lifecycle management. You are not tied to predefined abstractions or "black-box" architectures, allowing full customization to match scientific workflows.

  • Reduced learning curve: Our framework minimizes unnecessary complexity. Users can build efficient agents with a much simpler and more intuitive interface, without needing to dive deep into multiple layers of abstractions before achieving results.

Is this framework compatible with LangChain or LlamaIndex?

Our framework is independent but compatible with most libraries from the ecosystem. You can integrate components like LlamaIndex for document retrieval or LangChain tools if needed, while still maintaining full control over the agent lifecycle inside our framework.
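Because genaitor tasks expose a plain `execute` method returning a `TaskResult`, an external component can be wrapped as a task with a thin adapter. A sketch of that pattern, under stated assumptions: `ExternalToolTask` is a hypothetical wrapper, the `TaskResult` dataclass here is a stub standing in for the real genaitor class, and the lambda takes the place of an actual LangChain tool or LlamaIndex query engine:

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class TaskResult:
    """Stub of genaitor's TaskResult, so this sketch is self-contained."""
    success: bool
    content: Optional[str]
    error: Optional[str] = None

class ExternalToolTask:
    """Adapts any callable (e.g. a LangChain tool's run method or a
    LlamaIndex query engine's query method) to the task interface."""
    def __init__(self, tool: Callable[[str], str]):
        self.tool = tool

    def execute(self, input_data: str) -> TaskResult:
        try:
            return TaskResult(success=True, content=self.tool(input_data))
        except Exception as e:
            return TaskResult(success=False, content=None, error=str(e))

# Any callable works as the "tool"; a trivial echo stands in here.
task = ExternalToolTask(lambda q: f"retrieved context for: {q}")
result = task.execute("heat equation boundary conditions")
print(result.content)
```

The agent lifecycle stays inside genaitor; only the retrieval or tool call is delegated to the external library.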

What kind of Scientific Machine Learning tasks is this framework suited for?

This framework is designed for tasks such as:

  • Physics-informed problem solving

  • Scientific reasoning and simulation control

  • AI-driven research assistants for scientific domains

  • Autonomous agents for data-driven discovery processes

  • Interaction with physical simulation APIs, datasets, and analytical tools

If your use case involves structured reasoning, scientific models, or physics-based tasks, this framework provides the flexibility and precision you need.
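To make "structured reasoning" concrete: a SciML agent can verify an LLM-proposed solution against the governing equation before accepting it. The toy check below is plain Python, independent of the framework; `ode_residual` is a hypothetical helper that evaluates the finite-difference residual of u'' + pi^2 u = 0:

```python
import math

def ode_residual(u, x, h=1e-4):
    """Central-difference residual of u'' + pi^2 * u = 0 at point x —
    the kind of physics-based check an agent can run on a candidate
    solution before passing it downstream."""
    u_xx = (u(x + h) - 2 * u(x) + u(x - h)) / h**2
    return u_xx + math.pi**2 * u(x)

candidate = lambda x: math.sin(math.pi * x)  # exact solution of the ODE
wrong = lambda x: x**2                       # does not satisfy it

print(abs(ode_residual(candidate, 0.3)))  # close to 0
print(abs(ode_residual(wrong, 0.3)))      # far from 0
```

Embedding such checks in a task's `execute` method is what distinguishes a physics-aware agent from one that merely forwards LLM text.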

Contribution Guidelines

We welcome contributions! To contribute:

  1. Fork the repository.
  2. Create a new branch (git checkout -b feature-name).
  3. Make your changes and commit (git commit -m 'Add new feature').
  4. Push to the branch (git push origin feature-name).
  5. Create a pull request.

License

This project is licensed under the MIT License.

Contact

For any questions or suggestions, feel free to open an issue or contact the maintainers at executive.enterpriselm@gmail.com or the main author Yan Barros at https://www.linkedin.com/in/yan-barros-yan

You can also check our landing page for more news:

enterpriselm.github.io/home
