
Project description


GenAItor

A platform for AI Agents and AI Agents products generation.

Overview

GenAItor is a cutting-edge platform designed to generate AI agents and related products that help automate complex tasks and processes. It leverages state-of-the-art machine learning libraries and tools to deliver flexible and scalable AI solutions.

Installation

To install the required dependencies, follow these steps:

  1. Clone the repository:

    git clone https://github.com/enterpriselm/genaitor.git
    cd genaitor
    
  2. Create a virtual environment (optional but recommended):

    python -m venv venv
    source venv/bin/activate  # On Windows use `venv\Scripts\activate`
    
  3. Important:
    This project is best run with Python 3.12 to ensure compatibility and avoid potential errors.

  4. Install the required packages:

    pip install -e .
    
  5. Add an API_KEY for the LLM provider (Gemini is the default, but you can use Anthropic, OpenAI, DeepSeek, Grok, Ollama, or a custom LLM model):

    echo "API_KEY=your_gemini_api_key" >> .env
    
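The key written to `.env` must be loaded into the environment at runtime. The snippet below is a minimal, hand-rolled sketch of reading a `.env` file in the simple `KEY=value` format used above; in practice a library such as python-dotenv handles quoting, comments, and other edge cases more robustly.

```python
import os

def load_env(path=".env"):
    """Minimal .env reader: parses KEY=value lines and merges them into os.environ.

    Sketch only -- python-dotenv does this properly in real projects.
    """
    parsed = {}
    try:
        with open(path) as f:
            for line in f:
                line = line.strip()
                # Skip blanks and comments; keep only KEY=value lines
                if line and not line.startswith("#") and "=" in line:
                    key, _, value = line.partition("=")
                    parsed[key.strip()] = value.strip()
    except FileNotFoundError:
        pass  # no .env present; rely on the existing environment
    os.environ.update(parsed)
    return parsed
```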

General Framework Architecture

General Diagram

Features

  • Generate AI agents for a variety of use cases.
  • Modular architecture with components such as core, llm, utils, and presets.
  • Support for multiple data processing and communication protocols.
  • Integration with popular libraries like Transformers, Langchain, and more.

Usage

Basic Example

Here’s a simple example of how to create an agent that answers questions using a generative model:

from genaitor.core import Agent, Task
from genaitor.llm import GeminiProvider, GeminiConfig

# Define a custom task
class QuestionAnsweringTask(Task):
    def __init__(self, description: str, goal: str, output_format: str, llm_provider):
        super().__init__(description, goal, output_format)
        self.llm = llm_provider

    def execute(self, input_data: str):
        prompt = f"""
Task: {self.description}
Goal: {self.goal}
Question: {input_data}
Please provide a response following the format:
{self.output_format}
"""
        return self.llm.generate(prompt)

# Configure the LLM provider
llm_provider = GeminiProvider(GeminiConfig(api_key="your_api_key"))

# Create an agent
agent = Agent(name="QA Agent", task=QuestionAnsweringTask("Answering questions", "Provide accurate answers", "Text format", llm_provider))

# Execute a task
result = agent.task.execute("What is AI?")
print(result)
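Since the installation notes mention swapping in other providers or a custom LLM, the only contract the example above relies on appears to be a `generate(prompt)` method returning a string. A hypothetical stand-in provider along those lines (not part of the genaitor API) is useful for exercising tasks offline, without an API key:

```python
# Hypothetical custom provider for offline testing of tasks and agents.
# The example above only calls llm_provider.generate(prompt), so any object
# exposing that method should slot in.
class EchoProvider:
    def __init__(self, prefix="[echo]"):
        self.prefix = prefix

    def generate(self, prompt: str) -> str:
        # A real provider would call its model's API here
        return f"{self.prefix} {prompt.strip()[:60]}"
```

Passing `EchoProvider()` wherever `llm_provider` is expected lets you test the task flow end to end before wiring up a real model.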

Multi-Agent Example

Here’s a simple example of how to create a flow using multiple agents:

import asyncio
from genaitor.core import (
    Agent, Task, Orchestrator, Flow,
    ExecutionMode, AgentRole, TaskResult
)
from genaitor.llm import GeminiProvider, GeminiConfig

# Define a base task (you could use different tasks for each agent)
class LLMTask(Task):
    def __init__(self, description: str, goal: str, output_format: str, llm_provider):
        super().__init__(description, goal, output_format)
        self.llm = llm_provider

    def execute(self, input_data: str) -> TaskResult:
        prompt = f"""
Task: {self.description}
Goal: {self.goal}

Input: {input_data}

Please provide a response following the format:
{self.output_format}
"""
        try:
            response = self.llm.generate(prompt)
            return TaskResult(
                success=True,
                content=response,
                metadata={"task_type": self.description}
            )
        except Exception as e:
            return TaskResult(
                success=False,
                content=None,
                error=str(e)
            )

# Configure the LLM provider
llm_provider = GeminiProvider(GeminiConfig(api_key="your_api_key"))

# Generating two specific tasks
qa_task = LLMTask(
    description="Question Answering",
    goal="Provide clear and accurate responses",
    output_format="Concise and informative",
    llm_provider=llm_provider
)

summarization_task = LLMTask(
    description="Text Summarization",
    goal="Summarize lengthy content into key points",
    output_format="Bullet points or short paragraph",
    llm_provider=llm_provider
)

# Create agents
qa_agent = Agent(
    role=AgentRole.SPECIALIST,
    tasks=[qa_task],
    llm_provider=llm_provider
)
summarization_agent = Agent(
    role=AgentRole.SUMMARIZER,
    tasks=[summarization_task],
    llm_provider=llm_provider
)

orchestrator = Orchestrator(
    agents={"qa_agent": qa_agent, "summarization_agent": summarization_agent},
    flows={"default_flow": Flow(agents=["qa_agent", "summarization_agent"], context_pass=[True, True])},
    mode=ExecutionMode.SEQUENTIAL
)

result = asyncio.run(
    orchestrator.process_request("What is the impact of AI on modern healthcare?", flow_name="default_flow")
)
print(result)
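Conceptually, a sequential flow hands each agent's output forward whenever the corresponding `context_pass` flag is true. The toy model below illustrates that behavior only; it is a hypothetical simplification, not the genaitor Orchestrator implementation.

```python
def run_sequential_flow(agents, request, context_pass):
    """Toy model of a sequential flow: each agent is a plain callable; when its
    context_pass flag is True, its output becomes the next agent's input."""
    context = request
    outputs = []
    for agent, passes in zip(agents, context_pass):
        result = agent(context)
        outputs.append(result)
        if passes:
            context = result  # forward this agent's output downstream
    return outputs
```

With `context_pass=[True, True]`, as in the example above, the summarization agent would receive the QA agent's answer rather than the raw request.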


FAQ

Why should I use this framework over others like LangChain, LangGraph, CrewAI or LlamaIndex?

While popular frameworks like LangChain, LangGraph, and LlamaIndex are powerful, they are primarily designed as general-purpose agentic frameworks. Our framework is specifically optimized for Scientific Machine Learning (SciML) applications and offers the following key advantages:

  • Specific focus on Scientific Machine Learning:

Unlike generalist frameworks, we prioritize workflows tailored for scientific and physics-based AI tasks, where agent behavior often requires structured reasoning and domain-specific knowledge handling.

  • Greater control and transparency: Our design provides developers with direct access to agent modeling and lifecycle management. You are not tied to predefined abstractions or "black-box" architectures, allowing full customization to match scientific workflows.

  • Reduced learning curve: Our framework minimizes unnecessary complexity. Users can build efficient agents with a much simpler and more intuitive interface, without needing to dive deep into multiple layers of abstractions before achieving results.

Is this framework compatible with LangChain or LlamaIndex?

Our framework is independent but compatible with most libraries from the ecosystem. You can integrate components like LlamaIndex for document retrieval or LangChain tools if needed, while still maintaining full control over the agent lifecycle inside our framework.

What kind of Scientific Machine Learning tasks is this framework suited for?

This framework is designed for tasks such as:

  • Physics-informed problem solving

  • Scientific reasoning and simulation control

  • AI-driven research assistants for scientific domains

  • Autonomous agents for data-driven discovery processes

  • Interaction with physical simulation APIs, datasets, and analytical tools

If your use case involves structured reasoning, scientific models, or physics-based tasks, this framework provides the flexibility and precision you need.
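For instance, a physics-informed task can embed the governing equation and boundary conditions directly in the prompt, following the `LLMTask` pattern from the Usage section. The helper below is purely illustrative (a hypothetical function, not part of the genaitor API):

```python
def build_sciml_prompt(equation: str, boundary_conditions: str, question: str) -> str:
    """Illustrative prompt builder for a physics-informed task.

    Hypothetical helper: shows how domain structure (equation + boundary
    conditions) can be carried into an agent's prompt.
    """
    return (
        "Task: Physics-informed problem solving\n"
        f"Governing equation: {equation}\n"
        f"Boundary conditions: {boundary_conditions}\n"
        f"Question: {question}\n"
        "Please show structured reasoning before the final answer."
    )
```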

Contribution Guidelines

We welcome contributions! To contribute:

  1. Fork the repository.
  2. Create a new branch (git checkout -b feature-name).
  3. Make your changes and commit (git commit -m 'Add new feature').
  4. Push to the branch (git push origin feature-name).
  5. Create a pull request.

License

This project is licensed under the MIT License.

Contact

For questions or suggestions, feel free to open an issue, contact the maintainers at enterpriselearningmachines@gmail.com, or reach the main author, Yan Barros, at https://www.linkedin.com/in/yan-barros-yan

You can also check our landing page for more news:

enterpriselm.github.io/home

Project details


Download files

Download the file for your platform.

Source Distribution

genaitor-1.0.1.tar.gz (5.5 kB)

Uploaded Source

Built Distribution


genaitor-1.0.1-py3-none-any.whl (4.8 kB)

Uploaded Python 3

File details

Details for the file genaitor-1.0.1.tar.gz.

File metadata

  • File name: genaitor-1.0.1.tar.gz
  • Upload date:
  • Size: 5.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.0

File hashes

Hashes for genaitor-1.0.1.tar.gz:

  • SHA256: 2765cbe7423d47b8025898831125992b92dee5518402c94ebc576423c09a898d
  • MD5: 55b59f4fba772f6547cb5b25a4e01772
  • BLAKE2b-256: 0fcb1e42ccef2c6efec6a6fc61f38c02d60e2af2cf93c7cff22fd1ecb5d05779


File details

Details for the file genaitor-1.0.1-py3-none-any.whl.

File metadata

  • File name: genaitor-1.0.1-py3-none-any.whl
  • Upload date:
  • Size: 4.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.0

File hashes

Hashes for genaitor-1.0.1-py3-none-any.whl:

  • SHA256: f64e79bf80c8228f13b0126530dc7f573c67b263f7ea603d8a11ec942fa23f3b
  • MD5: 323c5f773d8cc09c4c645a2677e8958a
  • BLAKE2b-256: 3d38581f7a0bb8020eedcd2d94eaa787d5faba57ec413af860da8951d0eedfaa

