Agent Dingo

A microframework for building LLM-powered pipelines and agents.

Dingo is a compact LLM orchestration framework designed for the straightforward development of production-ready LLM-powered applications. It combines simplicity with flexibility, enabling efficient construction of pipelines and agents while retaining a high level of control over the process.

Support us 🤝

You can support the project in the following ways:

  • ⭐ Star Dingo on GitHub (click the star button in the top right corner)
  • 💡 Provide your feedback or propose ideas in the issues section or Discord
  • 📰 Post about Dingo on LinkedIn or other platforms
  • 🔗 Check out our other projects: Scikit-LLM, Falcon


Quick Start & Documentation 🚀

Step 1: Install agent-dingo

pip install agent-dingo

Step 2: Configure your OpenAI API key

export OPENAI_API_KEY=<YOUR_KEY>
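
If exporting a shell variable is inconvenient, the same variable can also be set from Python before the model object is created. This is a minimal sketch using only the standard library; it assumes, as Step 2 implies, that the library reads the key from the OPENAI_API_KEY environment variable:

import os

# Set the key programmatically; assumes agent-dingo reads OPENAI_API_KEY from the
# environment, as the export in Step 2 implies. "<YOUR_KEY>" is a placeholder.
os.environ["OPENAI_API_KEY"] = "<YOUR_KEY>"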

Step 3: Build your pipeline

Example 1 (Linear Pipeline):

from agent_dingo.llm.openai import OpenAI
from agent_dingo.core.blocks import PromptBuilder
from agent_dingo.core.message import UserMessage
from agent_dingo.core.state import ChatPrompt


# Model
gpt = OpenAI("gpt-3.5-turbo")

# Summary prompt block
summary_pb = PromptBuilder(
    [UserMessage("Summarize the text in 10 words: ```{text}```.")]
)

# Translation prompt block; {summarized_text} is filled from the pipeline state
# (the intermediate output produced earlier in the pipeline) rather than from run() kwargs
translation_pb = PromptBuilder(
    [UserMessage("Translate the text into {language}: ```{summarized_text}```.")],
    from_state=["summarized_text"],
)

# Chain the blocks with >>: build the summary prompt, run the LLM,
# build the translation prompt from the intermediate result, run the LLM again
pipeline = summary_pb >> gpt >> translation_pb >> gpt

input_text = """
The dingo is an ancient lineage of dog found in Australia. It has a lean, sturdy physique adapted for speed and endurance, a wedge-shaped skull, and colorations such as light ginger, black and tan, or creamy white. Dingoes share a close genetic relationship with the New Guinea singing dog, having diverged early from the domestic dog lineage. They typically form packs composed of a mated pair and their offspring, a social structure that has persisted throughout their roughly 3,500-year history in Australia.
"""

output = pipeline.run(text=input_text, language="french")
print(output)
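
The compiled pipeline can be reused with different runtime arguments; for example, requesting another target language (purely illustrative, using the same run() call as above):

# Reuse the pipeline with a different target language.
output_de = pipeline.run(text=input_text, language="german")
print(output_de)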

Example 2 (Agent):

from agent_dingo.agent import Agent
from agent_dingo.llm.openai import OpenAI
import requests

llm = OpenAI(model="gpt-3.5-turbo")
agent = Agent(llm, max_function_calls=3)

@agent.function
def get_temperature(city: str) -> str:
    """Retrieves the current temperature in a city.

    Parameters
    ----------
    city : str
        The city to get the temperature for.

    Returns
    -------
    str
        String representation of the JSON response from the weather API.
    """
    base_url = "https://api.openweathermap.org/data/2.5/weather"
    params = {
        "q": city,
        "appid": "<openweathermap_api_key>",
        "units": "metric"
    }
    response = requests.get(base_url, params=params)
    data = response.json()
    return str(data)

pipeline = agent.as_pipeline()
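
The example above stops at the pipeline object. Below is a minimal invocation sketch, assuming the agent pipeline is run like the linear pipeline in Example 1 and accepts a chat-style prompt; the exact entry point may differ, so consult the documentation:

from agent_dingo.core.message import UserMessage
from agent_dingo.core.state import ChatPrompt

# Assumption: run() accepts a ChatPrompt state directly; check the documentation
# for the exact signature before relying on this.
query = ChatPrompt([UserMessage("What is the current temperature in Melbourne?")])
output = pipeline.run(query)
print(output)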

For a more detailed overview and additional examples, please refer to the documentation.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

agent_dingo-1.0.0.tar.gz (27.7 kB)

Built Distribution

agent_dingo-1.0.0-py3-none-any.whl (36.0 kB)
