Agent Dingo
A microframework for building LLM-powered pipelines and agents.
Dingo is a compact LLM orchestration framework designed for straightforward development of production-ready LLM-powered applications. It balances simplicity with flexibility: pipelines and agents can be assembled efficiently while you retain fine-grained control over the process.
Support us 🤝
You can support the project in the following ways:
- ⭐ Star Dingo on GitHub (click the star button in the top right corner)
- 💡 Provide your feedback or propose ideas in the issues section or Discord
- 📰 Post about Dingo on LinkedIn or other platforms
- 🔗 Check out our other projects: Scikit-LLM, Falcon
Quick Start & Documentation 🚀
Step 1: Install agent-dingo
```bash
pip install agent-dingo
```
Step 2: Configure your OpenAI API key
```bash
export OPENAI_API_KEY=<YOUR_KEY>
```
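If you prefer to set the key from within Python (e.g. in a notebook) rather than exporting it in the shell, the standard library's `os.environ` works as well; `<YOUR_KEY>` is a placeholder for your actual key:

```python
import os

# Sets the variable for the current process only; replace the
# placeholder with your real OpenAI API key.
os.environ["OPENAI_API_KEY"] = "<YOUR_KEY>"
```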
Step 3: Build your pipeline
Example 1 (Linear Pipeline):
````python
from agent_dingo.llm.openai import OpenAI
from agent_dingo.core.blocks import PromptBuilder
from agent_dingo.core.message import UserMessage
from agent_dingo.core.state import ChatPrompt

# Model
gpt = OpenAI("gpt-3.5-turbo")

# Summary prompt block
summary_pb = PromptBuilder(
    [UserMessage("Summarize the text in 10 words: ```{text}```.")]
)

# Translation prompt block
translation_pb = PromptBuilder(
    [UserMessage("Translate the text into {language}: ```{summarized_text}```.")],
    from_state=["summarized_text"],
)

# Pipeline
pipeline = summary_pb >> gpt >> translation_pb >> gpt

input_text = """
Dingo is an ancient lineage of dog found in Australia. Exhibiting a lean and sturdy physique adapted for speed and endurance, dingoes feature a wedge-shaped skull and come in colorations like light ginger, black and tan, or creamy white. They share a close genetic relationship with the New Guinea singing dog, diverging early from the domestic dog lineage. Dingoes typically form packs composed of a mated pair and their offspring, indicating social structures that have persisted through their history, dating back approximately 3,500 years in Australia.
"""

output = pipeline.run(text=input_text, language="french")
print(output)
````
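The `>>` operator above chains blocks so that each block's output feeds into the next. Dingo's internals are more involved (it threads prompts and state between blocks), but the core composition idea can be sketched framework-free with Python's `__rshift__`; all names below are hypothetical illustrations, not Dingo's actual API:

```python
class Block:
    """A hypothetical processing step; instances chain with >>."""

    def __init__(self, fn):
        self.fn = fn

    def __call__(self, value):
        return self.fn(value)

    def __rshift__(self, other):
        # a >> b yields a new Block that applies a, then b
        return Block(lambda value: other(self(value)))

# Toy stand-ins for prompt-builder and model blocks
upper = Block(str.upper)
exclaim = Block(lambda s: s + "!")

pipe = upper >> exclaim
print(pipe("dingo"))  # DINGO!
```

Because `>>` returns another `Block`, chains of any length compose the same way, which is what makes the four-stage pipeline in the example read as a single expression.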
Example 2 (Agent):
```python
from agent_dingo.agent import Agent
from agent_dingo.llm.openai import OpenAI
import requests

llm = OpenAI(model="gpt-3.5-turbo")
agent = Agent(llm, max_function_calls=3)

@agent.function
def get_temperature(city: str) -> str:
    """Retrieves the current temperature in a city.

    Parameters
    ----------
    city : str
        The city to get the temperature for.

    Returns
    -------
    str
        String representation of the JSON response from the weather API.
    """
    base_url = "https://api.openweathermap.org/data/2.5/weather"
    params = {
        "q": city,
        "appid": "<openweathermap_api_key>",
        "units": "metric",
    }
    response = requests.get(base_url, params=params)
    data = response.json()
    return str(data)

pipeline = agent.as_pipeline()
```
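Under the hood, function-calling agents expose each registered function to the LLM as a schema derived from its signature and docstring, which is why the annotations and docstring on `get_temperature` matter. A simplified, framework-independent sketch of that idea (the helper name and schema layout are assumptions for illustration, not Dingo's actual internals):

```python
import inspect

def build_schema(fn):
    """Derive a minimal OpenAI-style tool schema from a function (hypothetical helper)."""
    sig = inspect.signature(fn)
    # Partial mapping from Python annotations to JSON-schema type names
    type_map = {str: "string", int: "integer", float: "number", bool: "boolean"}
    properties = {
        name: {"type": type_map.get(param.annotation, "string")}
        for name, param in sig.parameters.items()
    }
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip().splitlines()[0],
        "parameters": {
            "type": "object",
            "properties": properties,
            "required": list(properties),
        },
    }

def get_temperature(city: str) -> str:
    """Retrieves the current temperature in a city."""
    return "20C"

schema = build_schema(get_temperature)
print(schema["name"])                      # get_temperature
print(schema["parameters"]["properties"])  # {'city': {'type': 'string'}}
```

The LLM sees only this schema; when it decides to call the tool, the framework executes the real Python function and feeds the return value back into the conversation.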
For a more detailed overview and additional examples, please refer to the documentation.