PyPI | Documentation | Discord
LionAGI
Towards Automated General Intelligence
LionAGI is an intelligent agent framework tailored for big data analysis with advanced machine learning tools. Designed for data-centric, production-level projects, LionAGI allows flexible and rapid design of agentic workflows, customized for your own data. LionAGI agents can manage and direct other agents, and can also use multiple different tools in parallel.
Integrate any Advanced Model into your existing workflow.
Install LionAGI with pip:
```
pip install lionagi
```
Download the .env_template file, input your appropriate API_KEY, save the file, rename it to .env, and put it in your project's root directory. By default we use OPENAI_API_KEY.
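For example, a minimal .env might look like the following (the key value shown is a placeholder; use your own key):

```
OPENAI_API_KEY=sk-your-key-here
```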
Intelligence Services
Provider | Type | Parallel Chat | Perform Action | Embeddings | MultiModal
---|---|---|---|---|---
OpenAI | API | ✅ | ✅ | |
OpenRouter | API | ✅ | | |
Ollama | Local | ✅ | | |
LiteLLM | Mixed | ✅ | | |
HuggingFace | Local | ✅ | | |
MLX | Local | ✅ | | |
Anthropic | API | | | |
Azure | API | | | |
Amazon | API | | | |
Google | API | | | |
MistralAI | API | | | |
Quick Start
The following example shows how to use LionAGI's Session object to interact with the gpt-4-turbo-preview model:
```python
# define the system message, context and user instruction
system = "You are a helpful assistant designed to perform calculations."
instruction = {"Addition": "Add the two numbers together i.e. x+y"}
context = {"x": 10, "y": 5}

# in an interactive environment (a .ipynb notebook, for example),
# `await` can be used directly at the top level
import lionagi as li

calculator = li.Session(system=system)
result = await calculator.chat(
    instruction=instruction, context=context, model="gpt-4-turbo-preview"
)
print(f"Calculation Result: {result}")
```
Or otherwise, outside of an interactive environment, you can use:

```python
import asyncio

from dotenv import load_dotenv
load_dotenv()  # loads OPENAI_API_KEY from the .env file

import lionagi as li

# same system message, instruction and context as above
system = "You are a helpful assistant designed to perform calculations."
instruction = {"Addition": "Add the two numbers together i.e. x+y"}
context = {"x": 10, "y": 5}

async def main():
    calculator = li.Session(system=system)
    result = await calculator.chat(
        instruction=instruction, context=context, model="gpt-4-turbo-preview"
    )
    print(f"Calculation Result: {result}")

if __name__ == "__main__":
    asyncio.run(main())
```
Visit our notebooks for examples.
LionAGI is designed to be asynchronous only; please check the official Python documentation on how async works.
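Because everything in LionAGI is asynchronous, independent requests can run concurrently. Below is a minimal sketch, reusing only the Session and chat API shown in the Quick Start and assuming that separate Session objects can be awaited side by side; the instructions and contexts are purely illustrative:

```python
import asyncio
import lionagi as li

system = "You are a helpful assistant designed to perform calculations."
instruction = {"Addition": "Add the two numbers together i.e. x+y"}

async def main():
    # one session per independent conversation
    adder_1 = li.Session(system=system)
    adder_2 = li.Session(system=system)

    # run both chat calls concurrently
    result_1, result_2 = await asyncio.gather(
        adder_1.chat(instruction=instruction, context={"x": 10, "y": 5}, model="gpt-4-turbo-preview"),
        adder_2.chat(instruction=instruction, context={"x": 2, "y": 3}, model="gpt-4-turbo-preview"),
    )
    print(result_1, result_2)

if __name__ == "__main__":
    asyncio.run(main())
```

Here each conversation gets its own Session so the two exchanges stay in separate histories while their API calls overlap.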
Notice:
- Calling the API at maximum throughput over a large set of data with advanced models (i.e. gpt-4) can get EXPENSIVE IN JUST SECONDS.
- Please know what you are doing, and check your usage on OpenAI regularly.
- Default rate limits are set to 1,000 requests and 100,000 tokens per minute; please check the OpenAI usage limit documentation. You can modify the token rate parameters to fit different use cases.
- If you would like to build from source, please download the latest release; main is under development and will change without notice.
Community
We encourage contributions to LionAGI and invite you to enrich its features and capabilities. Engage with us and other community members: Join Our Discord.
Citation
When referencing LionAGI in your projects or research, please cite:
```
@software{Li_LionAGI_2023,
  author = {Haiyang Li},
  month = {12},
  year = {2023},
  title = {LionAGI: Towards Automated General Intelligence},
  url = {https://github.com/lion-agi/lionagi},
}
```
Requirements
Python 3.9 or higher.
Project details
Download files
Source Distribution
Built Distribution
File details
Details for the file lionagi-0.0.209.tar.gz.
File metadata
- Download URL: lionagi-0.0.209.tar.gz
- Upload date:
- Size: 112.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.12.2
File hashes
Algorithm | Hash digest
---|---
SHA256 | 22fa0cfd89eda1e81b6c9a1e815bcc89e23f6c24dd5bb04698e57255ed631faa
MD5 | f1a550761570a9522359975509ef8e97
BLAKE2b-256 | 0cfc83a0cb549e3c6299806ffb89bd7e5c2f679ebcaa76a2bc7fae1a09ab50ce
File details
Details for the file lionagi-0.0.209-py3-none-any.whl.
File metadata
- Download URL: lionagi-0.0.209-py3-none-any.whl
- Upload date:
- Size: 133.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.12.2
File hashes
Algorithm | Hash digest
---|---
SHA256 | fbeb5e41d06fbdfbbc2131fbb90da99b5f476872bbb9bb55c6b8be17afd9602d
MD5 | e69971f4b81acfb35d6f2bb16fa45c08
BLAKE2b-256 | c0748495da03c6f5ccd3917bd5328f93afa8f248ed505e6ce9736a0beca26e9b