A Python package for creating simple AI Agents using the OpenAI API.
JAIms
My name is Bot, JAIms Bot. 🕶️
JAIms is a lightweight Python package that lets you build powerful LLM-based agents with ease. It is platform agnostic, so you can focus on integrating AI into your software and let JAIms handle the boilerplate of communicating with the LLM API. JAIms natively supports OpenAI's GPT models and Google's Gemini models (via Google's generative AI SDK), and it can be easily extended to connect to your own models and endpoints.
Installation
To avoid cluttering your project with dependencies, the install is split into a core package and optional provider extras. By running:
pip install jaims-py
you will get the core package, which is provider independent (meaning it won't install any dependencies other than Pillow). To also install the built-in providers, run:
pip install jaims-py[openai,google]
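If you only need one of the providers, you should be able to install its extra on its own, for example:
pip install jaims-py[openai]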
👨‍💻 Usage
Building an agent is as simple as this:
from jaims import JAImsAgent, JAImsMessage

agent = JAImsAgent.build(
    model="gpt-4o",
    provider="openai",
)

response = agent.run([JAImsMessage.user_message("Hello, how are you?")])
print(response)
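Because the agent is built from just a model name and a provider, switching backends should only require changing those two arguments. Here is a minimal sketch assuming the Google provider extra is installed; the model name "gemini-1.5-flash" is an illustrative placeholder, not a guaranteed default:

from jaims import JAImsAgent, JAImsMessage

# Same API as above, only the model and provider change.
# "gemini-1.5-flash" is an assumed example model name.
agent = JAImsAgent.build(
    model="gemini-1.5-flash",
    provider="google",
)

response = agent.run([JAImsMessage.user_message("Hello, how are you?")])
print(response)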
⚙️ Functions
Of course, an agent is just a chatbot if it doesn't support functions. JAIms uses the provider's built-in tools feature (such as OpenAI's tools) to call the functions you pass to it. Here's an example where we wrap a simple sum function and build an agent that lets you sum two numbers:
# Imports for the tool wrapper; this assumes these classes are exported from
# the top-level package, like JAImsAgent and JAImsMessage above.
from jaims import (
    JAImsAgent,
    JAImsDefaultHistoryManager,
    JAImsFunctionTool,
    JAImsFunctionToolDescriptor,
    JAImsJsonSchemaType,
    JAImsLLMConfig,
    JAImsMessage,
    JAImsParamDescriptor,
)


def sum(a: int, b: int):
    return a + b


sum_func_wrapper = JAImsFunctionTool(
    function=sum,
    function_tool_descriptor=JAImsFunctionToolDescriptor(
        name="sum",
        description="use this function when the user wants to sum two numbers",
        params_descriptors=[
            JAImsParamDescriptor(
                name="a",
                description="first operand",
                json_type=JAImsJsonSchemaType.NUMBER,
            ),
            JAImsParamDescriptor(
                name="b",
                description="second operand",
                json_type=JAImsJsonSchemaType.NUMBER,
            ),
        ],
    ),
)
def main():
    agent = JAImsAgent.build(
        model="gpt-4o",
        provider="openai",
        config=JAImsLLMConfig(
            temperature=1.0,
        ),
        history_manager=JAImsDefaultHistoryManager(
            history=[
                JAImsMessage.assistant_message(
                    text="Hello, I am JAIms, your personal assistant, How can I help you today?"
                )
            ]
        ),
        tools=[
            sum_func_wrapper,
        ],
    )

    while True:
        user_input = input("> ")
        if user_input == "exit":
            break

        response = agent.run_stream(
            [JAImsMessage.user_message(text=user_input)],
        )

        for chunk in response:
            print(chunk, end="", flush=True)
        print("\n")


if __name__ == "__main__":
    main()
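Streaming is optional: the same agent can also answer in one shot with run, and the tool call still happens under the hood. A minimal sketch reusing the sum_func_wrapper defined above:

agent = JAImsAgent.build(
    model="gpt-4o",
    provider="openai",
    tools=[sum_func_wrapper],
)

# The agent decides when to call the sum tool and returns the final answer.
response = agent.run([JAImsMessage.user_message("What is 15 + 27?")])
print(response)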
Check out the examples folder for more advanced use cases.
✨ Main Features
- Lightweight (like...really, I'm rather obsessed with this, so dependencies are kept to a minimum)
- Built-in support for OpenAI's GPT and Google's Gemini models
- Built-in history manager for quickly creating chatbots; it can be easily extended to support more advanced history management strategies
- Support for images for vision models 🖼️
- Error handling and exponential backoff for the built-in providers (OpenAI, Google)
I will routinely update the examples to demonstrate more advanced features.
⚠️ Project status
This is a work in progress. I still need to write some (many) tests and add more QoL features, but the core functionality is there. I'm creating this package because I need a lightweight, easy-to-use framework for building LLM agents and connecting to foundational LLM providers. The guiding philosophy is to build a platform-agnostic interface for integrating software with foundational LLM models, so that I can switch between LLM providers without changing my code, while still doing my best to preserve provider-specific features and extensibility.
I'm currently using and maintaining this package for my own projects and those of the company I work for, but I have opted for an open-source-by-default approach to let others benefit from it and to force myself to keep the code clean and well documented.
TODOS:
- Simplify function passing
- Add tests
- Refactor documentation and logging entirely
📝 License
The license will be MIT, but I still need to add it properly.