A Python package for creating simple AI Agents using the OpenAI API.
Project description
JAIms
My name is Bot, JAIms Bot. 🕶️
JAIms is a lightweight Python package that lets you build powerful LLM-based agents with ease. It is platform agnostic, so you can focus on integrating AI into your software and let JAIms handle the boilerplate of communicating with the LLM API. JAIms natively supports OpenAI's GPT models and Google's Gemini models (based on Google's generative AI), and it can be easily extended to connect to your own models and endpoints.
Installation
To avoid cluttering your project with dependencies you don't need, running:

```
pip install jaims-py
```

installs only the core package, which is provider independent (meaning it won't pull in any dependencies other than Pillow). To also install the built-in providers, run:

```
pip install jaims-py[openai,google]
```
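The extras can also be installed individually, in case you only need one of the built-in providers:

```
pip install jaims-py[openai]
```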
👨‍💻 Usage
Building an agent is as simple as this:
```python
from jaims import JAImsAgent, JAImsMessage

agent = JAImsAgent.build(
    model="gpt-4o",
    provider="openai",
)

response = agent.run([JAImsMessage.user_message("Hello, how are you?")])

print(response)
```
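Since the interface is platform agnostic, switching provider is just a matter of changing the build parameters. A minimal sketch, assuming the Google provider is registered under the same name as its install extra (`google`); the Gemini model id below is purely illustrative:

```python
from jaims import JAImsAgent, JAImsMessage

# Same API, different backend (the model id and provider name are assumptions).
agent = JAImsAgent.build(
    model="gemini-1.5-flash",
    provider="google",
)

response = agent.run([JAImsMessage.user_message("Hello, how are you?")])
print(response)
```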
⚙️ Functions
Of course, an agent is just a chatbot if it doesn't support functions. JAIms uses the native OpenAI tools (function calling) feature to call the functions you pass to it. Here's an example that defines a simple sum function and builds an agent that lets you sum two numbers:
```python
# Besides JAImsAgent and JAImsMessage, this example assumes the remaining
# helper classes are also exported from the top-level jaims package.
from jaims import (
    JAImsAgent,
    JAImsMessage,
    JAImsFunctionTool,
    JAImsFunctionToolDescriptor,
    JAImsParamDescriptor,
    JAImsJsonSchemaType,
    JAImsLLMConfig,
    JAImsDefaultHistoryManager,
)


def sum(a: int, b: int):
    return a + b


sum_func_wrapper = JAImsFunctionTool(
    function=sum,
    function_tool_descriptor=JAImsFunctionToolDescriptor(
        name="sum",
        description="use this function when the user wants to sum two numbers",
        params_descriptors=[
            JAImsParamDescriptor(
                name="a",
                description="first operand",
                json_type=JAImsJsonSchemaType.NUMBER,
            ),
            JAImsParamDescriptor(
                name="b",
                description="second operand",
                json_type=JAImsJsonSchemaType.NUMBER,
            ),
        ],
    ),
)


def main():
    agent = JAImsAgent.build(
        model="gpt-4o",
        provider="openai",
        config=JAImsLLMConfig(
            temperature=1.0,
        ),
        history_manager=JAImsDefaultHistoryManager(
            history=[
                JAImsMessage.assistant_message(
                    text="Hello, I am JAIms, your personal assistant. How can I help you today?"
                )
            ]
        ),
        tools=[
            sum_func_wrapper,
        ],
    )

    # Simple REPL: type "exit" to quit, anything else is sent to the agent.
    while True:
        user_input = input("> ")
        if user_input == "exit":
            break

        response = agent.run_stream(
            [JAImsMessage.user_message(text=user_input)],
        )

        for chunk in response:
            print(chunk, end="", flush=True)
        print("\n")


if __name__ == "__main__":
    main()
```
Check out the examples folder for more advanced use cases.
✨ Main Features
- Lightweight (like... really, I'm rather obsessed with this, so dependencies are kept to a minimum)
- Built-in support for OpenAI's GPT models and Google's Gemini models
- Built-in history manager for fast creation of chatbots; it can be easily extended to support more advanced history management strategies (see the sketch after this list)
- Support for images for vision models 🖼️
- Error handling and exponential backoff for the built-in providers (OpenAI, Google)
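For instance, the default history manager can be pre-seeded with earlier turns so the agent picks up an ongoing conversation. A minimal sketch, using only the classes already shown in the function example above (and assuming they are exported from the top-level package):

```python
from jaims import JAImsAgent, JAImsMessage, JAImsDefaultHistoryManager

# Pre-seed the conversation with a couple of earlier turns.
history_manager = JAImsDefaultHistoryManager(
    history=[
        JAImsMessage.user_message(text="My name is Alice."),
        JAImsMessage.assistant_message(text="Nice to meet you, Alice!"),
    ]
)

agent = JAImsAgent.build(
    model="gpt-4o",
    provider="openai",
    history_manager=history_manager,
)

# The seeded turns are part of the context on the next run.
print(agent.run([JAImsMessage.user_message(text="What's my name?")]))
```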
I will routinely update the examples to demonstrate more advanced features.
⚠️ Project status
This is a work in progress. I still need to write some (many) tests and add more QoL features, but the core functionality is there. I'm creating this package because I need a lightweight, easy-to-use framework for building LLM agents and connecting to foundational LLM providers. The guiding philosophy is to provide a platform-agnostic interface that makes it easy to integrate software with foundational LLM models and leverage AI features, so that I can switch between LLM providers without changing my code, while still doing my best to support provider-specific features and extensibility.
I'm currently using and maintaining this package for my own projects and those of the company I work for, but I have opted for an open-source-by-default approach so that others can benefit from it and I'm forced to keep the code clean and well documented.
TODOs:
- Simplify function passing
- Add tests
- Refactor documentation and logging entirely
📝 License
The license will be MIT, but I still need to add it properly.
Project details
Download files
Source Distribution
Built Distribution
File details
Details for the file jaims_py-2.0.0b3.tar.gz
File metadata
- Download URL: jaims_py-2.0.0b3.tar.gz
- Upload date:
- Size: 22.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.0 CPython/3.12.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | 41df36e15a794e07235a3602a7f7679f28684a0daacb4e904299371d932deec7
MD5 | 0c7667525049cb4aa41fad23c21b4dd8
BLAKE2b-256 | 44187065935dc0a34e0172216b277196ae4475a8eedae32eeae521436fa415c5
File details
Details for the file jaims_py-2.0.0b3-py3-none-any.whl
File metadata
- Download URL: jaims_py-2.0.0b3-py3-none-any.whl
- Upload date:
- Size: 24.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.0 CPython/3.12.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | 1520573f5cfa1674a02af500dccf461b9715f07dcdeb5e6365b89dcbb2328dc4
MD5 | 7225760f442838a58974ce7b6fe7ff0e
BLAKE2b-256 | 46d0cc53b1c338eaf5decbf9e01e23669262cb7041704f2f868bdaf1bb271694