A Python package for creating LLM-powered, agentic, platform-agnostic software.


JAIms

My name is Bot, JAIms Bot. 🕶️

JAIms is a lightweight Python package that lets you build powerful LLM-based agents or LLM-powered applications with ease. It is platform agnostic, so you can focus on integrating AI into your software while JAIms handles the boilerplate of communicating with the LLM API. The main goal of JAIms is to provide a simple, easy-to-use interface for leveraging the power of LLMs in your software, without worrying about the specifics of the underlying provider, and to integrate LLM functionality seamlessly with your own codebase. JAIms natively supports OpenAI's GPT models, Google's Gemini models, and Mistral models, and it can easily be extended to connect to your own models and endpoints.

Installation

To avoid cluttering your project with dependencies, the base install stays minimal. By running:

pip install jaims-py

You will get the core package, which is provider independent (meaning it installs no dependencies other than Pillow and Pydantic). To also install the built-in providers (currently OpenAI, Google, and Mistral), run:

pip install "jaims-py[openai,google,mistral]"

👨‍💻 Usage

Building an agent is as simple as this:

from jaims import JAImsAgent, JAImsMessage

agent = JAImsAgent.build(
    model="gpt-4o",
    provider="openai",
)

response = agent.run([JAImsMessage.user_message("Hello, how are you?")])

print(response)

⚙️ Function Tools

Of course, an agent is just a chatbot if it doesn't support functions. JAIms leverages the function-calling features of LLMs, integrating seamlessly with your Python code. It can either invoke your Python functions directly, or use a platform-agnostic tool descriptor to return formatted results that your code can easily consume (using Pydantic models).

Function Invocation

from jaims import JAImsAgent, JAImsMessage, jaimsfunctiontool

@jaimsfunctiontool()
def sum(a: int, b: int) -> int:
    print("invoked sum function")
    return a + b

agent = JAImsAgent.build(
    model="gpt-4o",
    provider="openai",
    tools=[sum],
)

response = agent.run([JAImsMessage.user_message("What is the sum of 42 and 420?")])
print(response)
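Under the hood, a decorator like `@jaimsfunctiontool` can derive a provider-agnostic tool schema from the function's signature and type hints. The following stdlib-only sketch illustrates the general idea; it is not JAIms' actual implementation, and `describe_tool` and `_TYPE_MAP` are hypothetical names:

```python
import inspect
from typing import get_type_hints

# Map Python annotations to JSON-schema type names (illustrative subset).
_TYPE_MAP = {int: "integer", float: "number", str: "string", bool: "boolean"}

def describe_tool(fn):
    """Build a JSON-schema-like tool descriptor from a function's signature.

    Hypothetical sketch: a real decorator would also handle defaults,
    docstrings, Optional fields, and nested models.
    """
    hints = get_type_hints(fn)
    params = {
        name: {"type": _TYPE_MAP.get(hints.get(name), "string")}
        for name in inspect.signature(fn).parameters
    }
    return {
        "name": fn.__name__,
        "parameters": {
            "type": "object",
            "properties": params,
            "required": list(params),
        },
    }

def add_numbers(a: int, b: int) -> int:
    return a + b

schema = describe_tool(add_numbers)
```

A dictionary like this is what ultimately gets translated into each provider's specific function-calling format.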

Formatted Results

import jaims
from typing import Optional


class MotivationalQuote(jaims.BaseModel):
    quote: str = jaims.Field(description="a motivational quote")
    author: Optional[str] = jaims.Field(
        default=None, description="the author of the quote, omit if it's your own"
    )


tool_descriptor = jaims.JAImsFunctionToolDescriptor(
    name="store_motivational_quote",
    description="use this tool to store a random motivational quote based on user's preferences",
    params=MotivationalQuote,
)


random_quote = jaims.JAImsAgent.run_tool_model(
    model="gpt-4o",
    provider="openai",
    descriptor=tool_descriptor,
    messages=[
        jaims.JAImsMessage.user_message("Motivate me in becoming a morning person.")
    ],
)
print(f"Quote: {random_quote.quote}\nAuthor: {random_quote.author or 'By an AI Poet'}")
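Conceptually, this works because the model is steered into calling the described tool, and the JSON arguments it produces are validated into a typed object. Here is a stdlib-only sketch of that parsing step, assuming a dataclass in place of the Pydantic model JAIms actually uses (`parse_tool_arguments` is a hypothetical name):

```python
import json
from dataclasses import dataclass, fields
from typing import Optional

@dataclass
class MotivationalQuote:
    quote: str
    author: Optional[str] = None

def parse_tool_arguments(raw_json: str, model_cls):
    """Validate a tool call's JSON arguments against a dataclass schema.

    Hypothetical sketch: JAIms relies on Pydantic, which also coerces
    types and produces detailed validation errors.
    """
    data = json.loads(raw_json)
    allowed = {f.name for f in fields(model_cls)}
    unknown = set(data) - allowed
    if unknown:
        raise ValueError(f"unexpected fields: {unknown}")
    return model_cls(**data)

# Simulated arguments, as an LLM might emit them for the tool call:
quote = parse_tool_arguments('{"quote": "Rise early, win the day."}', MotivationalQuote)
```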

But there is much more: check out the examples folder for more advanced or nuanced use cases.

✨ Main Features

  • Built-in support for OpenAI, Google's Gemini, and Mistral models (more coming soon).
  • Function-calling support, even in streamed conversations, with the built-in providers (OpenAI, Google, Mistral).
  • Built-in conversation history management for quickly creating chatbots; this can easily be extended to support more advanced history-management strategies.
  • Image support for multimodal LLMs 🖼️
  • Error handling and exponential backoff for the built-in providers (OpenAI, Google, Mistral).
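The exponential-backoff behavior mentioned above follows a common pattern: retry transient failures with doubling delays plus a little jitter. A generic sketch of the pattern (`with_backoff` is a hypothetical helper, and the built-in providers' actual retry policy may differ):

```python
import random
import time

def with_backoff(call, max_retries=5, base_delay=1.0, max_delay=30.0):
    """Run `call`, retrying on exceptions with exponentially growing delays."""
    for attempt in range(max_retries):
        try:
            return call()
        except Exception:
            if attempt == max_retries - 1:
                raise  # out of retries: surface the last error
            delay = min(base_delay * 2 ** attempt, max_delay)
            # Add up to 10% jitter so concurrent clients don't retry in lockstep.
            time.sleep(delay + random.uniform(0, delay * 0.1))
```

In practice the retry is usually limited to transient errors (rate limits, timeouts) rather than every exception, as above.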

🧠 Guiding Principles

JAIms grew out of the need for a lightweight, easy-to-use framework for creating LLM agents and integrating LLM functionality into Python projects. With more and more work involving both foundational and open-source LLMs, JAIms is designed as an abstraction layer that streamlines the creation of agentic business logic and its seamless integration with your codebase.

If you'd like to contribute, please keep in mind that I try to keep the code:

  • Modular: every component ships with a basic default implementation and an interface that can easily be extended for more complex use cases.
  • Type hinted and explicit: I've done my best to type hint everything and document the codebase thoroughly, so you don't have to dig into the code.
  • Tested: well... let's just say I could have done better, but I'm planning to improve code coverage and test automation in the near future.
  • Application focused: I'm not trying to build a library like LangChain or LlamaIndex for data-driven operations on LLMs; I'm building a simple, lightweight framework that leverages LLM function calling so that LLMs can easily be integrated into software applications.
  • Extensible: I'm planning to add more providers and more features.

As a side note, I've only recently begun using Python for production code, so I may have "contaminated" the codebase with approaches, patterns, or choices that aren't idiomatic or "pythonic". I'm more than happy to receive feedback and am open to suggestions on how to make the codebase cleaner and more idiomatic, hopefully without too many breaking changes.

⚠️ Project status

I'm using this library in many of my projects without problems. That said, I've just revamped it entirely to support multiple providers and refactored the codebase to streamline function calling. I've done my best to test it thoroughly, but I can't guarantee nothing will break.

I'm actively working on this project and I'm open to contributions, so feel free to open an issue or a PR if you find something that needs fixing or improving.

My next steps are to improve tests and documentation, and to extend the built-in providers to support more models.

Since I started developing JAIms, a few similar projects have appeared. I haven't had time to check them out yet, and some may well be more advanced. Still, I've used this library extensively in my own projects and at the company I work for, and I've been actively maintaining it, so I plan to keep it up to date and improve it as much as I can.

I've opted for an open-source-by-default approach to let others benefit from the library and to force myself to keep the code clean and well documented. Just remember that this is, for now, a side project developed by me alone (and I'm fairly new to Python), so expect the occasional issue and don't expect an immediate patch; any help is very much appreciated 🤗.

📝 License

Copyright (c) 2023 Marco Musella (aka Mush). This project is licensed under the MIT License - see the LICENSE file for details.

