
A Python package for creating simple AI Agents using the OpenAI API.


jAIms

My name is Bot, jAIms Bot. 🕶️

jAIms is a lightweight Python framework built on top of the OpenAI library that lets you create powerful LLM agents. It is designed with simplicity and ease of use in mind and only depends on openai and tiktoken.

Installation

pip install jaims-py

👨‍💻 Usage

Building an agent is as simple as this:

from jaims import JAImsAgent

agent = JAImsAgent()

response = agent.run([
    {
        "role": "user",
        "content": "Hi!"
    }
])

print(response)

The parameters accepted by the run method are those specified in the official OpenAI docs.
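For example, assuming run simply forwards OpenAI chat completion parameters such as temperature and max_tokens alongside the messages (a minimal sketch based on the statement above, not a documented signature), a tuned call could look like this:

from jaims import JAImsAgent

agent = JAImsAgent()

# Assumption: run forwards OpenAI chat completion parameters
# (temperature, max_tokens, ...) together with the messages.
response = agent.run(
    [{"role": "user", "content": "Give me a one-line summary of what an LLM agent is."}],
    temperature=0.2,
    max_tokens=100,
)

print(response)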

⚙️ Functions

Of course, an agent is just a chatbot if it doesn't support functions. jAIms uses the built-in OpenAI function-calling feature to invoke the functions you pass to it. Here's an example where we create a sum function and build a simple agent that lets you add two numbers:

from jaims import (
    JAImsAgent,
    JAImsFuncWrapper,
    JAImsGPTModel,
    JAImsJsonSchemaType,
    JAImsParamDescriptor,
)


def sum(a: int, b: int):
    return a + b

# JAImsFuncWrapper is a class that wraps your function: it receives
# the actual function plus all the info the LLM needs to invoke it.
func_wrapper = JAImsFuncWrapper(
    function=sum, 
    name="sum", 
    description="use this function when the user wants to sum two numbers",
    params_descriptors=[
        JAImsParamDescriptor(
            name="a",
            description="first operand",
            json_type=JAImsJsonSchemaType.NUMBER,
        ),
        JAImsParamDescriptor(
            name="b",
            description="second operand",
            json_type=JAImsJsonSchemaType.NUMBER,
        ),
    ],
)

# instantiate the agent passing the functions
agent = JAImsAgent(
    functions=[func_wrapper],
    model=JAImsGPTModel.GPT_3_5_TURBO_16K,
)

# a simple loop that simulates a chatbot
while True:
    user_input = input("> ")
    if user_input == "exit":
        break
    response = agent.run(
        [{"role": "user", "content": user_input}],
        stream=True,
    )

    for chunk in response:
        print(chunk, end="", flush=True)

    print("\n")

✨ Other features

  • Complete control over OpenAI call parameters (temperature, top_p, n, max_tokens, etc.)
  • Automatic chat history management
  • Configuration of the OpenAI model to use
  • Injectable prompt to shape agent behavior (see the sketch after this list)
  • Safety checks to prevent the agent from endlessly looping over function calls
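As a minimal sketch of two of the points above: the model is configured through the JAImsGPTModel enum used in the functions example, and behavior can already be shaped the plain OpenAI way by prepending a system message to the messages passed to run. The dedicated injectable-prompt option may expose a different API, so treat this as an illustration only:

from jaims import JAImsAgent, JAImsGPTModel

# Configure the model explicitly (enum member taken from the functions example above).
agent = JAImsAgent(model=JAImsGPTModel.GPT_3_5_TURBO_16K)

# Sketch: shape the agent's behavior with an OpenAI-style system message.
# jAIms' dedicated prompt-injection option may differ from this.
messages = [
    {"role": "system", "content": "You are a terse assistant. Answer in one sentence."},
    {"role": "user", "content": "What does jAIms do?"},
]

print(agent.run(messages))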

I will routinely update the examples to demonstrate more advanced features. I've also documented the code as best I can, so everything should be self-explanatory; I plan to add proper documentation in the future if this project gets enough traction.

🤖 Supported models

Currently, jAIms supports the newer OpenAI models with function calling enabled, specifically:

  • gpt-3.5-turbo-0613
  • gpt-3.5-turbo-16k-0613
  • gpt-4-0613

I'm not planning to add support for non-OpenAI models at the moment, but contributions are always appreciated.

⚠️ Project status

This is a work in progress. I still need to write tests and add many features, but the core functionality is there. I'm building jAIms because I need a lightweight, easy-to-use framework for creating LLM agents. It may not be as advanced as tools like langchain and others, but if you need a simple tool to create agents based on the OpenAI API, you might find jAIms useful.

TODOS:

  • Add tests
  • Add more examples
  • Add more chat history optimization strategies
  • Add function calling callbacks
  • Add history persistence

📝 License

The license will be MIT, but I still need to add it to the project properly.

