
useLLM - Use Large Language Models in Python App

The usellm Python library enables interaction with a chat-based Large Language Model (LLM) service. It can be used for various language-related tasks, such as generating chat conversations via the OpenAI API. It is designed as a Python port of the usellm JavaScript library.

Installation

The library can be installed with pip:

pip install usellm

Example Usage

Here is a basic usage example:

from usellm import Message, Options, UseLLM

# Initialize the service
service = UseLLM(service_url="https://usellm.org/api/llm")

# Prepare the conversation
messages = [
  Message(role="system", content="You are a helpful assistant."),
  Message(role="user", content="What can you do for me?"),
]
options = Options(messages=messages)

# Interact with the service
response = service.chat(options)

# Print the assistant's response
print(response.content)

The code above generates a response using the OpenAI ChatGPT API. The service URL "https://usellm.org/api/llm" is intended for testing only.

Classes and Methods

1. UseLLM class

The UseLLM class provides the interface for interacting with the LLM service.

Methods:

  • __init__(self, service_url: str): Initializes a new instance of the UseLLM class.
  • chat(self, options: Options) -> Message: Interacts with the LLM using the provided Options, and returns a Message instance that represents the LLM's response.
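
The chat method presumably POSTs the serialized messages to the service URL and extracts the assistant's reply from the JSON response. The sketch below illustrates that plausible flow using only the standard library; the payload shape, the "choices" parsing, and the function names are assumptions inferred from the error conditions listed under Exceptions, not the library's actual code.

```python
import json
import urllib.request

def parse_response(body: dict) -> dict:
    # Mirrors the error conditions the library is documented to check.
    if "error" in body:
        raise Exception(body["error"])
    if "choices" not in body:
        raise Exception("Response contains no 'choices' field")
    # Assume an OpenAI-style response: the first choice holds the reply.
    return body["choices"][0]["message"]

def chat_sketch(service_url: str, messages: list) -> dict:
    # Hypothetical request body; the real field names may differ.
    payload = json.dumps({"messages": messages}).encode("utf-8")
    req = urllib.request.Request(
        service_url,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        if resp.status != 200:
            raise Exception(f"Unexpected status code: {resp.status}")
        return parse_response(json.load(resp))
```

This is only a mental model of what a chat call does; the actual library may add headers, authentication, or a different payload schema.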

2. Options class

The Options class represents a set of configuration options for a chat interaction with the LLM.

Attributes:

  • messages: A list of Message instances representing the conversation up to the current point.
  • stream: A boolean indicating if the interaction is a streaming interaction. Note: streaming is currently not supported.
  • template: A string representing a message template to guide the conversation.
  • inputs: A dictionary of additional inputs for the conversation.

Methods:

  • __init__(self, messages: Optional[List[Message]] = [], stream: Optional[bool] = None, template: Optional[str] = None, inputs: Optional[dict] = None): Initializes a new instance of the Options class.
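
How template and inputs interact is not specified here. One plausible reading is that the template's placeholders are filled in from the inputs dictionary before the conversation is sent. The helper below is a hypothetical illustration of that idea; the {placeholder} syntax is an assumption, not the library's documented behavior.

```python
def fill_template(template: str, inputs: dict) -> str:
    # Replace each {key} placeholder with its value from inputs.
    # The placeholder syntax here is assumed, not taken from the library.
    text = template
    for key, value in inputs.items():
        text = text.replace("{" + key + "}", str(value))
    return text

prompt = fill_template(
    "Summarize the following text in {style} style:\n{text}",
    {"style": "bullet-point", "text": "Large language models are..."},
)
```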

3. Message class

The Message class represents a message in a conversation. It consists of two main attributes:

  • role: The role of the message sender. Typical values are system, user, and assistant.
  • content: The content of the message.

Methods:

  • __init__(self, role: str, content: str): Initializes a new instance of the Message class.
  • __repr__(self) -> str: Returns a string representation of the Message instance.
  • __str__(self) -> str: Returns a string representation of the Message instance.
  • to_dict(self) -> dict: Returns a dictionary representation of the Message instance.
  • to_json(self) -> str: Returns a JSON string representation of the Message instance.
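
To make the return values of these methods concrete, here is a minimal stand-in with the same interface (an illustrative sketch, not the library's actual source):

```python
import json

class Message:
    def __init__(self, role: str, content: str):
        self.role = role
        self.content = content

    def to_dict(self) -> dict:
        # A plain dict, suitable for building a request payload.
        return {"role": self.role, "content": self.content}

    def to_json(self) -> str:
        # The same data serialized as a JSON string.
        return json.dumps(self.to_dict())

    def __repr__(self) -> str:
        return f"Message(role={self.role!r}, content={self.content!r})"

    __str__ = __repr__

msg = Message(role="user", content="Hello!")
```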

Exceptions

The library raises an Exception in the following situations:

  • If the stream option is set to True, because streaming is not currently supported.
  • If the HTTP response status code from the LLM service is not 200.
  • If the HTTP response from the LLM service contains an "error" field.
  • If the HTTP response from the LLM service does not contain a "choices" field.
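
In practice this means calls to chat should be wrapped in a try/except block. The pattern is sketched below with a stand-in function that mimics the streaming check; the stand-in is illustrative, not library code.

```python
def chat_stub(stream: bool = False) -> str:
    # Mimics the documented behavior: streaming is rejected up front.
    if stream:
        raise Exception("Streaming is not supported")
    return "Hello! How can I help?"

try:
    reply = chat_stub(stream=True)
except Exception as err:
    reply = f"Request failed: {err}"
```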

Please create an issue to report bugs or suggest improvements. Learn more about the original JavaScript library here: https://usellm.org
