A Python module for conversational AI with ollama
Project description
chatollama
A Python module that streamlines conversational AI with the ollama library, providing efficient, customizable chat engines. ChatOllama offers conversation management and configuration options, making it well suited for building interactive assistants, customer-service bots, and other conversational AI applications.
Features
- Engine Class: Handles conversation flow, manages message attributes, and facilitates model responses with advanced configuration options.
- Conversation Tree: Supports branching conversations with a tree structure, enabling complex, multi-threaded interactions.
- Event Handling: Customizable events for response streaming, tool usage, and callback functions.
- Generation Parameters: Easily adjustable settings for response generation, including modes for creative, coding, and storytelling outputs.
Installation
To install ChatOllama, use the following pip command:
pip install chatollama
Usage Examples
Basic Usage
Setting Up a Conversation
from chatollama import Engine
# Initialize engine and start a conversation
engine = Engine(model="llama3.1:8b")
# Add user and assistant messages
engine.user("Hello, how are you?")
engine.assistant("Fantastic! I'm here to assist you. How can I help?")
engine.user("Great, can we get started with making a python project?")
# Start chat
engine.chat()
Branching and Tree Traversal
ChatOllama supports branching, allowing users to handle conversations that diverge based on user inputs.
conversation = engine.conversation
# Add messages and branch conversation
user_node = conversation.add_message(role="user", content="Tell me a story.")
branch_node = conversation.branch_message(user_node, role="assistant", content="Once upon a time...")
conversation.print_tree(conversation.root)
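The idea behind the conversation tree can be illustrated with a minimal, self-contained sketch. Note this is not chatollama's actual implementation; the `Node`, `add`, and `print_tree` names here are hypothetical stand-ins for the concept of branching a message into multiple assistant replies:

```python
# Minimal sketch of a branching conversation tree (illustrative only;
# chatollama's internal classes may differ).

class Node:
    def __init__(self, role, content):
        self.role = role
        self.content = content
        self.children = []

    def add(self, role, content):
        # Append a child message and return it so callers can branch further.
        child = Node(role, content)
        self.children.append(child)
        return child


def print_tree(node, depth=0):
    # Indent each level to make the branch structure visible.
    print("  " * depth + f"[{node.role}] {node.content}")
    for child in node.children:
        print_tree(child, depth + 1)


root = Node("system", "You are a helpful assistant.")
user = root.add("user", "Tell me a story.")
user.add("assistant", "Once upon a time...")      # branch 1
user.add("assistant", "In a galaxy far away...")  # branch 2
print_tree(root)
```

Each user message can hold several alternative assistant replies, which is what makes multi-threaded interactions possible.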
Customizing Generation Parameters
To create focused, creative, or story-driven responses, ChatOllama provides multiple configuration options.
from chatollama import GenerationParameters
# Set engine options for storytelling
engine.options = GenerationParameters().story_telling()
# Set a user message
engine.user("Create a fantasy story for me.")
engine.chat()
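Conceptually, a preset like `story_telling()` bundles sampling settings tuned for a use case. The sketch below is an assumption about what such presets might return; the option names follow ollama's standard generation options (`temperature`, `top_p`, `repeat_penalty`), not a documented chatollama API:

```python
# Hypothetical preset functions illustrating the idea behind
# GenerationParameters; the real class may use different names/values.

def story_telling():
    # Higher temperature encourages varied, narrative output.
    return {"temperature": 1.0, "top_p": 0.95, "repeat_penalty": 1.1}


def coding():
    # Lower temperature favors deterministic, precise output.
    return {"temperature": 0.2, "top_p": 0.9, "repeat_penalty": 1.0}


options = story_telling()
```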
Advanced Features
Response Events
Attach custom callback functions to handle responses and events.
# Define a callback function for responses
def on_response(message):
    print("Response:", message)
# Register the callback function
engine.response_event.on(on_response)
# Send a message and trigger callback
engine.user("What's the weather like today?")
engine.chat()
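The `response_event.on(...)` call follows a common observer pattern: callbacks are registered on an event object and invoked when the event fires. A self-contained sketch of that pattern (a hypothetical `Event` class, not chatollama's actual implementation):

```python
# Minimal observer-pattern sketch mirroring the on(...) registration above.

class Event:
    def __init__(self):
        self._callbacks = []

    def on(self, callback):
        # Register a callback to run whenever the event fires.
        self._callbacks.append(callback)

    def fire(self, *args):
        # Invoke every registered callback with the event's arguments.
        for callback in self._callbacks:
            callback(*args)


received = []
event = Event()
event.on(lambda message: received.append(message))
event.fire("Hello!")
```

Because multiple callbacks can be registered, several handlers can react to the same response independently.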
Vision Support
ChatOllama allows vision-based responses for supported models.
engine = Engine("llama3.2-vision:11b")
engine.stream = True
engine.conversation.user(
    "Tell me about this image, 2 sentences please",
    images=["path/to/earth.png"],
)
# Any extra kwarg added to a message is passed through in the message dict
# that ollama expects. Currently only 'images' is supported, but future
# versions may accept other attachments such as videos or files.
def print_stream(mode, delta, text):
    if mode == 0:
        print("[AI]:")
    elif mode == 1:
        print(delta, end="")
    elif mode == 2:
        print("")
engine.stream_event.on(print_stream)
engine.chat()
# Over time, the console will print something like:
# The image shows a photograph of the Earth from space, with North America and Asia visible on either side of the Indian Ocean.
# The photo is centered in the middle of the planet's curvature, making its spherical shape apparent.
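From the example above, the stream callback appears to follow a three-mode protocol: mode 0 signals the start of a stream, mode 1 delivers a text delta, and mode 2 signals the end (this contract is inferred from the example; the engine's actual behavior may differ). A self-contained sketch of driving such a callback:

```python
# Simulate a streaming engine driving a three-mode callback
# (mode 0 = start, 1 = delta, 2 = end; inferred from the example above).

chunks = []


def collect(mode, delta, text):
    if mode == 0:
        chunks.clear()        # new stream starting
    elif mode == 1:
        chunks.append(delta)  # accumulate each delta
    elif mode == 2:
        pass                  # stream finished; full text is in `text`


def simulate_stream(tokens, callback):
    text = ""
    callback(0, "", text)
    for token in tokens:
        text += token
        callback(1, token, text)
    callback(2, "", text)


simulate_stream(["Hello", ", ", "world"], collect)
print("".join(chunks))  # Hello, world
```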
File details
Details for the file chatollama-0.2.13.tar.gz.
File metadata
- Download URL: chatollama-0.2.13.tar.gz
- Upload date:
- Size: 10.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 4e7e3e6dfbc96860beb20c28ea1a0dcadcf2d063028315e5027af0329d669241 |
| MD5 | d87a2deee2c11d114b7f2d7e9e1ff45e |
| BLAKE2b-256 | 081735cbd93602734aaf5eace05ae62cfad9caa2f9fc3fa120ef1d56219f9d93 |
File details
Details for the file chatollama-0.2.13-py3-none-any.whl.
File metadata
- Download URL: chatollama-0.2.13-py3-none-any.whl
- Upload date:
- Size: 8.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | e9624ff662453c8542e4cb1a2bf8831be4c82b1d3e89157f897309ace7ec4f2a |
| MD5 | f5d7a94b27edc9368279930f54ded653 |
| BLAKE2b-256 | 803226d492ce22db68468e8a21cc201bb7f8c50d0ca6835604218fcb15a79645 |