A holistic toolkit for interfacing with Ollama and other LLM hosts. Created mainly for private experimentation.
Project description
Python Module for Conversational AI Interactions
This Python module, built around the OllamaClient class, facilitates generating text completions and managing interactive chat sessions. It is designed to serve as a foundational tool for developers, researchers, and hobbyists who are exploring conversational AI technologies. The module provides a straightforward interface for sending prompts to a conversational AI model and receiving generated responses, suitable for a wide range of applications from chatbots to creative writing aids.
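Under the hood, an Ollama server is reached over HTTP, and text generation posts a JSON body to the /api/generate endpoint. A minimal sketch of how such a request body can be built (the default model name "llama3" and the helper function are illustrative assumptions, not part of this module's API):

```python
import json

def build_generate_payload(prompt: str, model: str = "llama3") -> bytes:
    # Ollama's /api/generate endpoint accepts a JSON body with the model
    # name, the prompt, and a stream flag; stream=False requests a single
    # complete response instead of incremental chunks.
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

body = build_generate_payload("Why is the sky blue?")
```

A client such as OllamaClient would send this body to the server (by default on port 11434) and parse the returned JSON.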
Features
- Sandbox Setup: Automates the creation of a sandbox environment for safe and isolated operation.
- Text Generation: Supports sending individual prompts to the AI model and receiving text completions.
- Interactive Chat: Allows for dynamic chat sessions with the AI, enabling real-time conversation simulations.
Getting Started
Installation
Ensure Python 3.6+ is installed. Clone this repository and install the required dependencies to get started:
git clone https://github.com/yourusername/conversational-ai-module.git
cd conversational-ai-module
pip install -r requirements.txt
Quick Start
Import the OllamaClient class in your Python script to begin interacting with the conversational AI model:
from interface.cls_ollama_client import OllamaClient
# Initialize the client
client = OllamaClient()
# Generate a single completion
response = client.generate_completion("Your prompt here.")
print(response)
Running Interactive Chat
To engage in an interactive chat session, you can use the following pattern in your script:
client = OllamaClient()
while True:
    user_input = input("Enter your prompt: ")
    response = client.generate_completion(user_input)
    print(response)
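Each generate_completion call in the loop above is independent, so the model sees no earlier turns. One way to carry context between turns is to accumulate the conversation history and resend it with every prompt. The Conversation helper and its plain-text prompt format below are illustrative assumptions, not part of the module's API:

```python
class Conversation:
    """Accumulates user/assistant turns into a single prompt string."""

    def __init__(self):
        self.turns = []  # list of (role, text) tuples, oldest first

    def add(self, role: str, text: str) -> None:
        self.turns.append((role, text))

    def as_prompt(self) -> str:
        # Flatten the full history so the model sees prior context on
        # each call, ending with an open "assistant:" turn to complete.
        return "\n".join(f"{role}: {text}" for role, text in self.turns) + "\nassistant:"

conv = Conversation()
conv.add("user", "Hello!")
prompt = conv.as_prompt()
```

Passing conv.as_prompt() to generate_completion on every turn (and adding each response back with conv.add("assistant", response)) gives the model a running memory of the session.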
Sandbox Environment
The module includes functionality to set up a sandbox environment, isolating your interactions and data. This is particularly useful for testing and development purposes.
Setup Sandbox
Call setup_sandbox() before starting your session to prepare the environment:
from your_module import setup_sandbox
setup_sandbox()
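The internals of the module's setup_sandbox() are not shown here. A hypothetical stand-in that achieves the same kind of isolation, by moving all file activity into a fresh temporary working directory, might look like this (the function body and prefix are assumptions, not the module's actual implementation):

```python
import os
import tempfile

def setup_sandbox(prefix: str = "llm_sandbox_") -> str:
    """Create an isolated working directory and switch into it.

    Hypothetical sketch of a sandbox setup: any files the session writes
    land in a throwaway directory instead of the project tree.
    """
    path = tempfile.mkdtemp(prefix=prefix)
    os.chdir(path)
    return path
```

Returning the path lets the caller clean the directory up afterwards, e.g. with shutil.rmtree.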
Contributing
Contributions are welcome! Please feel free to submit pull requests, report bugs, or suggest features.
License
This project is licensed under the MIT License - see the LICENSE file for more details.
Acknowledgments
- Thanks to the developers and contributors who made this project possible.
- Special thanks to OpenAI for providing the API and support for conversational AI research and development.
File details
Details for the file language_model_toolkit-0.1.tar.gz.
File metadata
- Download URL: language_model_toolkit-0.1.tar.gz
- Upload date:
- Size: 13.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.10.12
File hashes
Algorithm | Hash digest
---|---
SHA256 | 88ff4cd66afa755574131cfcdfd21cea7725ec35e6d27a6982e49bf1b129e5a4
MD5 | 4dc71295872b29d3a57d76008a349b0c
BLAKE2b-256 | 86f4f1561d07ff41cdf0232c9bdbbc2cb2584b96a20a24cdab87710fe0ba72f4
File details
Details for the file language_model_toolkit-0.1-py3-none-any.whl.
File metadata
- Download URL: language_model_toolkit-0.1-py3-none-any.whl
- Upload date:
- Size: 13.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.10.12
File hashes
Algorithm | Hash digest
---|---
SHA256 | f8d7fc2717a93551702b0f808f4eb28e98aad99feb3574d0d8fac89f77defdc7
MD5 | fcfafd27ea445f32ed19c8c21dbf299c
BLAKE2b-256 | b29def1bbcbc14829cbafe34082f20e3cd8894fabb530273cccfc1b5616fd879
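The digests listed above can be checked locally after downloading a distribution. A small sketch using Python's standard hashlib (the helper name is illustrative):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Hex SHA-256 digest of a file, for comparison with the tables above."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large archives don't need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()
```

Comparing sha256_of("language_model_toolkit-0.1.tar.gz") against the SHA256 row confirms the download is intact.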