A client library for the LoLLMs generate endpoint
Project description
lollms_client
Welcome to the lollms_client repository! This library is built by ParisNeo and provides a convenient way to interact with the lollms (Lord Of Large Language Models) API. It is available on PyPI and distributed under the Apache 2.0 License.
Installation
To install the library from PyPI using pip, run:
pip install lollms-client
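If you need a reproducible environment, you can pin the release documented on this page (0.6.6); replace the version number with any other published release as needed:
pip install lollms-client==0.6.6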
Usage
To use lollms_client, first import the LollmsClient class and initialize a client instance:
from lollms_client import LollmsClient
# Initialize the LollmsClient instance
lc = LollmsClient("http://localhost:9600")
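The constructor takes the address of a running lollms server; the example above uses the default local address http://localhost:9600. If your server listens elsewhere, pass its address instead (the host below is only a placeholder):
from lollms_client import LollmsClient

# Point the client at a remote lollms server (address is illustrative)
lc = LollmsClient("http://192.168.1.42:9600")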
Text Generation
Use generate_text() to generate text from the lollms API.
response = lc.generate_text(prompt="Once upon a time", stream=False, temperature=0.5)
print(response)
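In practice you may want to guard the call against connection problems. The sketch below is a minimal wrapper that uses only the parameters shown above; the exact exception types raised by the client are not documented here, so it catches a generic Exception:
from lollms_client import LollmsClient

lc = LollmsClient("http://localhost:9600")

def safe_generate(prompt, temperature=0.5):
    # Call the documented generate_text() method and fall back to None
    # if the server is unreachable or the request fails.
    try:
        return lc.generate_text(prompt=prompt, stream=False, temperature=temperature)
    except Exception as exc:  # exact exception types are not specified here
        print(f"Generation failed: {exc}")
        return None

story = safe_generate("Once upon a time")
if story is not None:
    print(story)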
Completion
Use generate_completion() to get a completion of the prompt from the lollms API.
response = lc.generate_completion(prompt="What is the capital of France", stream=False, temperature=0.5)
print(response)
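Since generate_completion() takes a single prompt string, completing several prompts is just a client-side loop. This sketch uses only the parameters shown above:
questions = [
    "What is the capital of France",
    "What is the capital of Germany",
]
for question in questions:
    # One request per prompt, with the same settings as the example above
    answer = lc.generate_completion(prompt=question, stream=False, temperature=0.5)
    print(question, "->", answer)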
List Mounted Personalities
List the personalities mounted on the lollms server with the listMountedPersonalities() method.
response = lc.listMountedPersonalities()
print(response)
List Models
List the models available on the lollms server with the listModels() method.
response = lc.listModels()
print(response)
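The examples above simply print whatever the server returns. If you want to check programmatically that a particular model is available before generating, you can inspect the returned values. The sketch below assumes both calls return list-like collections of names, which may vary with your lollms server version; the model name is purely illustrative:
models = lc.listModels()
personalities = lc.listMountedPersonalities()

# Assumption: both calls return list-like collections of names.
print(f"{len(models)} models, {len(personalities)} mounted personalities")

wanted_model = "my-favourite-model"  # hypothetical model name
if wanted_model in models:
    print(f"{wanted_model} is available on the server")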
Complete Example
from lollms_client import LollmsClient
# Initialize the LollmsClient instance
lc = LollmsClient("http://localhost:9600")
# Generate Text
response = lc.generate_text(prompt="Once upon a time", stream=False, temperature=0.5)
print(response)
# Generate Completion
response = lc.generate_completion(prompt="What is the capital of France", stream=False, temperature=0.5)
print(response)
# List Mounted Personalities
response = lc.listMountedPersonalities()
print(response)
# List Models
response = lc.listModels()
print(response)
Feel free to contribute to the project by submitting issues or pull requests. Follow ParisNeo on GitHub, Twitter, Discord, Sub-Reddit, and Instagram for updates and news.
Happy coding!
Download files
Download the file for your platform.
Source Distribution: lollms_client-0.6.6.tar.gz
Built Distribution: lollms_client-0.6.6-py3-none-any.whl
File details
Details for the file lollms_client-0.6.6.tar.gz.
File metadata
- Download URL: lollms_client-0.6.6.tar.gz
- Upload date:
- Size: 48.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.11.8
File hashes
Algorithm | Hash digest
---|---
SHA256 | 1c6f5df8db5edda7ae3174b305fc32bd716847ea9bc98d043d9f2c120379f31e
MD5 | e63d69853124b6d251cab867aa837896
BLAKE2b-256 | 72aedc1278c1f8c3514641f9b9d2f1428b447df1c2deacec1bd0a9065541f971
File details
Details for the file lollms_client-0.6.6-py3-none-any.whl.
File metadata
- Download URL: lollms_client-0.6.6-py3-none-any.whl
- Upload date:
- Size: 52.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.11.8
File hashes
Algorithm | Hash digest
---|---
SHA256 | b2395db8783030ee9fed93b20ca15f6acf0c533c1a1b724a3d0bf20838ec71a8
MD5 | 76f6aefd3539481a9c1b2d1dd5b3aade
BLAKE2b-256 | 10a09eef7e12bcd546630cae85dbb4cb788d99d62c707694c2b13a197f93733a