
A client library for LoLLMs generate endpoint


lollms_client


Welcome to the lollms_client repository! This library is built by ParisNeo and provides a convenient way to interact with the lollms (Lord Of Large Language Models) API. It is available on PyPI and distributed under the Apache 2.0 License.

Installation

To install the library from PyPI using pip, run:

```bash
pip install lollms-client
```

Usage

To use lollms_client, first import and initialize the client:

```python
from lollms_client import LollmsClient

# Initialize the LollmsClient instance
lc = LollmsClient("http://localhost:9600")
```

Text Generation

Use the generate_text() method to generate text through the lollms API.

```python
response = lc.generate_text(prompt="Once upon a time", stream=False, temperature=0.5)
print(response)
```
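If the server is busy or the connection is flaky, a call can fail transiently. Below is a minimal retry sketch; `generate_with_retry` is a hypothetical helper (not part of the library) that only wraps the documented `generate_text()` method and assumes the client raises an exception on failure:

```python
import time

def generate_with_retry(client, prompt, retries=3, delay=1.0, **kwargs):
    """Call client.generate_text(), retrying on errors with a linear backoff.

    `client` is any object exposing generate_text(prompt=..., **kwargs),
    such as a LollmsClient instance. Assumes the client raises an
    exception (e.g. a connection error) when a call fails.
    """
    last_err = None
    for attempt in range(retries):
        try:
            return client.generate_text(prompt=prompt, **kwargs)
        except Exception as err:
            last_err = err
            time.sleep(delay * (attempt + 1))  # wait a bit longer each attempt
    raise last_err
```

Usage would then look like `generate_with_retry(lc, "Once upon a time", temperature=0.5)`.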

Completion

Use the generate_completion() method to get a completion of the prompt from the lollms API.

```python
response = lc.generate_completion(prompt="What is the capital of France?", stream=False, temperature=0.5)
print(response)
```

List Mounted Personalities

List the personalities currently mounted on the lollms server with the listMountedPersonalities() method.

```python
response = lc.listMountedPersonalities()
print(response)
```

List Models

List the models available on the lollms server with the listModels() method.

```python
response = lc.listModels()
print(response)
```
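The exact shape of the listModels() response can vary between lollms server versions (plain name strings vs. dictionaries). The helper below is a defensive sketch; the `name` and `model_name` keys are assumptions about the response shape, not documented API:

```python
def model_names(models):
    """Normalize a listModels() result to a flat list of name strings.

    Handles both plain-string and dict entries; the 'name' and
    'model_name' keys are assumptions about the server's response shape.
    """
    names = []
    for entry in models:
        if isinstance(entry, str):
            names.append(entry)
        elif isinstance(entry, dict):
            names.append(entry.get("name") or entry.get("model_name") or repr(entry))
    return names
```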

Complete Example

```python
from lollms_client import LollmsClient

# Initialize the LollmsClient instance
lc = LollmsClient("http://localhost:9600")

# Generate text
response = lc.generate_text(prompt="Once upon a time", stream=False, temperature=0.5)
print(response)

# Generate a completion
response = lc.generate_completion(prompt="What is the capital of France?", stream=False, temperature=0.5)
print(response)

# List mounted personalities
response = lc.listMountedPersonalities()
print(response)

# List models
response = lc.listModels()
print(response)
```
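Before running the example, it can help to verify that a lollms server is actually listening on the configured host. This standard-library sketch checks TCP reachability only (the default port 9600 comes from the examples above):

```python
import socket
from urllib.parse import urlparse

def server_reachable(host_url, timeout=2.0):
    """Return True if a TCP connection to the host in `host_url` succeeds."""
    parsed = urlparse(host_url)
    # Fall back to the scheme's default port if none is given in the URL
    port = parsed.port or (443 if parsed.scheme == "https" else 80)
    try:
        with socket.create_connection((parsed.hostname, port), timeout=timeout):
            return True
    except OSError:
        return False

if not server_reachable("http://localhost:9600", timeout=1.0):
    print("lollms server not reachable; start it before running the example")
```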

Feel free to contribute to the project by submitting issues or pull requests. Follow ParisNeo on GitHub, Twitter, Discord, Sub-Reddit, and Instagram for updates and news.

Happy coding!


Download files

Download the file for your platform.

Source Distribution

lollms_client-0.5.8.tar.gz (34.8 kB)

Built Distribution

lollms_client-0.5.8-py3-none-any.whl (38.7 kB)

File details

Details for the file lollms_client-0.5.8.tar.gz.

File metadata

  • Download URL: lollms_client-0.5.8.tar.gz
  • Size: 34.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.11.8

File hashes

Hashes for lollms_client-0.5.8.tar.gz:

  • SHA256: e3f027bb127b7310db8766f2147062c0a29a5cfc0bf938f90e7dafaa33adcb83
  • MD5: bb116808caed09ee710ab8efb56c38f0
  • BLAKE2b-256: 94200ae58eb71a8354e36ad8cf5a7ad91d0347cc4db0171d14173ea854fa2c30

File details

Details for the file lollms_client-0.5.8-py3-none-any.whl.

File metadata

  • Download URL: lollms_client-0.5.8-py3-none-any.whl
  • Size: 38.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.11.8

File hashes

Hashes for lollms_client-0.5.8-py3-none-any.whl:

  • SHA256: b7ccd31a3dc15e3550e16cd55728e6f03f58174625fd4f455e21e4b40027b6f7
  • MD5: 2145300a60cc9e71d7ee6be3d103d5f2
  • BLAKE2b-256: c17415a3b70d61aabe278258422ce9b737a5d7c842c141649ccfdb706f2b803e
