Language Model Proxy

Project description

ChatProxy

The goal of ChatProxy is to provide a Python library that enables developers to effortlessly interface with various LLMs, such as OpenAI's GPT-4, Google's BERT, and others.

This package encapsulates the complex interactions with these LLMs behind intuitive and user-friendly proxy classes, allowing developers to focus on leveraging the power of these advanced models without being bogged down by intricate API details and interfacing code.

Features (WIP)

  1. Multiple LLM Support: The package supports multiple LLMs, including GPT-4, BERT, and others, and is designed with flexibility in mind to accommodate future models.

  2. Uniform API: It provides a uniform API for different LLMs, allowing developers to switch between models without the need to extensively change the codebase.

  3. Error Handling: It includes built-in error handling and retry logic, ensuring that your application remains resilient against minor network hiccups and transient errors (see the sketch after this list).

  4. Optimized Performance: With caching and other optimizations, this package makes sure you get the best possible performance out of your chosen LLM.

  5. Asynchronous Support: For developers who require high performance and non-blocking code execution, this package offers asynchronous methods.
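
The library advertises its own built-in retry logic (feature 3 above). As an external illustration of the kind of resilience this refers to, here is a minimal sketch that retries a send, assuming only the Session and send calls shown in the Quickstart below; the send_with_retry helper, the retry counts, and the broad exception handling are illustrative and not part of the chatproxy API.

import time

import chatproxy


def send_with_retry(session, prompt, retries=3, backoff=1.0):
    # Illustrative helper only: chatproxy's built-in retry logic and its
    # exception types may differ from this sketch.
    for attempt in range(retries):
        try:
            return session.send(prompt)
        except Exception:
            if attempt == retries - 1:
                raise
            # Exponential backoff between attempts: 1s, 2s, 4s, ...
            time.sleep(backoff * 2 ** attempt)


session = chatproxy.Session()
message = send_with_retry(session, "Hello!")
print(message["content"])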

Installation

You can install the chatproxy package via pip:

pip install chatproxy

Quickstart

Here's an example of how to create a ChatGPT proxy and use it to generate text:

import chatproxy

# Start an interactive chat loop; type "exit" to quit.
session = chatproxy.Session()
while (prompt := input("you : ")) != "exit":
    message = session.send(prompt)
    print(f"\n{message['role']} : {message['content']}\n")
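
The same Session and send calls can also be used non-interactively. A minimal single-prompt sketch, assuming only the API shown above:

import chatproxy

# Open a session and send one prompt.
session = chatproxy.Session()
reply = session.send("Summarize what a language model proxy does in one sentence.")

# As in the quickstart, send() returns a message dict with 'role' and 'content' keys.
print(reply["content"])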

Documentation

For detailed information on using this package, please refer to our documentation.

Contributing

We welcome contributions! Please see our contributing guidelines for details.

License

This project is licensed under the terms of the MIT license. See LICENSE for more details.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

chatproxy-0.0.1.tar.gz (2.9 kB)

Uploaded Source

Built Distribution

chatproxy-0.0.1-py3-none-any.whl (3.7 kB)

Uploaded Python 3

File details

Details for the file chatproxy-0.0.1.tar.gz.

File metadata

  • Download URL: chatproxy-0.0.1.tar.gz
  • Upload date:
  • Size: 2.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.4

File hashes

Hashes for chatproxy-0.0.1.tar.gz
Algorithm Hash digest
SHA256 3dcc7932f0451b4ee5a3cedf8b8d5345e39a40a8c504590186f3b71b6b9a8647
MD5 16d2cf84dd92192f104884f870b4e393
BLAKE2b-256 b0d962dbe8677728b70e518cc00ba710ffb8ec58dcf0f30a3229398cb75019d7

See more details on using hashes here.

File details

Details for the file chatproxy-0.0.1-py3-none-any.whl.

File metadata

  • Download URL: chatproxy-0.0.1-py3-none-any.whl
  • Upload date:
  • Size: 3.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.4

File hashes

Hashes for chatproxy-0.0.1-py3-none-any.whl
Algorithm Hash digest
SHA256 86188779b01a9e41f5577ab00ebee16d549493187bc31585606b666f9b730d26
MD5 e545358f289ab551cf31d2afdb4af9ea
BLAKE2b-256 07008c3439baa9d9fe60fe5dbfbe3b3529fc0c6a5b33fcc6d0cc75c42372a7aa

See more details on using hashes here.
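
If you want pip to verify these artifacts at install time, the SHA256 digests above can be pinned in a requirements file and enforced with pip's hash-checking mode. A minimal sketch (note that --require-hashes then requires a hash for every requirement, including dependencies):

# requirements.txt
chatproxy==0.0.1 \
    --hash=sha256:3dcc7932f0451b4ee5a3cedf8b8d5345e39a40a8c504590186f3b71b6b9a8647 \
    --hash=sha256:86188779b01a9e41f5577ab00ebee16d549493187bc31585606b666f9b730d26

pip install --require-hashes -r requirements.txt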
