Language Model Proxy

Project description

ConnectLM

The goal of ConnectLM is to provide a Python library that enables developers to effortlessly interface with various LLMs, such as OpenAI's GPT-4, Google's BERT, and others.

This package encapsulates the complex interactions with these LLMs behind intuitive, user-friendly proxy classes, letting developers focus on leveraging these advanced models without being bogged down by intricate API details and interfacing code.

Features (WIP)

  1. Multiple LLM Support: The package supports multiple LLMs, including but not limited to GPT-4, BERT, and others. It is also designed with flexibility in mind to accommodate future models.

  2. Uniform API: It provides a uniform API for different LLMs, allowing developers to switch between models without the need to extensively change the codebase.

  3. Error Handling: It includes built-in error handling and retry logic, ensuring that your application remains resilient against minor network hiccups and transient errors.

  4. Optimized Performance: With caching and other optimizations, this package makes sure you get the best possible performance out of your chosen LLM.

  5. Asynchronous Support: For developers who require high performance and non-blocking code execution, this package offers asynchronous methods.
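To illustrate the design behind features 2 and 3, here is a minimal sketch of a uniform proxy class with built-in retry logic. This is NOT the connectlm API; all class and method names below are invented for illustration, and `EchoBackend` is a stand-in so the sketch runs without network access.

```python
import time


class ModelProxy:
    """Uniform interface: each LLM backend implements only _request()."""

    def __init__(self, retries=3, backoff=0.5):
        self.retries = retries
        self.backoff = backoff

    def _request(self, prompt):
        raise NotImplementedError  # implemented by a concrete backend

    def send(self, prompt):
        """Retry transient failures with exponential backoff."""
        for attempt in range(self.retries):
            try:
                return self._request(prompt)
            except ConnectionError:
                if attempt == self.retries - 1:
                    raise  # out of retries; surface the error
                time.sleep(self.backoff * 2 ** attempt)


class EchoBackend(ModelProxy):
    """Stand-in backend so the sketch runs offline."""

    def _request(self, prompt):
        return {"role": "assistant", "content": f"echo: {prompt}"}
```

Because every backend exposes the same `send()` method, swapping models means swapping one constructor call, which is the point of a uniform API.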

Installation

You can install the connectlm package via pip:

pip install connectlm

Quickstart

Here's an example of how to connect to ChatGPT and use it to generate text:

import connectlm as cm

# Create a chat proxy; send() forwards the prompt and returns the reply
# as a dict with "role" and "content" keys.
query = cm.QueryChat()

# Read prompts in a loop until the user types "exit".
while (prompt := input("you : ")) != "exit":
    message = query.send(prompt)
    print(f"\n{message['role']} : {message['content']}\n")
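The feature list also advertises asynchronous methods. The actual async API of connectlm is not documented here, so the sketch below uses a stub coroutine (`send_async` is an invented name) purely to show the pattern: firing several prompts concurrently with `asyncio.gather` instead of waiting on each reply in turn.

```python
import asyncio


async def send_async(prompt):
    """Stub standing in for a hypothetical async send; sleeps to mimic I/O."""
    await asyncio.sleep(0.01)
    return {"role": "assistant", "content": f"reply to: {prompt}"}


async def main(prompts):
    # Launch all requests concurrently; gather preserves input order.
    return await asyncio.gather(*(send_async(p) for p in prompts))


replies = asyncio.run(main(["hello", "goodbye"]))
for message in replies:
    print(f"{message['role']} : {message['content']}")
```

With real network calls, the concurrent version finishes in roughly the time of the slowest single request rather than the sum of all of them.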

Documentation

For detailed information on using this package, please refer to our documentation.

Contributing

We welcome contributions! Please see our contributing guidelines for details.

License

This project is licensed under the terms of the MIT license. See LICENSE for more details.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ConnectLM-0.0.1.tar.gz (4.8 kB)

Uploaded Source

Built Distribution

ConnectLM-0.0.1-py3-none-any.whl (5.9 kB)

Uploaded Python 3

File details

Details for the file ConnectLM-0.0.1.tar.gz.

File metadata

  • Download URL: ConnectLM-0.0.1.tar.gz
  • Upload date:
  • Size: 4.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.4

File hashes

Hashes for ConnectLM-0.0.1.tar.gz

  • SHA256: 021f44a1db9d7496f89d2adfe0df3800660b2b6bf5f81bed06c64596980745ac
  • MD5: 9b3ae05533b7327b8ef388d19570a7a4
  • BLAKE2b-256: d4b919ee228f671e7941eb80c143f2b337e2e08750584cc386df8e76370ed230

See more details on using hashes here.
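One use for the digests above is verifying a downloaded distribution before installing it. Below is a minimal sketch: the function names are invented for illustration, and the digest is read in chunks so large files do not need to fit in memory.

```python
import hashlib
import hmac


def sha256_of(path, chunk_size=8192):
    """Hex SHA256 digest of a file, read in chunks to bound memory use."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def matches(path, expected_hex):
    """Compare a file's digest with the published value (constant time)."""
    return hmac.compare_digest(sha256_of(path), expected_hex.lower())
```

For example, `matches("ConnectLM-0.0.1.tar.gz", "021f44a1...")` with the full SHA256 digest listed above should return True for an untampered download.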

File details

Details for the file ConnectLM-0.0.1-py3-none-any.whl.

File metadata

  • Download URL: ConnectLM-0.0.1-py3-none-any.whl
  • Upload date:
  • Size: 5.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.4

File hashes

Hashes for ConnectLM-0.0.1-py3-none-any.whl

  • SHA256: 306fb63b65a17929ec67e44ed5ac3f49279d65a89bbcf6ae55fe2e3c60efd319
  • MD5: e0fa053783c204d87c25cabb6347d14b
  • BLAKE2b-256: 50e242718311d3dbefe177685cd23d6ce599704dd4d6b67169c396aa067f39d9

