Language Model Proxy
ConnectLM
The goal of ConnectLM is to provide a Python library that lets developers effortlessly interface with various LLMs such as OpenAI's GPT-4, Google's BERT, and others.
This package encapsulates the complex interactions with these LLMs behind intuitive, user-friendly proxy classes, allowing developers to focus on leveraging the power of these advanced models without being bogged down by intricate API details and interfacing code.
Features (WIP)
- Multiple LLM Support: The package supports multiple LLMs, including but not limited to GPT-4, BERT, and others. It is also designed with flexibility in mind to accommodate future models.
- Uniform API: It provides a uniform API for different LLMs, allowing developers to switch between models without extensive changes to the codebase.
- Error Handling: It includes built-in error handling and retry logic, ensuring that your application remains resilient against minor network hiccups and transient errors.
- Optimized Performance: With caching and other optimizations, this package makes sure you get the best possible performance out of your chosen LLM.
- Asynchronous Support: For developers who require high performance and non-blocking code execution, this package offers asynchronous methods.
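The uniform-API idea above can be sketched in plain Python. Everything in this snippet (the `ChatModel` protocol and the backend class names) is illustrative only, not ConnectLM's actual interface:

```python
from typing import Protocol


class ChatModel(Protocol):
    """Minimal shared interface a backend must satisfy (illustrative)."""

    def send(self, prompt: str) -> dict: ...


class GPT4Backend:
    """Stub standing in for a real GPT-4-backed client."""

    def send(self, prompt: str) -> dict:
        return {"role": "assistant", "content": f"[gpt-4] reply to: {prompt}"}


class BERTBackend:
    """Stub standing in for a real BERT-backed client."""

    def send(self, prompt: str) -> dict:
        return {"role": "assistant", "content": f"[bert] reply to: {prompt}"}


def chat(model: ChatModel, prompt: str) -> str:
    # Application code depends only on the shared interface,
    # so swapping one backend for another needs no changes here.
    return model.send(prompt)["content"]


print(chat(GPT4Backend(), "hello"))
print(chat(BERTBackend(), "hello"))
```

The point of the sketch is the design choice: callers program against the protocol, so adding a new model is a matter of writing one adapter class.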
Installation
You can install the connectlm package via pip:
pip install connectlm
Quickstart
Here's an example of how to connect to ChatGPT and use it to generate text:
import connectlm as cm

query = cm.QueryChat()
# Read prompts until the user types "exit".
while (prompt := input("you : ")) != "exit":
    message = query.send(prompt)
    print(f"\n{message['role']} : {message['content']}\n")
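The retry logic described under Features can be illustrated with a small stdlib-only helper. This is a sketch of the general backoff technique, assuming a `send(prompt)` callable that may raise transient errors; it is not ConnectLM's internal implementation:

```python
import time


def send_with_retry(send, prompt, retries=3, base_delay=0.1):
    """Call send(prompt), retrying transient failures with
    exponential backoff (illustrative helper, not ConnectLM's API)."""
    for attempt in range(retries):
        try:
            return send(prompt)
        except ConnectionError:
            if attempt == retries - 1:
                raise  # out of attempts: surface the error to the caller
            time.sleep(base_delay * 2 ** attempt)  # 0.1s, 0.2s, ...


# Example: a flaky sender that fails twice, then succeeds.
calls = {"n": 0}


def flaky_send(prompt):
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient network hiccup")
    return {"role": "assistant", "content": prompt.upper()}


print(send_with_retry(flaky_send, "hello"))
```

With three attempts and a growing delay, brief network hiccups are absorbed while persistent failures still propagate as exceptions.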
Documentation
For detailed information on using this package, please refer to our documentation.
Contributing
We welcome contributions! Please see our contributing guidelines for details.
License
This project is licensed under the terms of the MIT license. See LICENSE for more details.
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file ConnectLM-0.0.1.tar.gz.
File metadata
- Download URL: ConnectLM-0.0.1.tar.gz
- Upload date:
- Size: 4.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.4
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 021f44a1db9d7496f89d2adfe0df3800660b2b6bf5f81bed06c64596980745ac |
| MD5 | 9b3ae05533b7327b8ef388d19570a7a4 |
| BLAKE2b-256 | d4b919ee228f671e7941eb80c143f2b337e2e08750584cc386df8e76370ed230 |
File details
Details for the file ConnectLM-0.0.1-py3-none-any.whl.
File metadata
- Download URL: ConnectLM-0.0.1-py3-none-any.whl
- Upload date:
- Size: 5.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.4
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 306fb63b65a17929ec67e44ed5ac3f49279d65a89bbcf6ae55fe2e3c60efd319 |
| MD5 | e0fa053783c204d87c25cabb6347d14b |
| BLAKE2b-256 | 50e242718311d3dbefe177685cd23d6ce599704dd4d6b67169c396aa067f39d9 |