A LangChain integration for HuggingChat models
langchain_huggy
langchain_huggy is a Python package that provides an easy-to-use interface for interacting with HuggingChat models through the LangChain framework.
Available Models
langchain_huggy comes with several pre-configured models:
- 'meta-llama/Meta-Llama-3.1-70B-Instruct'
- 'CohereForAI/c4ai-command-r-plus-08-2024'
- 'Qwen/Qwen2.5-72B-Instruct'
- 'meta-llama/Llama-3.2-11B-Vision-Instruct'
- 'NousResearch/Hermes-3-Llama-3.1-8B'
- 'mistralai/Mistral-Nemo-Instruct-2407'
- 'microsoft/Phi-3.5-mini-instruct'
You can choose any of these models when initializing the HuggingChat instance.
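As a minimal sketch, the list above can be kept as a plain Python constant and validated before passing a name to HuggingChat. The `AVAILABLE_MODELS` list and the `pick_model` helper here are illustrative, not part of the package's API:

```python
# Hypothetical helper: the pre-configured model identifiers from the list
# above, plus a small validator to catch typos before initialization.
AVAILABLE_MODELS = [
    "meta-llama/Meta-Llama-3.1-70B-Instruct",
    "CohereForAI/c4ai-command-r-plus-08-2024",
    "Qwen/Qwen2.5-72B-Instruct",
    "meta-llama/Llama-3.2-11B-Vision-Instruct",
    "NousResearch/Hermes-3-Llama-3.1-8B",
    "mistralai/Mistral-Nemo-Instruct-2407",
    "microsoft/Phi-3.5-mini-instruct",
]

def pick_model(name=None):
    """Return a validated model id, defaulting to the first entry."""
    if name is None:
        return AVAILABLE_MODELS[0]
    if name not in AVAILABLE_MODELS:
        raise ValueError(f"Unknown model: {name!r}")
    return name
```

The validated name can then be passed as the `model` argument when constructing the HuggingChat instance.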
Installation
Install the package using pip:
pip install langchain_huggy
Quick Start
Here's a simple example to get you started:
from langchain_huggy import HuggingChat
# Initialize the HuggingChat model
llm = HuggingChat(
    hf_email="your_huggingface_email@example.com",
    hf_password="your_huggingface_password",
    model="Qwen/Qwen2.5-72B-Instruct",  # Optional: specify a model
)
# Stream the response to a question
llm.pstream("Who is Modi?")
This prints the model's response to the question as it streams in, using the specified model.
Features
- Easy integration with LangChain
- Supports streaming responses
- Uses HuggingChat models
- Customizable with different model options
Configuration
You can configure the HuggingChat instance with the following parameters:
- hf_email: Your HuggingFace account email
- hf_password: Your HuggingFace account password
- model: (Optional) Specify a particular model to use from the available models list
Viewing Available Models
You can view the list of available models at any time using:
print(llm.get_available_models)
Note
Make sure to keep your HuggingFace credentials secure and never share them in public repositories.
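One common way to keep credentials out of source files is to read them from environment variables at runtime. This is a minimal sketch, assuming variable names `HF_EMAIL` and `HF_PASSWORD` (the helper and variable names are this example's choice, not part of the package):

```python
import os

# Hypothetical helper: load HuggingFace credentials from environment
# variables instead of hard-coding them in your script.
def load_hf_credentials():
    email = os.environ.get("HF_EMAIL")
    password = os.environ.get("HF_PASSWORD")
    if not email or not password:
        raise RuntimeError("Set HF_EMAIL and HF_PASSWORD in your environment")
    return email, password
```

The returned pair can then be passed to HuggingChat as `hf_email` and `hf_password`.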
License
This project is licensed under the MIT License.
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
Support
If you encounter any problems or have any questions, please open an issue on the GitHub repository.
Happy chatting with langchain_huggy!
Project details
File details
Details for the file langchain_huggy-0.1.3.tar.gz.
File metadata
- Download URL: langchain_huggy-0.1.3.tar.gz
- Upload date:
- Size: 4.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.11.10
File hashes
Algorithm | Hash digest
---|---
SHA256 | ea8d04dcccf8709e46c6c5ca7082c7542e68af98a4b9e979c1708ae03eae5992
MD5 | 7812a7baf648f49e5bd1fd5401f63819
BLAKE2b-256 | b389253b380ea3f3c366d2d646fd5cf9b190b58fecdc28b2058463c9532546a5
File details
Details for the file langchain_huggy-0.1.3-py3-none-any.whl.
File metadata
- Download URL: langchain_huggy-0.1.3-py3-none-any.whl
- Upload date:
- Size: 5.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.11.10
File hashes
Algorithm | Hash digest
---|---
SHA256 | 7bcf8ddf944bf91afd62ffe9f9891e1ed82c5a7ac23a883fe9b56b8b2de7ed9f
MD5 | 66738d5ea66dcc81bb37cf737488387b
BLAKE2b-256 | 9e2f9417ca0ed77d6580da11352c60462d89a4dcd6a98a88690d4bf4e8a82919