A LangChain integration for HuggingChat models

Project description

langchain_huggy

langchain_huggy is a Python package that provides an easy-to-use interface for interacting with HuggingChat models through the LangChain framework.

Available Models

langchain_huggy comes with several pre-configured models:

  1. 'meta-llama/Meta-Llama-3.1-70B-Instruct'
  2. 'CohereForAI/c4ai-command-r-plus-08-2024'
  3. 'Qwen/Qwen2.5-72B-Instruct'
  4. 'meta-llama/Llama-3.2-11B-Vision-Instruct'
  5. 'NousResearch/Hermes-3-Llama-3.1-8B'
  6. 'mistralai/Mistral-Nemo-Instruct-2407'
  7. 'microsoft/Phi-3.5-mini-instruct'

You can choose any of these models when initializing the HuggingChat instance.

Installation

Install the package using pip:

pip install langchain_huggy

Quick Start

Here's a simple example to get you started:

from langchain_huggy import HuggingChat

# Initialize the HuggingChat model
llm = HuggingChat(
    hf_email="your_huggingface_email@example.com",
    hf_password="your_huggingface_password",
    model="Qwen/Qwen2.5-72B-Instruct"  # Optional: specify a model
)

# Stream the response to a question
llm.pstream("Who is Modi?")

This streams the model's response to your question directly to the console, using the specified model.

Features

  • Easy integration with LangChain
  • Supports streaming responses
  • Uses HuggingChat models
  • Customizable with different model options

Configuration

You can configure the HuggingChat instance with the following parameters:

  • hf_email: Your HuggingFace account email
  • hf_password: Your HuggingFace account password
  • model: (Optional) Specify a particular model to use from the available models list
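Rather than hard-coding credentials in source (see the Note below), you can load them from environment variables before constructing the model. A minimal sketch: the variable names HF_EMAIL and HF_PASSWORD are illustrative choices here, not something the package requires.

```python
import os

def load_hf_credentials(env=None):
    """Read HuggingChat credentials from the environment.

    The variable names HF_EMAIL / HF_PASSWORD are illustrative,
    not mandated by langchain_huggy.
    """
    env = os.environ if env is None else env
    email = env.get("HF_EMAIL")
    password = env.get("HF_PASSWORD")
    if not email or not password:
        raise RuntimeError("Set HF_EMAIL and HF_PASSWORD before running.")
    return email, password

# Usage with the constructor shown above:
# email, password = load_hf_credentials()
# llm = HuggingChat(hf_email=email, hf_password=password)
```

This keeps secrets out of version control while still passing them through the same hf_email and hf_password parameters.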

Viewing Available Models

You can view the list of available models at any time using:

print(llm.get_available_models)

Note

Make sure to keep your HuggingFace credentials secure and never share them in public repositories.

License

This project is licensed under the MIT License.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

Support

If you encounter any problems or have any questions, please open an issue on the GitHub repository.

Happy chatting with langchain_huggy!

Download files

Download the file for your platform.

Source Distribution

langchain_huggy-0.1.4.tar.gz (4.7 kB)

Built Distribution

langchain_huggy-0.1.4-py3-none-any.whl (5.1 kB)

File details

Details for the file langchain_huggy-0.1.4.tar.gz.

File metadata

  • Download URL: langchain_huggy-0.1.4.tar.gz
  • Size: 4.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.10

File hashes

Hashes for langchain_huggy-0.1.4.tar.gz:

  • SHA256: 28bbf9125734ae54d4c5383a262b5e01621b618ecf0bbaac0bf2abf7e72fdeb3
  • MD5: 5f0b83e7494048f1eba46e150e87e2cd
  • BLAKE2b-256: a4e2819759909c316be06e1a927a8ee5c2d2e9b7d5ac42877cb57ecc90f417c9
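To confirm a downloaded distribution matches the published digest, you can compute its SHA-256 locally with Python's standard hashlib module. This is a generic sketch; the filename comes from this page, and the helper name is our own.

```python
import hashlib

def sha256_of_file(path, chunk_size=8192):
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large files don't need to fit in memory.
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the digest published above:
# sha256_of_file("langchain_huggy-0.1.4.tar.gz")
```

If the computed digest differs from the one listed here, the file was corrupted or tampered with in transit and should not be installed.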

File details

Details for the file langchain_huggy-0.1.4-py3-none-any.whl.

File hashes

Hashes for langchain_huggy-0.1.4-py3-none-any.whl:

  • SHA256: f26e36b43e10f6289b37d1967d807f4f1a4a206938fd3f4c079ff95c6f043168
  • MD5: 28905160e03930d62467fe16aed0a5b5
  • BLAKE2b-256: f93b629ba66945b9a42d7f4240c1d1e5cb9ba3517ac21b2031e5050f075e6c8f
