A LangChain integration for HuggingChat models

Project description

langchain_huggy

langchain_huggy is a Python package that provides an easy-to-use interface for interacting with HuggingChat models through the LangChain framework.

Available Models

langchain_huggy comes with several pre-configured models:

  1. 'meta-llama/Meta-Llama-3.1-70B-Instruct'
  2. 'CohereForAI/c4ai-command-r-plus-08-2024'
  3. 'Qwen/Qwen2.5-72B-Instruct'
  4. 'meta-llama/Llama-3.2-11B-Vision-Instruct'
  5. 'NousResearch/Hermes-3-Llama-3.1-8B'
  6. 'mistralai/Mistral-Nemo-Instruct-2407'
  7. 'microsoft/Phi-3.5-mini-instruct'

You can choose any of these models when initializing the HuggingChat instance.
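The model IDs above can be kept in code as a constant, with a small check before initialization so a typo fails fast rather than at request time. This is a minimal sketch: `AVAILABLE_MODELS` and `choose_model` are illustrative helpers written for this example, not part of the langchain_huggy package.

```python
# The pre-configured model IDs listed above, as plain strings.
AVAILABLE_MODELS = [
    "meta-llama/Meta-Llama-3.1-70B-Instruct",
    "CohereForAI/c4ai-command-r-plus-08-2024",
    "Qwen/Qwen2.5-72B-Instruct",
    "meta-llama/Llama-3.2-11B-Vision-Instruct",
    "NousResearch/Hermes-3-Llama-3.1-8B",
    "mistralai/Mistral-Nemo-Instruct-2407",
    "microsoft/Phi-3.5-mini-instruct",
]

def choose_model(name: str) -> str:
    """Return the model ID unchanged if it is pre-configured, else raise ValueError."""
    if name not in AVAILABLE_MODELS:
        raise ValueError(f"Unknown model {name!r}; pick one of {AVAILABLE_MODELS}")
    return name
```

The validated string can then be passed as the `model` argument when constructing the HuggingChat instance.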

Installation

Install the package using pip:

pip install langchain_huggy

Quick Start

Here's a simple example to get you started:

from langchain_huggy import HuggingChat

# Initialize the HuggingChat model
llm = HuggingChat(
    hf_email="your_huggingface_email@example.com",
    hf_password="your_huggingface_password",
    model="Qwen/Qwen2.5-72B-Instruct"  # Optional: specify a model
)

# Stream the response to a question
llm.pstream("Who is Modi?")

This streams the model's answer to your question to the console as it is generated, using the specified model.

Features

  • Easy integration with LangChain
  • Supports streaming responses
  • Uses HuggingChat models
  • Customizable with different model options

Configuration

You can configure the HuggingChat instance with the following parameters:

  • hf_email: Your HuggingFace account email
  • hf_password: Your HuggingFace account password
  • model: (Optional) Specify a particular model to use from the available models list
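Rather than hard-coding `hf_email` and `hf_password` as in the Quick Start, you can read them from the environment. This is a minimal sketch assuming two environment variables, `HF_EMAIL` and `HF_PASSWORD`; these names are chosen for the example and are not required by langchain_huggy.

```python
import os

def load_hf_credentials() -> tuple[str, str]:
    """Fetch HuggingFace credentials from the environment, failing loudly if unset."""
    email = os.environ.get("HF_EMAIL")
    password = os.environ.get("HF_PASSWORD")
    if not email or not password:
        raise RuntimeError("Set HF_EMAIL and HF_PASSWORD before initializing HuggingChat")
    return email, password

# Usage (assuming the variables are set):
# hf_email, hf_password = load_hf_credentials()
# llm = HuggingChat(hf_email=hf_email, hf_password=hf_password)
```

Keeping credentials out of source files also makes it harder to commit them to a repository by accident, which the note below warns against.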

Viewing Available Models

You can view the list of available models at any time using:

print(llm.get_available_models)

Note

Make sure to keep your HuggingFace credentials secure and never share them in public repositories.

License

This project is licensed under the MIT License.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

Support

If you encounter any problems or have any questions, please open an issue on the GitHub repository.

Happy chatting with langchain_huggy!

Download files

Download the file for your platform.

Source Distribution

langchain_huggy-0.1.3.tar.gz (4.6 kB, Source)

Built Distribution

langchain_huggy-0.1.3-py3-none-any.whl (5.0 kB, Python 3)

File details

Details for the file langchain_huggy-0.1.3.tar.gz.

File metadata

  • Download URL: langchain_huggy-0.1.3.tar.gz
  • Size: 4.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.10

File hashes

Hashes for langchain_huggy-0.1.3.tar.gz

  • SHA256: ea8d04dcccf8709e46c6c5ca7082c7542e68af98a4b9e979c1708ae03eae5992
  • MD5: 7812a7baf648f49e5bd1fd5401f63819
  • BLAKE2b-256: b389253b380ea3f3c366d2d646fd5cf9b190b58fecdc28b2058463c9532546a5

File details

Details for the file langchain_huggy-0.1.3-py3-none-any.whl.

File metadata

File hashes

Hashes for langchain_huggy-0.1.3-py3-none-any.whl

  • SHA256: 7bcf8ddf944bf91afd62ffe9f9891e1ed82c5a7ac23a883fe9b56b8b2de7ed9f
  • MD5: 66738d5ea66dcc81bb37cf737488387b
  • BLAKE2b-256: 9e2f9417ca0ed77d6580da11352c60462d89a4dcd6a98a88690d4bf4e8a82919
