A LangChain integration for HuggingChat models

Project description

langchain_huggy

langchain_huggy is a Python package that provides an easy-to-use interface for interacting with HuggingChat models through the LangChain framework. It allows you to leverage powerful language models in your applications with minimal setup.

Available Models

langchain_huggy comes with several pre-configured models:

  1. 'meta-llama/Meta-Llama-3.1-70B-Instruct'
  2. 'CohereForAI/c4ai-command-r-plus-08-2024'
  3. 'Qwen/Qwen2.5-72B-Instruct' (default)
  4. 'meta-llama/Llama-3.2-11B-Vision-Instruct'
  5. 'NousResearch/Hermes-3-Llama-3.1-8B'
  6. 'mistralai/Mistral-Nemo-Instruct-2407'
  7. 'microsoft/Phi-3.5-mini-instruct'

You can choose any of these models when initializing the HuggingChat instance.
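
For example, here is a minimal sketch that selects a non-default model from the list above (the credentials are placeholders for your own HuggingFace account, and the model string was picked arbitrarily for illustration):

from langchain_huggy import HuggingChat

# Any model string from the list above works here; this one is just an example.
llm = HuggingChat(
    hf_email="your_huggingface_email@example.com",
    hf_password="your_huggingface_password",
    model="microsoft/Phi-3.5-mini-instruct"
)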

Installation

Install the package using pip:

pip install langchain_huggy

Quick Start

Here's a simple example to get you started:

from langchain_huggy import HuggingChat
from langchain_core.messages import HumanMessage

# Initialize the HuggingChat model
llm = HuggingChat(
    hf_email="your_huggingface_email@example.com",
    hf_password="your_huggingface_password",
    model="Qwen/Qwen2.5-72B-Instruct"  # Optional: specify a model
)

# Generate a response
response = llm.invoke("hi!")
print(response.content)

# Stream a response (stream returns an iterator of message chunks)
for chunk in llm.stream("Tell me a short story about a robot."):
    print(chunk.content, end="", flush=True)

# Pass web_search=True to ground the response in web search results
response = llm.invoke("latest climate news", web_search=True)
print(response.content)

for chunk in llm.stream("Tell me a short story about a robot.", web_search=True):
    print(chunk.content, end="", flush=True)

Features

  • Easy integration with LangChain
  • Built-in web search support (pass web_search=True)
  • Support for multiple HuggingChat models
  • Built-in error handling and type checking

Configuration

You can configure the HuggingChat instance with the following parameters:

  • hf_email: Your HuggingFace account email
  • hf_password: Your HuggingFace account password
  • model: (Optional) Specify a particular model to use from the available models list

Available Methods

  • invoke: Generate a complete response for a given input
  • generate: Generate a ChatResult object (compatible with LangChain)
  • stream: Stream the response as an iterator of message chunks
  • pstream: Print the streamed response directly to console
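
A minimal sketch of invoke, stream, and pstream (credentials and prompts are placeholders; chunks are assumed to expose .content like standard LangChain message chunks, and generate, which returns a ChatResult as noted above, is not shown):

from langchain_huggy import HuggingChat

llm = HuggingChat(
    hf_email="your_huggingface_email@example.com",
    hf_password="your_huggingface_password"
)

# invoke: returns a complete message object
print(llm.invoke("What is LangChain?").content)

# stream: yields message chunks that you consume yourself
for chunk in llm.stream("Summarize LangChain in one sentence."):
    print(chunk.content, end="", flush=True)

# pstream: prints the streamed response to the console for you
# (assumed here to accept the same prompt input as stream)
llm.pstream("Write a haiku about robots.")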

Viewing Available Models

You can view the list of available models at any time using:

print(llm.get_available_models)

Error Handling

The package includes built-in error handling. If you encounter any issues during streaming or generation, informative error messages will be printed to the console.
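
If you prefer to handle failures yourself (for example, login or network errors), you can also wrap calls in an ordinary try/except. A minimal sketch, assuming llm is the HuggingChat instance from the Quick Start; the package does not document specific exception classes, so a broad except is used:

try:
    response = llm.invoke("hi!")
    print(response.content)
except Exception as exc:  # no specific exception classes are documented
    print(f"HuggingChat request failed: {exc}")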

Note on Credentials

Make sure to keep your HuggingFace credentials secure. You can set them as environment variables:

export HUGGINGFACE_EMAIL="your_email@example.com"
export HUGGINGFACE_PASSWD="your_password"

Never share your credentials in public repositories or include them directly in your code.
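
A minimal sketch of reading those variables at runtime and passing them to the constructor (whether the package picks them up automatically is not documented, so they are passed explicitly here):

import os

from langchain_huggy import HuggingChat

# Credentials come from the environment instead of being hard-coded
llm = HuggingChat(
    hf_email=os.environ["HUGGINGFACE_EMAIL"],
    hf_password=os.environ["HUGGINGFACE_PASSWD"]
)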

License

This project is licensed under the MIT License.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

Support

If you encounter any problems or have any questions, please open an issue on the GitHub repository.

Happy chatting with langchain_huggy!

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

langchain_huggy-0.2.6.tar.gz (9.8 kB)

Built Distribution

langchain_huggy-0.2.6-py3-none-any.whl (10.2 kB)

File details

Details for the file langchain_huggy-0.2.6.tar.gz.

File metadata

  • Download URL: langchain_huggy-0.2.6.tar.gz
  • Upload date:
  • Size: 9.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.25.1 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for langchain_huggy-0.2.6.tar.gz

  • SHA256: 569d902229efe808c51eef0d5638acd71e41f9acf7965d26d7ccbe5a968bda69
  • MD5: 058c7ee0e6c71f000ffd8cdee8e771de
  • BLAKE2b-256: 6c04fec4f287893034a4e5f3d2b313a4b8e2a166b5c545cd75d0ad56ef72c712

File details

Details for the file langchain_huggy-0.2.6-py3-none-any.whl.

File metadata

  • Download URL: langchain_huggy-0.2.6-py3-none-any.whl
  • Upload date:
  • Size: 10.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.25.1 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for langchain_huggy-0.2.6-py3-none-any.whl

  • SHA256: 52966b64d04f5d7dbe64215deba857374c4f43cba7fc13876ddb1011b205bc9f
  • MD5: 99df1a6950aa72b4a9adf8e62457aefd
  • BLAKE2b-256: 0257ca51f6ec9b86ab5af60c445a0e9cf11a1c6bdd8d53a8c0533beb04699614
