
Custom Langchain chat models for various large language model (LLM) service providers

Project description

Langchain Custom Models

This repository provides a collection of custom ChatModel integrations for Langchain, enabling support for various Large Language Model (LLM) service providers. The goal is to offer a unified interface for models that are not yet officially supported by Langchain.

Currently, the following provider is supported:

  • Volcengine Ark

Features

  • Seamless Integration: Works as a drop-in replacement wherever a standard Langchain ChatModel is expected.
  • Volcengine Ark Support: Full support for ChatVolcEngine to interact with Volcengine's Ark API.
  • Standard Interface: Works with standard Langchain message types (SystemMessage, HumanMessage, AIMessage, ToolMessage).
  • Tool Binding Support: Full support for bind_tools() method for function calling capabilities.

Installation

You can install the package directly from this repository:

pip install git+https://github.com/Hellozaq/langchain-custom-models.git

Additionally, ensure you have the Volcengine Ark SDK installed:

pip install "volcengine-python-sdk[ark]"

Usage

ChatVolcEngine

To use the ChatVolcEngine model, you need to provide your Volcengine Ark API key. The recommended approach is to use a .env file to manage your credentials securely.

1. Create a .env file

In your project's root directory, create a file named .env and add your API key:

VOLCANO_API_KEY="your-ark-api-key"

2. Load Credentials and Use the Model

Now you can use python-dotenv (pip install python-dotenv) to load the API key from the .env file into your environment.

Here is a basic example:

from dotenv import load_dotenv
from langchain_custom_models import ChatVolcEngine
from langchain_core.messages import HumanMessage, SystemMessage

# Load environment variables from .env file
load_dotenv()

# Initialize the chat model
# The API key is automatically read from the VOLCANO_API_KEY environment variable.
# Replace 'your-model-id' with the actual model ID from Volcengine Ark, e.g., 'deepseek-v3-250324'
llm = ChatVolcEngine(model="your-model-id")

# Prepare messages
messages = [
    SystemMessage(content="You are a helpful assistant."),
    HumanMessage(content="Hello, who are you?"),
]

# Get a response
response = llm.invoke(messages)

print(response.content)

Parameters

  • model (str): Required. The model ID from Volcengine Ark (e.g., deepseek-v3-250324).
  • ark_api_key (Optional[str]): Your Volcengine Ark API key. If not provided, it will be read from the VOLCANO_API_KEY environment variable.
  • max_tokens (int): The maximum number of tokens to generate. Defaults to 4096.
  • temperature (float): Controls the randomness of the output. Defaults to 0.7.
  • top_p (float): Nucleus sampling parameter. Defaults to 1.0.

Contributing

Contributions are welcome! If you would like to add support for a new LLM provider or improve existing integrations, please feel free to open a pull request.

License

This project is licensed under the MIT License. See the LICENSE file for details.

Download files

Source Distribution

  • langchain_custom_models-0.1.4.tar.gz (6.0 kB)

Built Distribution

  • langchain_custom_models-0.1.4-py3-none-any.whl (6.7 kB)

File details

Details for the file langchain_custom_models-0.1.4.tar.gz.

  • Size: 6.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.4

Hashes:

  • SHA256: bc398d87d070d11174cbcfccec2d3b06884ab81ecc40ee50eff4db88487e516c
  • MD5: 6613d8b9772087ec09c9dd93af466277
  • BLAKE2b-256: 38619b55957be3e3ae0be5e95a2f246466032bf04d7118f2b10486c19057f3a3

File details

Details for the file langchain_custom_models-0.1.4-py3-none-any.whl.

Hashes:

  • SHA256: 97df238521f9954f2a2d6dc167e77bb7cc1b5bea3d6ad31b19af2e8409797e6b
  • MD5: bccf3dc5dc64bf44634dcd94177f299f
  • BLAKE2b-256: 0742577cd9713f26f4f71541ad5c1706170f1f792ecdfb527ff1260106ec53b5
