
Custom LangChain chat models for various large language model service providers


Langchain Custom Models

This repository provides a collection of custom ChatModel integrations for LangChain, enabling support for various Large Language Model (LLM) service providers. The goal is to offer a unified interface for models that are not yet officially supported by LangChain.

Currently, the following provider is supported:

  • Volcengine Ark

Features

  • Seamless Integration: Drop-in replacement usable anywhere a LangChain ChatModel is expected.
  • Volcengine Ark Support: Full support for ChatVolcEngine to interact with Volcengine's Ark API.
  • Standard Interface: Works with standard LangChain message types (SystemMessage, HumanMessage, AIMessage, ToolMessage).
  • Tool Binding Support: Full support for bind_tools() method for function calling capabilities.

Installation

You can install the package directly from this repository:

pip install git+https://github.com/Hellozaq/langchain-custom-models.git

Additionally, ensure you have the Volcengine Ark SDK installed:

pip install "volcengine-python-sdk[ark]"

Usage

ChatVolcEngine

To use the ChatVolcEngine model, you need to provide your Volcengine Ark API key. The recommended approach is to use a .env file to manage your credentials securely.

1. Create a .env file

In your project's root directory, create a file named .env and add your API key:

VOLCANO_API_KEY="your-ark-api-key"

2. Load Credentials and Use the Model

Now you can use the python-dotenv package (pip install python-dotenv, if you have not installed it already) to load the API key from the .env file into your environment.

Here is a basic example:

from dotenv import load_dotenv
from langchain_custom_models import ChatVolcEngine
from langchain_core.messages import HumanMessage, SystemMessage

# Load environment variables from .env file
load_dotenv()

# Initialize the chat model
# The API key is automatically read from the VOLCANO_API_KEY environment variable.
# Replace 'your-model-id' with the actual model ID from Volcengine Ark, e.g., 'deepseek-v3-250324'
llm = ChatVolcEngine(model="your-model-id")

# Prepare messages
messages = [
    SystemMessage(content="You are a helpful assistant."),
    HumanMessage(content="Hello, who are you?"),
]

# Get a response
response = llm.invoke(messages)

print(response.content)

Parameters

  • model (str): Required. The model ID from Volcengine Ark (e.g., deepseek-v3-250324).
  • ark_api_key (Optional[str]): Your Volcengine Ark API key. If not provided, it will be read from the VOLCANO_API_KEY environment variable.
  • max_tokens (int): The maximum number of tokens to generate. Defaults to 4096.
  • temperature (float): Controls the randomness of the output. Defaults to 0.7.
  • top_p (float): Nucleus sampling parameter. Defaults to 1.0.

Contributing

Contributions are welcome! If you would like to add support for a new LLM provider or improve existing integrations, please feel free to open a pull request.

License

This project is licensed under the MIT License. See the LICENSE file for details.

