LangChain integration for SeekrFlow chat models

Project description

langchain-seekrflow

langchain-seekrflow is an open-source integration package that wraps SeekrFlow's chat model endpoints for seamless use with LangChain. It provides a single entry point, the ChatSeekrFlow class, for incorporating SeekrFlow's chat models into your LangChain applications.

Note: This integration package is maintained independently of the core LangChain repository; documentation for LangChain integrations lives in the LangChain docs.


Features

  • LangChain Compatibility: Implements the LangChain BaseChatModel interface.
  • Streaming Support: Enables token-level streaming of responses.
  • Flexible Input Handling: Accepts strings, lists of messages, or dictionary inputs.
  • Easy Integration: Designed to work seamlessly with other LangChain components such as prompt templates and runnables.

Installation

Ensure you have Python 3.10 or a compatible version installed. Then install via pip or Poetry:

Using pip

pip install langchain-seekrflow

Using Poetry

Clone the repository and install dependencies:

git clone https://github.com/yourusername/langchain-seekrflow.git
cd langchain-seekrflow
poetry install

Usage

Below is a quick example demonstrating how to use ChatSeekrFlow:

from langchain_core.prompts import ChatPromptTemplate
from langchain_core.messages import HumanMessage
from langchain_core.runnables import RunnableSequence
from langchain_seekrflow import ChatSeekrFlow
from seekrai import SeekrFlow

# Set your Seekr API key
SEEKR_API_KEY = "your-api-key-here"

# Initialize the SeekrFlow client (from seekrai)
seekr_client = SeekrFlow(api_key=SEEKR_API_KEY)

# Instantiate the ChatSeekrFlow model (for non-streaming mode)
llm = ChatSeekrFlow(
    client=seekr_client,
    model_name="meta-llama/Meta-Llama-3-8B-Instruct",
)

# Synchronous invocation example:
response = llm.invoke([HumanMessage(content="Hello, Seekr!")])
print("Response:", response.content)

# Chaining example:
prompt = ChatPromptTemplate.from_template("Translate to French: {text}")
chain: RunnableSequence = prompt | llm
result = chain.invoke({"text": "Good morning"})
print("Chained response:", result.content)

# Streaming example:
llm.streaming = True  # Enable streaming
for chunk in llm.stream([HumanMessage(content="Write me a haiku.")]):
    print(chunk.content, end="", flush=True)
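Each chunk yielded by the streaming loop above carries one piece of the reply in its content attribute, and callers commonly join those pieces into the full response text. Here is a minimal, library-free sketch of that accumulation pattern; FakeChunk and fake_stream are illustrative stand-ins for the objects yielded by llm.stream(...), not part of this package:

```python
class FakeChunk:
    """Stand-in for a streamed chunk; real chunks also expose .content."""

    def __init__(self, content: str) -> None:
        self.content = content


def fake_stream():
    # Simulates token-level streaming of a short reply.
    for piece in ["Hello", ", ", "Seekr", "!"]:
        yield FakeChunk(piece)


# Accumulate the streamed pieces into the complete response text.
full_text = "".join(chunk.content for chunk in fake_stream())
print(full_text)  # Hello, Seekr!
```

The same join works over the real generator, since llm.stream(...) yields chunks with a content attribute, as shown in the streaming example above.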

Configuration & Requirements

  • API Key: You must have a valid Seekr API key to authenticate requests. Set it as an environment variable or pass it to the SeekrFlow client.
  • Model Endpoint: Ensure your model endpoint is compatible with OpenAI’s chat format. ChatSeekrFlow can be used with both fine-tuned and custom SeekrFlow models.
  • Dependencies: This package depends on langchain, seekrai, and other libraries specified in the pyproject.toml.
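For the API-key requirement above, a minimal sketch of reading the key from the environment rather than hard-coding it; the variable name SEEKR_API_KEY is a convention assumed here, not mandated by the package:

```python
import os

# Read the key from the environment; SEEKR_API_KEY is an assumed variable name.
api_key = os.environ.get("SEEKR_API_KEY", "")
if not api_key:
    print("Warning: SEEKR_API_KEY is not set; requests will fail to authenticate.")

# The key would then be passed to the client exactly as in the usage example:
# seekr_client = SeekrFlow(api_key=api_key)
```

Keeping the key out of source code avoids accidentally committing credentials to version control.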

Contributing

Contributions are welcome! Please open issues or pull requests on GitHub to help improve the package. For guidelines, see our Contributing Guide.


License

This project is licensed under the MIT License. See the LICENSE file for details.


API Reference

LangChain maintains documentation for community integrations separately. Once this integration is added to their docs, you’ll be able to find ChatSeekrFlow in the LangChain integrations section.

Until then, refer to this repo’s code and examples for usage.


Happy coding!

Download files

Download the file for your platform.

Source Distribution

  • langchain_seekrflow-0.1.0.tar.gz (4.3 kB, source)

Built Distribution

  • langchain_seekrflow-0.1.0-py3-none-any.whl (5.2 kB, Python 3)

File details

Details for the file langchain_seekrflow-0.1.0.tar.gz.

File metadata

  • Download URL: langchain_seekrflow-0.1.0.tar.gz
  • Size: 4.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.0.1 CPython/3.10.12 Darwin/24.3.0

File hashes

  • SHA256: 78892e7f3d4cbb9ffeefcae82f083f1eae6f16998c9ee9408668db6739a46d66
  • MD5: bd02cf9255f4a377821130d9b16fd9bb
  • BLAKE2b-256: 6fd53d8d262628f72ee8709064b441b80b3793c1e4e3aa7427cfccd775d76712

File details

Details for the file langchain_seekrflow-0.1.0-py3-none-any.whl.

File hashes

  • SHA256: a1fa35e11919d4f31dd2111b44d4e9b80e9ce04da90810394d9f046adfd4032c
  • MD5: f871e3331ef653ac90b79a49d916616a
  • BLAKE2b-256: da50a83ce71f6bfb808bac6ae7c2cb2e15028539a92bda39e36ce573c1e996c9
