
Project description


LongTrainer - Production-Ready LangChain


Official Documentation

Explore the comprehensive LongTrainer Documentation for detailed instructions on installation, features, and API usage.

Introducing LongTrainer, a sophisticated extension of the LangChain framework designed specifically for managing multiple bots and providing isolated, context-aware chat sessions. Ideal for developers and businesses looking to integrate complex conversational AI into their systems, LongTrainer simplifies the deployment and customization of LLMs.

Installation

pip install longtrainer

Installation Instructions for Required Libraries and Tools

1. Linux (Ubuntu/Debian)

To install the required packages on a Linux system (specifically Ubuntu or Debian), you can use the apt package manager. The following command installs several essential libraries and tools:

sudo apt install libmagic-dev poppler-utils tesseract-ocr qpdf libreoffice pandoc

2. macOS

On macOS, you can install these packages using brew, the Homebrew package manager. If you don't have Homebrew installed, you can install it from brew.sh.

brew install libmagic poppler tesseract qpdf libreoffice pandoc

Features 🌟

  • Long Memory: Retains context effectively for extended interactions.
  • Multi-Bot Management: Easily configure and manage multiple bots within a single framework, perfect for scaling across various use cases (see the sketch after this list).
  • Isolated Chat Sessions: Each bot operates within its own session, ensuring interactions remain distinct and contextually relevant without overlap.
  • Context-Aware Interactions: Utilize enhanced memory capabilities to maintain context over extended dialogues, significantly improving user experience.
  • Scalable Architecture: Designed to scale effortlessly with your needs, whether you're handling hundreds of users or just a few.
  • Enhanced Customization: Tailor the behavior to fit specific needs.
  • Memory Management: Efficient handling of chat histories and contexts.
  • GPT Vision Support: Integration of context-aware, GPT-powered vision models.
  • Different Data Formats: Supports various data input formats.
  • VectorStore Management: Advanced management of vector storage for efficient retrieval.
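Below is a minimal sketch of multi-bot isolation using only the calls shown in the usage example further down (initialize_bot_id, add_document_from_path, create_bot, new_chat, get_response). The document paths and queries are placeholders, and OPENAI_API_KEY is assumed to be set since OpenAI is the default LLM.

from longtrainer.trainer import LongTrainer

# Assumes OPENAI_API_KEY is set in the environment (OpenAI is the default LLM)
trainer = LongTrainer(mongo_endpoint='mongodb://localhost:27017/')

# Each bot gets its own ID, documents, and chat sessions
support_bot = trainer.initialize_bot_id()
trainer.add_document_from_path('data/support_faq.pdf', support_bot)  # placeholder path
trainer.create_bot(support_bot)

sales_bot = trainer.initialize_bot_id()
trainer.add_document_from_path('data/sales_playbook.pdf', sales_bot)  # placeholder path
trainer.create_bot(sales_bot)

# Separate chat sessions keep context isolated per bot
support_chat = trainer.new_chat(support_bot)
sales_chat = trainer.new_chat(sales_bot)

print(trainer.get_response('How do I reset my password?', support_bot, support_chat))
print(trainer.get_response('Which plans include onboarding support?', sales_bot, sales_chat))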

Diverse Use Cases:

  • Enterprise Solutions: Streamline customer interactions, automate responses, and manage multiple departmental bots from a single platform.
  • Educational Platforms: Enhance learning experiences with AI tutors capable of maintaining context throughout sessions.
  • Healthcare Applications: Support patient management with bots that provide consistent, context-aware interactions.

Works with All LangChain-Supported LLMs and Embeddings

  • ✅ OpenAI (default)
  • ✅ VertexAI
  • ✅ HuggingFace
  • ✅ AWS Bedrock
  • ✅ Groq
  • ✅ TogetherAI

Example

VertexAI LLMs

from longtrainer.trainer import LongTrainer
from langchain_community.llms import VertexAI

llm = VertexAI()

trainer = LongTrainer(mongo_endpoint='mongodb://localhost:27017/', llm=llm)

TogetherAI LLMs

from longtrainer.trainer import LongTrainer
from langchain_community.llms import Together

llm = Together(
    model="togethercomputer/RedPajama-INCITE-7B-Base",
    temperature=0.7,
    max_tokens=128,
    top_k=1,
    # together_api_key="..."
)

trainer = LongTrainer(mongo_endpoint='mongodb://localhost:27017/', llm=llm)
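Groq LLMs

Groq appears in the supported list above; the following is an untested sketch that assumes the separate langchain-groq package is installed and uses a placeholder model name.

from longtrainer.trainer import LongTrainer
from langchain_groq import ChatGroq  # requires: pip install langchain-groq

llm = ChatGroq(
    model="llama-3.1-8b-instant",  # placeholder model name; use any Groq-hosted model
    temperature=0.7,
    # groq_api_key="..."
)

trainer = LongTrainer(mongo_endpoint='mongodb://localhost:27017/', llm=llm)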

Usage Example 🚀

Here's a quick start guide on how to use LongTrainer:

from longtrainer.trainer import LongTrainer
import os

# Set your OpenAI API key
os.environ["OPENAI_API_KEY"] = "sk-"

# Initialize LongTrainer
trainer = LongTrainer(mongo_endpoint='mongodb://localhost:27017/', encrypt_chats=True)
bot_id = trainer.initialize_bot_id()
print('Bot ID: ', bot_id)

# Add Data
path = 'path/to/your/data'
trainer.add_document_from_path(path, bot_id)

# Initialize Bot
trainer.create_bot(bot_id)

# Start a New Chat
chat_id = trainer.new_chat(bot_id)

# Send a Query and Get a Response
query = 'Your query here'
response = trainer.get_response(query, bot_id, chat_id)
print('Response: ', response)

Here's a guide on how to use Vision Chat:

vision_id = trainer.new_vision_chat(bot_id)

query = 'Your query here'
image_paths = ['nvidia.jpg']
response = trainer.get_vision_response(query, image_paths, str(bot_id), str(vision_id))
print('Response: ', response)

List Chats and Display Chat History:

# List all chats for a bot
trainer.list_chats(bot_id)

# Retrieve the full history of a specific chat
trainer.get_chat_by_id(chat_id=chat_id)

This project is still under active development. Community feedback and contributions are highly appreciated.

Citation

If you utilize this repository, please consider citing it with:

@misc{longtrainer,
  author = {Endevsols},
  title = {LongTrainer: Production-Ready LangChain},
  year = {2023},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/ENDEVSOLS/Long-Trainer}},
}

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

longtrainer-0.2.5.tar.gz (22.3 kB)

Uploaded Source

Built Distribution

longtrainer-0.2.5-py3-none-any.whl (21.4 kB)

Uploaded Python 3

File details

Details for the file longtrainer-0.2.5.tar.gz.

File metadata

  • Download URL: longtrainer-0.2.5.tar.gz
  • Upload date:
  • Size: 22.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.3

File hashes

Hashes for longtrainer-0.2.5.tar.gz

  • SHA256: 68c87d5a794c98e11abf980d8afe572adfe8d2fd2234d9df0d2050b8af2e8694
  • MD5: 41a9558f31d01962ee08149806073466
  • BLAKE2b-256: 11b4b4b6c4d8877c84ed70262ebc033a8cd014634c0c63255cb20dfb0df10786


File details

Details for the file longtrainer-0.2.5-py3-none-any.whl.

File metadata

  • Download URL: longtrainer-0.2.5-py3-none-any.whl
  • Upload date:
  • Size: 21.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.3

File hashes

Hashes for longtrainer-0.2.5-py3-none-any.whl

  • SHA256: 86c23d38ab2a619b7149d9e05bb6560d6356f5e6c15c36a92e5b13fb6f3812c8
  • MD5: 3f6aef4dc9c310a08ed8aadd82b0a5e4
  • BLAKE2b-256: 455d33d80f464f76e5f4dd34072024964eb6403f9bc5d66fd031a6eb2dbbf0ba

