
LangChain agent with LLM

This project has been archived by its maintainers; no new releases are expected.

Project description

Sonika LangChain Bot

A Python library that implements a conversational agent using LangChain, with tool execution and text classification capabilities.

Installation

pip install sonika-langchain-bot

Prerequisites

You'll need the following API keys:

  • OpenAI API Key

Create a .env file in the root of your project with the following variables:

OPENAI_API_KEY=your_api_key_here

Key Features

  • Conversational agent with tool execution capabilities
  • Text classification with structured output
  • Custom tool integration
  • Streaming responses
  • Conversation history management
  • Flexible instruction-based behavior

Basic Usage

Agent with Tools Example

import os
from dotenv import load_dotenv
from langchain_openai import OpenAIEmbeddings
from sonika_langchain_bot.langchain_tools import EmailTool
from sonika_langchain_bot.langchain_bot_agent import LangChainBot
from sonika_langchain_bot.langchain_class import Message, ResponseModel
from sonika_langchain_bot.langchain_models import OpenAILanguageModel

# Load environment variables
load_dotenv()

# Get API key from .env file
api_key = os.getenv("OPENAI_API_KEY")

# Initialize language model and embeddings
language_model = OpenAILanguageModel(api_key, model_name='gpt-4o-mini-2024-07-18', temperature=1)
embeddings = OpenAIEmbeddings(api_key=api_key)

# Configure tools
tools = [EmailTool()]

# Create agent instance
bot = LangChainBot(language_model, embeddings, instructions="You are an agent", tools=tools)

# Load conversation history
bot.load_conversation_history([Message(content="My name is Erley", is_bot=False)])

# Get response
user_message = 'Send an email with the tool to erley@gmail.com with subject Hello and message Hello Erley'
response_model: ResponseModel = bot.get_response(user_message)

print(response_model)

Streaming Response Example

import os
from dotenv import load_dotenv
from langchain_openai import OpenAIEmbeddings
from sonika_langchain_bot.langchain_bot_agent import LangChainBot
from sonika_langchain_bot.langchain_class import Message
from sonika_langchain_bot.langchain_models import OpenAILanguageModel

# Load environment variables
load_dotenv()

# Get API key from .env file
api_key = os.getenv("OPENAI_API_KEY")

# Initialize language model and embeddings
language_model = OpenAILanguageModel(api_key, model_name='gpt-4o-mini-2024-07-18', temperature=1)
embeddings = OpenAIEmbeddings(api_key=api_key)

# Create agent instance
bot = LangChainBot(language_model, embeddings, instructions="Only answer in English", tools=[])

# Load conversation history
bot.load_conversation_history([Message(content="My name is Erley", is_bot=False)])

# Get streaming response
user_message = 'Hello, what is my name?'
for chunk in bot.get_response_stream(user_message):
    print(chunk)
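The library's chunk type is not documented here; assuming each streamed chunk is a plain text fragment, the chunks can be accumulated into the full reply. A minimal sketch, with a hypothetical stand-in generator in place of `bot.get_response_stream`:

```python
# Hypothetical stand-in for bot.get_response_stream(user_message):
# any iterable of text fragments works the same way.
def fake_stream():
    yield "Your name "
    yield "is Erley."

# Print each fragment as it arrives, while also collecting them.
chunks = []
for chunk in fake_stream():
    print(chunk, end="")
    chunks.append(chunk)

# Join the fragments to get the complete response text.
full_response = "".join(chunks)
```

This pattern lets you display partial output immediately (for responsiveness) while still ending up with the complete text for logging or history.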

Text Classification Example

import os
from dotenv import load_dotenv
from sonika_langchain_bot.langchain_clasificator import TextClassifier
from sonika_langchain_bot.langchain_models import OpenAILanguageModel
from pydantic import BaseModel, Field

# Load environment variables
load_dotenv()

# Define classification structure with Pydantic
class Classification(BaseModel):
    intention: str = Field()
    sentiment: str = Field(..., enum=["happy", "neutral", "sad", "excited"])
    aggressiveness: int = Field(
        ...,
        description="describes how aggressive the statement is, the higher the number the more aggressive",
        enum=[1, 2, 3, 4, 5],
    )
    language: str = Field(
        ..., enum=["spanish", "english", "french", "german", "italian"]
    )

# Initialize classifier
api_key = os.getenv("OPENAI_API_KEY")
model = OpenAILanguageModel(api_key=api_key)
classifier = TextClassifier(llm=model, validation_class=Classification)

# Classify text
result = classifier.classify("how are you?")
print(result)
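Assuming `classify` returns a validated instance of the `Classification` model (which is how structured output is typically surfaced), the result's fields are plain attributes. A sketch with a hand-built instance standing in for the classifier's return value:

```python
from pydantic import BaseModel, Field

# Same schema as above, reproduced so this sketch is self-contained.
class Classification(BaseModel):
    intention: str = Field()
    sentiment: str = Field(..., enum=["happy", "neutral", "sad", "excited"])
    aggressiveness: int = Field(..., enum=[1, 2, 3, 4, 5])
    language: str = Field(..., enum=["spanish", "english", "french", "german", "italian"])

# Hand-built instance standing in for classifier.classify("how are you?").
result = Classification(
    intention="greeting",
    sentiment="happy",
    aggressiveness=1,
    language="english",
)

print(result.sentiment)       # happy
print(result.aggressiveness)  # 1
```

Because the result is a Pydantic model, you also get `result.model_dump()` (or `result.dict()` on Pydantic v1) for free when you need a plain dictionary.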

Available Classes and Components

Core Classes

  • LangChainBot: Main conversational agent for task execution with tools
  • OpenAILanguageModel: Wrapper for OpenAI language models
  • TextClassifier: Text classification using structured output
  • Message: Message structure for conversation history
  • ResponseModel: Response structure from agent interactions

Tools

  • EmailTool: Tool for sending emails through the agent

Project Structure

your_project/
├── .env                    # Environment variables
├── src/
│   └── sonika_langchain_bot/
│       ├── langchain_bot_agent.py
│       ├── langchain_clasificator.py
│       ├── langchain_class.py
│       ├── langchain_models.py
│       └── langchain_tools.py
└── tests/
    └── test_bot.py

Contributing

Contributions are welcome. Please open an issue to discuss major changes you'd like to make.

License

This project is licensed under the MIT License.

Download files

Source Distribution

sonika_langchain_bot-0.0.49.tar.gz (53.4 kB)

Built Distribution

sonika_langchain_bot-0.0.49-py3-none-any.whl (69.3 kB)

File details: sonika_langchain_bot-0.0.49.tar.gz

  • Size: 53.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.2

Hashes:

  • SHA256: ee2c2ebedfef200ae44999df58602ff5e7fe6f614041085ebf5395db1c84436e
  • MD5: 2544233aa942de1758f0d1739083e148
  • BLAKE2b-256: a4fa5dd140ea47ddacaad38b82915674908d085c60c9c9fb39f19c495e9a0487

File details: sonika_langchain_bot-0.0.49-py3-none-any.whl

Hashes:

  • SHA256: 282d6d37199dcca16e4b1bfcaa9634809c0ff7a2984436f1c7729aa016a12ae5
  • MD5: b9f4e5174d6da3677b6410852a92e318
  • BLAKE2b-256: c1cb596269f8f09ccf09f0ad57db4addab6332757bf6e3e2ca9cf5ff0780a95f
