LangChain agent with LLM

This project has been archived by its maintainers; no new releases are expected.

Project description

Sonika LangChain Bot

A Python library that implements a conversational agent using LangChain with tool execution capabilities and text classification.

Installation

pip install sonika-langchain-bot

Prerequisites

You'll need the following API keys:

  • OpenAI API Key

Create a .env file in the root of your project with the following variables:

OPENAI_API_KEY=your_api_key_here

Key Features

  • Conversational agent with tool execution capabilities
  • Text classification with structured output
  • Custom tool integration
  • Streaming responses
  • Conversation history management
  • Flexible instruction-based behavior

Basic Usage

Agent with Tools Example

import os
from dotenv import load_dotenv
from langchain_openai import OpenAIEmbeddings
from sonika_langchain_bot.langchain_tools import EmailTool
from sonika_langchain_bot.langchain_bot_agent import LangChainBot
from sonika_langchain_bot.langchain_class import Message, ResponseModel
from sonika_langchain_bot.langchain_models import OpenAILanguageModel

# Load environment variables
load_dotenv()

# Get API key from .env file
api_key = os.getenv("OPENAI_API_KEY")

# Initialize language model and embeddings
language_model = OpenAILanguageModel(api_key, model_name='gpt-4o-mini-2024-07-18', temperature=1)
embeddings = OpenAIEmbeddings(api_key=api_key)

# Configure tools
tools = [EmailTool()]

# Create agent instance
bot = LangChainBot(language_model, embeddings, instructions="You are an agent", tools=tools)

# Load conversation history
bot.load_conversation_history([Message(content="My name is Erley", is_bot=False)])

# Get response
user_message = 'Send an email with the tool to erley@gmail.com with subject Hello and message Hello Erley'
response_model: ResponseModel = bot.get_response(user_message)

print(response_model)

Streaming Response Example

import os
from dotenv import load_dotenv
from langchain_openai import OpenAIEmbeddings
from sonika_langchain_bot.langchain_bot_agent import LangChainBot
from sonika_langchain_bot.langchain_class import Message
from sonika_langchain_bot.langchain_models import OpenAILanguageModel

# Load environment variables
load_dotenv()

# Get API key from .env file
api_key = os.getenv("OPENAI_API_KEY")

# Initialize language model and embeddings
language_model = OpenAILanguageModel(api_key, model_name='gpt-4o-mini-2024-07-18', temperature=1)
embeddings = OpenAIEmbeddings(api_key=api_key)

# Create agent instance
bot = LangChainBot(language_model, embeddings, instructions="Only answer in English", tools=[])

# Load conversation history
bot.load_conversation_history([Message(content="My name is Erley", is_bot=False)])

# Get streaming response
user_message = 'Hello, what is my name?'
for chunk in bot.get_response_stream(user_message):
    print(chunk)
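To assemble the streamed chunks into one complete reply, they can simply be accumulated. This assumes each chunk yielded by get_response_stream is a plain text fragment, as the print loop above suggests:

```python
# Accumulate streamed text fragments into a single response string.
# The list below stands in for the chunks yielded by bot.get_response_stream();
# the plain-string chunk type is an assumption -- adjust if the library
# yields richer chunk objects.
chunks = ["Your ", "name ", "is ", "Erley."]

full_response = "".join(chunks)
print(full_response)  # → Your name is Erley.
```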

Text Classification Example

import os
from dotenv import load_dotenv
from sonika_langchain_bot.langchain_clasificator import TextClassifier
from sonika_langchain_bot.langchain_models import OpenAILanguageModel
from pydantic import BaseModel, Field

# Load environment variables
load_dotenv()

# Define classification structure with Pydantic
class Classification(BaseModel):
    intention: str = Field()
    sentiment: str = Field(..., enum=["happy", "neutral", "sad", "excited"])
    aggressiveness: int = Field(
        ...,
        description="describes how aggressive the statement is, the higher the number the more aggressive",
        enum=[1, 2, 3, 4, 5],
    )
    language: str = Field(
        ..., enum=["spanish", "english", "french", "german", "italian"]
    )

# Initialize classifier
api_key = os.getenv("OPENAI_API_KEY")
model = OpenAILanguageModel(api_key=api_key)
classifier = TextClassifier(llm=model, validation_class=Classification)

# Classify text
result = classifier.classify("how are you?")
print(result)

Available Classes and Components

Core Classes

  • LangChainBot: Main conversational agent for task execution with tools
  • OpenAILanguageModel: Wrapper for OpenAI language models
  • TextClassifier: Text classification using structured output
  • Message: Message structure for conversation history
  • ResponseModel: Response structure from agent interactions

Tools

  • EmailTool: Tool for sending emails through the agent
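Custom tools follow the same pattern as EmailTool: a name, a description the model uses to decide when to call the tool, and a method that does the work. The sketch below shows that general shape, but note that the exact base class LangChainBot expects is an assumption here; in practice you would subclass a LangChain tool base class and pass the instance in the tools list. WordCountTool is a hypothetical example, not part of this library.

```python
# Minimal sketch of a custom tool, assuming the conventional LangChain tool
# interface (name, description, _run). WordCountTool is hypothetical and not
# part of sonika-langchain-bot.
class WordCountTool:
    """Counts the words in a piece of text."""

    name = "word_count"
    description = "Counts the number of words in the given text."

    def _run(self, text: str) -> str:
        # The agent passes the tool input as a string and expects a string back.
        return f"{len(text.split())} words"

print(WordCountTool()._run("Hello Erley, how are you?"))  # → 5 words
```

An instance would then be passed to the agent alongside the built-in tool, e.g. tools=[EmailTool(), WordCountTool()].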

Project Structure

your_project/
├── .env                    # Environment variables
├── src/
│   └── sonika_langchain_bot/
│       ├── langchain_bot_agent.py
│       ├── langchain_clasificator.py
│       ├── langchain_class.py
│       ├── langchain_models.py
│       └── langchain_tools.py
└── tests/
    └── test_bot.py

Contributing

Contributions are welcome. Please open an issue to discuss major changes you'd like to make.

License

This project is licensed under the MIT License.

Download files

Download the file for your platform.

Source Distribution

sonika_langchain_bot-0.0.59.tar.gz (78.2 kB)

Uploaded Source

Built Distribution


sonika_langchain_bot-0.0.59-py3-none-any.whl (103.9 kB)

Uploaded Python 3

File details

Details for the file sonika_langchain_bot-0.0.59.tar.gz.

File metadata

  • Download URL: sonika_langchain_bot-0.0.59.tar.gz
  • Size: 78.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.2

File hashes

Hashes for sonika_langchain_bot-0.0.59.tar.gz:

  • SHA256: 083ae5eb78a72d7a31dfd31a03ee7a55e390724c09ceefeb6f806163aa2d4259
  • MD5: 21dd9e502f99b3208262cf1ac9478786
  • BLAKE2b-256: 382266408ee0de1ce9108a47d43cc404bfc21b5ae449ee66aee8994e458edfb6


File details

Details for the file sonika_langchain_bot-0.0.59-py3-none-any.whl.

File hashes

Hashes for sonika_langchain_bot-0.0.59-py3-none-any.whl:

  • SHA256: 0dc3eac2130ad83881a1d924a8deea24f4ee85abc62e784be074d0806f709eaa
  • MD5: a5227dd873fae82f11a8a0aae19a3fe2
  • BLAKE2b-256: ce7849f759705cf71e62b68b3a8beca26d7ff24e92873ffc678438ef55d39e1d

