
LangChain agent with LLM

This project has been archived by its maintainers; no new releases are expected.


Sonika LangChain Bot

A Python library that implements a conversational agent on top of LangChain, with tool execution and text classification capabilities.

Installation

pip install sonika-langchain-bot

Prerequisites

You'll need the following API keys:

  • OpenAI API Key

Create a .env file in the root of your project with the following variables:

OPENAI_API_KEY=your_api_key_here
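The examples below load this file with python-dotenv. If you'd rather avoid the extra dependency, a minimal stdlib loader can populate the same variable (a hypothetical helper, not part of this library):

```python
import os

def load_env_file(path: str = ".env") -> None:
    """Parse simple KEY=VALUE lines from a .env file into os.environ."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            # Skip blank lines and comments
            if not line or line.startswith("#"):
                continue
            key, _, value = line.partition("=")
            # Don't override variables already set in the environment
            os.environ.setdefault(key.strip(), value.strip())
```

This covers only the plain `KEY=VALUE` subset of the .env format; quoting, interpolation, and multiline values are features python-dotenv handles that this sketch does not.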

Key Features

  • Conversational agent with tool execution capabilities
  • Text classification with structured output
  • Custom tool integration
  • Streaming responses
  • Conversation history management
  • Flexible instruction-based behavior

Basic Usage

Agent with Tools Example

import os
from dotenv import load_dotenv
from langchain_openai import OpenAIEmbeddings
from sonika_langchain_bot.langchain_tools import EmailTool
from sonika_langchain_bot.langchain_bot_agent import LangChainBot
from sonika_langchain_bot.langchain_class import Message, ResponseModel
from sonika_langchain_bot.langchain_models import OpenAILanguageModel

# Load environment variables
load_dotenv()

# Get API key from .env file
api_key = os.getenv("OPENAI_API_KEY")

# Initialize language model and embeddings
language_model = OpenAILanguageModel(api_key, model_name='gpt-4o-mini-2024-07-18', temperature=1)
embeddings = OpenAIEmbeddings(api_key=api_key)

# Configure tools
tools = [EmailTool()]

# Create agent instance
bot = LangChainBot(language_model, embeddings, instructions="You are an agent", tools=tools)

# Load conversation history
bot.load_conversation_history([Message(content="My name is Erley", is_bot=False)])

# Get response
user_message = 'Send an email with the tool to erley@gmail.com with subject Hello and message Hello Erley'
response_model: ResponseModel = bot.get_response(user_message)

print(response_model)

Streaming Response Example

import os
from dotenv import load_dotenv
from langchain_openai import OpenAIEmbeddings
from sonika_langchain_bot.langchain_bot_agent import LangChainBot
from sonika_langchain_bot.langchain_class import Message
from sonika_langchain_bot.langchain_models import OpenAILanguageModel

# Load environment variables
load_dotenv()

# Get API key from .env file
api_key = os.getenv("OPENAI_API_KEY")

# Initialize language model and embeddings
language_model = OpenAILanguageModel(api_key, model_name='gpt-4o-mini-2024-07-18', temperature=1)
embeddings = OpenAIEmbeddings(api_key=api_key)

# Create agent instance
bot = LangChainBot(language_model, embeddings, instructions="Only answer in English", tools=[])

# Load conversation history
bot.load_conversation_history([Message(content="My name is Erley", is_bot=False)])

# Get streaming response
user_message = 'Hello, what is my name?'
for chunk in bot.get_response_stream(user_message):
    print(chunk)

Text Classification Example

import os
from dotenv import load_dotenv
from sonika_langchain_bot.langchain_clasificator import TextClassifier
from sonika_langchain_bot.langchain_models import OpenAILanguageModel
from pydantic import BaseModel, Field

# Load environment variables
load_dotenv()

# Define classification structure with Pydantic
class Classification(BaseModel):
    intention: str = Field()
    sentiment: str = Field(..., enum=["happy", "neutral", "sad", "excited"])
    aggressiveness: int = Field(
        ...,
        description="describes how aggressive the statement is, the higher the number the more aggressive",
        enum=[1, 2, 3, 4, 5],
    )
    language: str = Field(
        ..., enum=["spanish", "english", "french", "german", "italian"]
    )

# Initialize classifier
api_key = os.getenv("OPENAI_API_KEY")
model = OpenAILanguageModel(api_key=api_key)
classifier = TextClassifier(llm=model, validation_class=Classification)

# Classify text
result = classifier.classify("how are you?")
print(result)
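Structured-output classification of this kind generally works by having the model return JSON that matches the Pydantic schema, which is then validated field by field. The round trip can be illustrated without an API call (the response string below is invented for illustration; a real call goes through TextClassifier):

```python
import json

# A response the model might return when asked to fill the Classification
# schema (hypothetical output, not from a real API call)
raw = '{"intention": "greeting", "sentiment": "happy", "aggressiveness": 1, "language": "english"}'

data = json.loads(raw)

# The enum constraints from the Classification model, checked manually here;
# the library does this for you via Pydantic validation
assert data["sentiment"] in {"happy", "neutral", "sad", "excited"}
assert data["aggressiveness"] in {1, 2, 3, 4, 5}
assert data["language"] in {"spanish", "english", "french", "german", "italian"}

print(data["sentiment"])  # → happy
```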

Available Classes and Components

Core Classes

  • LangChainBot: Main conversational agent for task execution with tools
  • OpenAILanguageModel: Wrapper for OpenAI language models
  • TextClassifier: Text classification using structured output
  • Message: Message structure for conversation history
  • ResponseModel: Response structure from agent interactions

Tools

  • EmailTool: Tool for sending emails through the agent
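Custom tool integration is listed as a feature but not demonstrated above. A tool for a LangChain-style agent typically exposes a name, a description the model reads to decide when to invoke it, and a run method. The sketch below uses a plain class as a stand-in (a real tool would subclass `BaseTool` from `langchain_core.tools`; `WordCountTool` is a hypothetical example, not part of this library):

```python
class WordCountTool:
    """Hypothetical custom tool: counts the words in a piece of text."""

    # The agent's LLM uses these fields to decide when to call the tool
    name = "word_count"
    description = "Counts the number of words in the given text."

    def _run(self, text: str) -> str:
        # Tools return strings so the agent can feed the result back to the LLM
        return str(len(text.split()))
```

Assuming LangChainBot accepts standard LangChain tools, a real subclass of this shape would be passed in the `tools=[...]` list at construction time, alongside EmailTool.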

Project Structure

your_project/
├── .env                    # Environment variables
├── src/
│   └── sonika_langchain_bot/
│       ├── langchain_bot_agent.py
│       ├── langchain_clasificator.py
│       ├── langchain_class.py
│       ├── langchain_models.py
│       └── langchain_tools.py
└── tests/
    └── test_bot.py

Contributing

Contributions are welcome. Please open an issue to discuss major changes you'd like to make.

License

This project is licensed under the MIT License.
