LangChain agent with LLM

This project has been archived.

The maintainers of this project have marked this project as archived. No new releases are expected.

Project description

Sonika LangChain Bot

A Python library that implements a conversational bot using LangChain, with BDI (Belief-Desire-Intention) capabilities and text classification.

Installation

pip install sonika-langchain-bot

Prerequisites

You will need the following API keys:

  • OpenAI API Key
  • Tavily API Key (used by the search tool in the BDI example below)

Create a .env file in the root of your project with the following variables:

OPENAI_API_KEY=your_api_key_here
TAVILY_API_KEY=your_api_key_here
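To make these variables available to your Python process, the usual choice is python-dotenv's load_dotenv(). The minimal sketch below shows the equivalent behavior using only the standard library; the parsing rules here are simplified assumptions, not the library's exact semantics.

```python
import os

def load_env(path=".env"):
    """Minimal .env loader: export KEY=value lines into os.environ.
    (python-dotenv's load_dotenv() is the usual, more robust choice.)"""
    if not os.path.exists(path):
        return  # missing file is a no-op, like load_dotenv()
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Skip blanks and comments; keep only KEY=value lines
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                os.environ.setdefault(key.strip(), value.strip())

load_env()
api_key = os.getenv("OPENAI_API_KEY")
```

Using `setdefault` means variables already present in the environment win over the file, which matches load_dotenv()'s default behavior.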

Main features

  • Conversational bot with a BDI architecture
  • Text classification
  • Execution of custom code via tools
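As an illustration of the tools feature: a custom tool typically starts as a plain Python function with a docstring, which a framework like LangChain then wraps (e.g. with a tool decorator) so the model can call it. The `word_count` function below is a hypothetical example, not part of this library.

```python
# Hypothetical custom tool body: a plain, typed function with a
# docstring. Frameworks such as LangChain wrap functions like this
# so the LLM can invoke them; the wrapped tool would then go into
# the `tools` list passed to the bot.
def word_count(text: str) -> int:
    """Return the number of whitespace-separated words in text."""
    return len(text.split())

print(word_count("custom tools run your own code"))  # 6
```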

Basic usage

BDI Bot Example

import os

from langchain_community.tools.tavily_search import TavilySearchResults
from langchain_openai import OpenAIEmbeddings

from sonika_langchain_bot.langchain_bdi import Belief, BeliefType
from sonika_langchain_bot.langchain_bot_agent_bdi import LangChainBot
from sonika_langchain_bot.langchain_models import OpenAILanguageModel

# Read the API keys from the environment (see the .env file above)
api_key = os.environ["OPENAI_API_KEY"]
api_key_tavily = os.environ["TAVILY_API_KEY"]

# Initialize the language model
language_model = OpenAILanguageModel(api_key, model_name='gpt-4o-mini-2024-07-18', temperature=1)
embeddings = OpenAIEmbeddings(api_key=api_key)

# Configure your own or third-party tools
search = TavilySearchResults(max_results=2, api_key=api_key_tavily)
tools = [search]

# Configure beliefs
beliefs = [
    Belief(
        content="You are a chat assistant",
        type=BeliefType.PERSONALITY,
        confidence=1,
        source='personality'
    )
]

# Create the bot instance
bot = LangChainBot(language_model, embeddings, beliefs=beliefs, tools=tools)

# Get a response
response = bot.get_response("Hello, what's your name?")

Text Classification Example

from sonika_langchain_bot.langchain_clasificator import OpenAIModel, TextClassifier
from pydantic import BaseModel, Field

# Define the classification schema
class Classification(BaseModel):
    intention: str = Field()
    sentiment: str = Field(..., enum=["feliz", "neutral", "triste", "excitado"])
    aggressiveness: int = Field(
        ...,
        description="describes how aggressive the statement is",
        enum=[1, 2, 3, 4, 5],
    )
    language: str = Field(
        ..., enum=["español", "ingles", "frances", "aleman", "italiano"]
    )

# Initialize the classifier
model = OpenAIModel(api_key=api_key, validation_class=Classification)
classifier = TextClassifier(api_key=api_key, llm=model, validation_class=Classification)

# Classify text
result = classifier.classify("Your text here")
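The classifier's result is validated against the schema above, so fields such as sentiment and aggressiveness come back constrained to the declared values. The stdlib sketch below mirrors those constraints for illustration only; `ClassificationSketch` and the sample values are hypothetical, not the library's API.

```python
from dataclasses import dataclass

# Illustrative stand-in for the pydantic Classification model above;
# __post_init__ enforces the same enum/range constraints by hand.
ALLOWED_SENTIMENTS = {"feliz", "neutral", "triste", "excitado"}

@dataclass
class ClassificationSketch:
    intention: str
    sentiment: str
    aggressiveness: int  # 1 (calm) to 5 (very aggressive)
    language: str

    def __post_init__(self):
        if self.sentiment not in ALLOWED_SENTIMENTS:
            raise ValueError(f"unexpected sentiment: {self.sentiment!r}")
        if self.aggressiveness not in {1, 2, 3, 4, 5}:
            raise ValueError("aggressiveness must be between 1 and 5")

# A result shaped like what the classifier returns (hypothetical values):
result = ClassificationSketch("saludo", "feliz", 1, "español")
print(result.sentiment)  # feliz
```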

Contributing

Contributions are welcome. Please open an issue to discuss any major changes you would like to make.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

sonika_langchain_bot-0.0.6.tar.gz (11.8 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

sonika_langchain_bot-0.0.6-py3-none-any.whl (12.2 kB)

Uploaded Python 3

File details

Details for the file sonika_langchain_bot-0.0.6.tar.gz.

File metadata

  • Download URL: sonika_langchain_bot-0.0.6.tar.gz
  • Upload date:
  • Size: 11.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.2

File hashes

Hashes for sonika_langchain_bot-0.0.6.tar.gz
Algorithm Hash digest
SHA256 f2a32ef0751f999b6e20f0db75302231a47df67ee1a437065e7d136120ab8f10
MD5 3a9230de6b6e0e0212b6743530fc4ce0
BLAKE2b-256 957b5e887ee6734174477fb9ec0e2c66e0d6a78da31a8f8519c0f1bd7468ae5f

See more details on using hashes here.

File details

Details for the file sonika_langchain_bot-0.0.6-py3-none-any.whl.

File metadata

File hashes

Hashes for sonika_langchain_bot-0.0.6-py3-none-any.whl
Algorithm Hash digest
SHA256 5c9f48d93679bdba44a4bc4012df4d4bf0341097dadf429859f588ec1001f35a
MD5 cb46262598cc2e44a91b2bbc820b9af1
BLAKE2b-256 1360c4d69460614328828af62a22f5771caaad87ffe73c28a3dfb1bcdd415e07

See more details on using hashes here.
