
langchain-maritaca


An integration package connecting Maritaca AI and LangChain for Brazilian Portuguese language models.

Author: Anderson Henrique da Silva
Location: Minas Gerais, Brazil
GitHub: anderson-ufrj

Overview

Maritaca AI provides state-of-the-art Brazilian Portuguese language models, including the Sabiá family of models. This integration allows you to use Maritaca's models seamlessly within the LangChain ecosystem.

Available Models

| Model | Description | Pricing (per 1M tokens) |
| --- | --- | --- |
| sabia-3.1.1 | Most capable model, best for complex tasks | Check Maritaca AI for pricing |
| sabiazinho-3.1 | Fast and economical, great for simple tasks | Check Maritaca AI for pricing |

Installation

pip install langchain-maritaca

Setup

Set your Maritaca API key as an environment variable:

export MARITACA_API_KEY="your-api-key"

Or pass it directly to the model:

from langchain_maritaca import ChatMaritaca

model = ChatMaritaca(api_key="your-api-key")
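In a script, the key can also be set programmatically before the model is constructed. A minimal sketch using only the standard library; `setdefault` leaves any key already exported in your shell untouched:

```python
import os

# Set the key only if it is not already present in the environment.
os.environ.setdefault("MARITACA_API_KEY", "your-api-key")

# ChatMaritaca() will read MARITACA_API_KEY when api_key is not passed explicitly.
```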

Usage

Basic Usage

from langchain_maritaca import ChatMaritaca

model = ChatMaritaca(
    model="sabia-3.1",
    temperature=0.7,
)

messages = [
    ("system", "Você é um assistente prestativo especializado em cultura brasileira."),
    ("human", "Quais são as principais festas populares do Brasil?"),
]

response = model.invoke(messages)
print(response.content)

Streaming

from langchain_maritaca import ChatMaritaca

model = ChatMaritaca(model="sabia-3.1", streaming=True)

for chunk in model.stream("Conte uma história sobre o folclore brasileiro"):
    print(chunk.content, end="", flush=True)

Async Usage

import asyncio
from langchain_maritaca import ChatMaritaca

async def main():
    model = ChatMaritaca(model="sabia-3.1")
    response = await model.ainvoke("Qual é a receita de pão de queijo?")
    print(response.content)

asyncio.run(main())

With LangChain Expression Language (LCEL)

from langchain_maritaca import ChatMaritaca
from langchain_core.prompts import ChatPromptTemplate

model = ChatMaritaca(model="sabia-3.1")

prompt = ChatPromptTemplate.from_messages([
    ("system", "Você é um especialista em {topic}."),
    ("human", "{question}"),
])

chain = prompt | model

response = chain.invoke({
    "topic": "história do Brasil",
    "question": "Quem foi Tiradentes?"
})
print(response.content)
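Under the hood, the prompt half of the chain simply substitutes the template variables into the message list before the model is called. A plain-Python sketch of that formatting step (`format_messages` is a hypothetical helper for illustration, not the langchain_core API):

```python
# Illustration only: mimics what ChatPromptTemplate does for this chain.
def format_messages(topic: str, question: str) -> list[tuple[str, str]]:
    """Fill the {topic} and {question} slots and return (role, content) pairs."""
    return [
        ("system", f"Você é um especialista em {topic}."),
        ("human", question),
    ]

messages = format_messages("história do Brasil", "Quem foi Tiradentes?")
print(messages[0])  # the system message with the topic filled in
```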

With Tool Calling (Function Calling)

from langchain_maritaca import ChatMaritaca
from langchain_core.tools import tool

@tool
def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return f"O clima em {city} está ensolarado, 25°C"

model = ChatMaritaca(model="sabia-3.1")
model_with_tools = model.bind_tools([get_weather])

response = model_with_tools.invoke("Como está o tempo em São Paulo?")
print(response)
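Note that the response above contains the model's tool calls; it does not execute them. A minimal dispatch loop is sketched below in plain Python, assuming the standard LangChain tool-call shape (a dict with `name`, `args`, and `id` keys, as exposed by `AIMessage.tool_calls`):

```python
# Sketch: execute the tool calls a model returned and collect the results.
def dispatch_tool_calls(tool_calls, tools_by_name):
    """Run each requested tool; pair every result with its tool_call_id."""
    results = []
    for call in tool_calls:
        fn = tools_by_name[call["name"]]
        results.append({"tool_call_id": call["id"], "content": fn(**call["args"])})
    return results

# Plain-function stand-in for the @tool-decorated get_weather above.
def get_weather(city: str) -> str:
    return f"O clima em {city} está ensolarado, 25°C"

calls = [{"name": "get_weather", "args": {"city": "São Paulo"}, "id": "call_1"}]
print(dispatch_tool_calls(calls, {"get_weather": get_weather}))
# In a real agent loop, each result would be sent back to the model as a ToolMessage.
```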

Why Maritaca AI?

Maritaca AI models are specifically trained for Brazilian Portuguese, offering:

  • Native Portuguese Understanding: Better comprehension of Brazilian idioms, expressions, and cultural context
  • Local Data Training: Trained on diverse Brazilian Portuguese data sources
  • Cost-Effective: Competitive pricing for Portuguese language tasks
  • Low Latency: Servers located in Brazil for faster response times

Used in Production

Cidadão.AI - Brazilian government transparency platform powered by AI agents, handling 331K+ requests/month.

Using this package in production? Open an issue to get featured!

API Reference

ChatMaritaca

Main class for interacting with Maritaca AI models.

Parameters:

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| model | str | "sabia-3.1" | Model name to use |
| temperature | float | 0.7 | Sampling temperature (0.0-2.0) |
| max_tokens | int | None | Maximum tokens to generate |
| top_p | float | 0.9 | Top-p sampling parameter |
| api_key | str | None | Maritaca API key (or use env var) |
| base_url | str | "https://chat.maritaca.ai/api" | API base URL |
| timeout | float | 60.0 | Request timeout in seconds |
| max_retries | int | 2 | Maximum retry attempts |
| streaming | bool | False | Enable streaming responses |
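Putting several of these parameters together. The values below follow the documented defaults, except `temperature` and `max_tokens`, which are chosen for illustration:

```python
# Keyword arguments mirroring the parameter table above.
maritaca_kwargs = {
    "model": "sabia-3.1",
    "temperature": 0.3,   # lower than the 0.7 default for more deterministic output
    "max_tokens": 1024,   # cap generation length (default is None, i.e. no cap)
    "top_p": 0.9,
    "timeout": 60.0,
    "max_retries": 2,
    "streaming": False,
}

# model = ChatMaritaca(**maritaca_kwargs)  # requires langchain-maritaca installed
```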

Development

Setup

# Clone the repository
git clone https://github.com/anderson-ufrj/langchain-maritaca.git
cd langchain-maritaca

# Install dependencies
pip install -e ".[dev]"

# Run tests
pytest

# Run linting
ruff check .
ruff format .

# Run type checking
mypy langchain_maritaca

Running Tests

# Unit tests only
pytest tests/unit_tests/

# Integration tests (requires MARITACA_API_KEY)
pytest tests/integration_tests/

# With coverage
pytest --cov=langchain_maritaca --cov-report=html

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'feat: add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

Changelog

See CHANGELOG.md for a list of changes.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Related Projects

  • LangChain - Building applications with LLMs through composability
  • Maritaca AI - Brazilian Portuguese language models

Project details


Download files

Download the file for your platform.

Source Distribution

langchain_maritaca-0.2.0.tar.gz (20.7 kB)

Uploaded Source

Built Distribution


langchain_maritaca-0.2.0-py3-none-any.whl (11.5 kB)

Uploaded Python 3

File details

Details for the file langchain_maritaca-0.2.0.tar.gz.

File metadata

  • Download URL: langchain_maritaca-0.2.0.tar.gz
  • Upload date:
  • Size: 20.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for langchain_maritaca-0.2.0.tar.gz
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | a8047e0135fac0264655f239aee2c3c3d86ca1ba7cb360204daf3c688f66f2b2 |
| MD5 | 643bfd9781f5d931765509dac261d127 |
| BLAKE2b-256 | 24d40281a0216f46b3b39abdcee1c5c3d34d15920bf663c3a73a753503c9336d |

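To verify a downloaded file against the digests above, a small standard-library helper is enough. The file path is wherever you saved the sdist; the expected digest is copied from the SHA256 row:

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 8192) -> str:
    """Stream the file through SHA-256 and return the hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

EXPECTED = "a8047e0135fac0264655f239aee2c3c3d86ca1ba7cb360204daf3c688f66f2b2"
# assert sha256_of("langchain_maritaca-0.2.0.tar.gz") == EXPECTED
```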

Provenance

The following attestation bundles were made for langchain_maritaca-0.2.0.tar.gz:

Publisher: publish.yml on anderson-ufrj/langchain-maritaca

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file langchain_maritaca-0.2.0-py3-none-any.whl.

File metadata

File hashes

Hashes for langchain_maritaca-0.2.0-py3-none-any.whl
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 48555014aac91773a30646ba82af0b48456a7c861da81427b2ba4ca4a846a8ff |
| MD5 | 2994e4c5d22ea1452ce35dd86b3d5a1b |
| BLAKE2b-256 | 4b48a0b6ef64f82a355fed923fc8d19552abd525e06fc400e8e03fc67719c145 |


Provenance

The following attestation bundles were made for langchain_maritaca-0.2.0-py3-none-any.whl:

Publisher: publish.yml on anderson-ufrj/langchain-maritaca

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
