
langchain-maritaca


🇧🇷 Read this in Portuguese

An integration package connecting Maritaca AI and LangChain for Brazilian Portuguese language models.

Author: Anderson Henrique da Silva
Location: Minas Gerais, Brazil
GitHub: anderson-ufrj

Overview

Maritaca AI provides state-of-the-art Brazilian Portuguese language models, including the Sabiá family of models. This integration allows you to use Maritaca's models seamlessly within the LangChain ecosystem.

Available Models

| Model | Description | Pricing (per 1M tokens) |
|---|---|---|
| sabia-3.1.1 | Most capable model, best for complex tasks | Check Maritaca AI for pricing |
| sabiazinho-3.1 | Fast and economical, great for simple tasks | Check Maritaca AI for pricing |

Installation

pip install langchain-maritaca

Setup

Set your Maritaca API key as an environment variable:

export MARITACA_API_KEY="your-api-key"

Or pass it directly to the model:

from langchain_maritaca import ChatMaritaca

model = ChatMaritaca(api_key="your-api-key")

Usage

Basic Usage

from langchain_maritaca import ChatMaritaca

model = ChatMaritaca(
    model="sabia-3.1",
    temperature=0.7,
)

messages = [
    ("system", "Você é um assistente prestativo especializado em cultura brasileira."),
    ("human", "Quais são as principais festas populares do Brasil?"),
]

response = model.invoke(messages)
print(response.content)

Streaming

from langchain_maritaca import ChatMaritaca

model = ChatMaritaca(model="sabia-3.1", streaming=True)

for chunk in model.stream("Conte uma história sobre o folclore brasileiro"):
    print(chunk.content, end="", flush=True)

Async Usage

import asyncio
from langchain_maritaca import ChatMaritaca

async def main():
    model = ChatMaritaca(model="sabia-3.1")
    response = await model.ainvoke("Qual é a receita de pão de queijo?")
    print(response.content)

asyncio.run(main())

With LangChain Expression Language (LCEL)

from langchain_maritaca import ChatMaritaca
from langchain_core.prompts import ChatPromptTemplate

model = ChatMaritaca(model="sabia-3.1")

prompt = ChatPromptTemplate.from_messages([
    ("system", "Você é um especialista em {topic}."),
    ("human", "{question}"),
])

chain = prompt | model

response = chain.invoke({
    "topic": "história do Brasil",
    "question": "Quem foi Tiradentes?"
})
print(response.content)

With Tool Calling (Function Calling)

from langchain_maritaca import ChatMaritaca
from langchain_core.tools import tool

@tool
def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return f"O clima em {city} está ensolarado, 25°C"

model = ChatMaritaca(model="sabia-3.1")
model_with_tools = model.bind_tools([get_weather])

response = model_with_tools.invoke("Como está o tempo em São Paulo?")
print(response)
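In a real application you would typically inspect the model's requested tool calls, execute the matching local function, and send the result back. The dispatch step can be sketched in plain Python; the `tool_calls` shape below mirrors LangChain's `AIMessage.tool_calls` (a list of dicts with `name`, `args`, and `id`), but this standalone sketch makes no API call and the payload is fabricated for illustration.

```python
# Standalone sketch of dispatching a model's tool calls to local functions.
# The tool_calls structure imitates LangChain's AIMessage.tool_calls;
# no request to Maritaca AI is made here.

def get_weather(city: str) -> str:
    return f"O clima em {city} está ensolarado, 25°C"

TOOLS = {"get_weather": get_weather}

def run_tool_calls(tool_calls):
    results = []
    for call in tool_calls:
        func = TOOLS[call["name"]]            # look up the requested tool by name
        results.append(func(**call["args"]))  # invoke it with model-supplied args
    return results

# Example payload shaped like one the model might return:
fake_calls = [{"name": "get_weather", "args": {"city": "São Paulo"}, "id": "call_1"}]
print(run_tool_calls(fake_calls))  # ['O clima em São Paulo está ensolarado, 25°C']
```

In a full loop you would append each result as a tool message and call the model again so it can produce a final natural-language answer.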

Why Maritaca AI?

Maritaca AI models are specifically trained for Brazilian Portuguese, offering:

  • Native Portuguese Understanding: Better comprehension of Brazilian idioms, expressions, and cultural context
  • Local Data Training: Trained on diverse Brazilian Portuguese data sources
  • Cost-Effective: Competitive pricing for Portuguese language tasks
  • Low Latency: Servers located in Brazil for faster response times

Used in Production

Cidadão.AI - Brazilian government transparency platform powered by AI agents, handling 331K+ requests/month.

Using this package in production? Open an issue to get featured!

API Reference

ChatMaritaca

Main class for interacting with Maritaca AI models.

Parameters:

| Parameter | Type | Default | Description |
|---|---|---|---|
| model | str | "sabia-3.1" | Model name to use |
| temperature | float | 0.7 | Sampling temperature (0.0-2.0) |
| max_tokens | int | None | Maximum tokens to generate |
| top_p | float | 0.9 | Top-p sampling parameter |
| api_key | str | None | Maritaca API key (or use env var) |
| base_url | str | "https://chat.maritaca.ai/api" | API base URL |
| timeout | float | 60.0 | Request timeout in seconds |
| max_retries | int | 2 | Maximum retry attempts |
| retry_if_rate_limited | bool | True | Auto-retry on rate limit (HTTP 429) |
| retry_delay | float | 1.0 | Initial delay between retries (seconds) |
| retry_max_delay | float | 60.0 | Maximum delay between retries (seconds) |
| retry_multiplier | float | 2.0 | Multiplier for exponential backoff |
| streaming | bool | False | Enable streaming responses |
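The four retry parameters describe an exponential-backoff schedule: the wait starts at retry_delay, is multiplied by retry_multiplier after each attempt, and is capped at retry_max_delay. The sketch below computes that schedule in plain Python to make the interaction concrete; the parameter names come from the table above, but the actual implementation inside langchain-maritaca may differ in detail.

```python
# Hypothetical sketch of the backoff schedule implied by the retry_* parameters.
# This only computes the delays; it does not perform any HTTP requests.

def backoff_delays(max_retries: int = 2,
                   retry_delay: float = 1.0,
                   retry_multiplier: float = 2.0,
                   retry_max_delay: float = 60.0) -> list[float]:
    """Return the wait (in seconds) before each retry attempt."""
    return [
        min(retry_delay * retry_multiplier ** attempt, retry_max_delay)
        for attempt in range(max_retries)
    ]

print(backoff_delays())               # defaults: [1.0, 2.0]
print(backoff_delays(max_retries=8))  # grows 1, 2, 4, ... capped at 60.0
```

With the defaults (two retries, 1 s initial delay, multiplier 2), a rate-limited request waits 1 s before the first retry and 2 s before the second.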

Development

Setup

# Clone the repository
git clone https://github.com/anderson-ufrj/langchain-maritaca.git
cd langchain-maritaca

# Install dependencies
pip install -e ".[dev]"

# Run tests
pytest

# Run linting
ruff check .
ruff format .

# Run type checking
mypy langchain_maritaca

Running Tests

# Unit tests only
pytest tests/unit_tests/

# Integration tests (requires MARITACA_API_KEY)
pytest tests/integration_tests/

# With coverage
pytest --cov=langchain_maritaca --cov-report=html

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'feat: add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

Changelog

See CHANGELOG.md for a list of changes.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Related Projects

  • LangChain - Building applications with LLMs through composability
  • Maritaca AI - Brazilian Portuguese language models
