
DeepRead

Python library for intelligent, AI-powered PDF document extraction

PyPI Python 3.9+ License: MIT CI Quality Seal


Features

  • Token Authentication - HMAC-SHA256 with timing-safe validation
  • Intelligent Extraction - Extracts information from PDFs using LLMs (OpenAI / Azure OpenAI)
  • Automatic OCR - Detects and processes scanned documents (Azure AI Vision)
  • Structured Output - Typed responses with Pydantic
  • Async + Sync - Synchronous and asynchronous APIs with batch processing
  • Resilience - Retry with exponential backoff and circuit breaker
  • Cache - LRU cache with TTL to avoid reprocessing
  • Page Range - Filter specific pages by position (start/end)
  • Streaming - Lazy mode to save memory
  • Cost Tracking - Monitor tokens and cost per request
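
The timing-safe token validation mentioned above can be sketched with Python's standard hmac module. This is an illustrative pattern, not DeepRead's actual code: sign_token, validate_token, and the payload.signature token layout are hypothetical (only the dr_ prefix appears in the real tokens).

```python
import hashlib
import hmac

SECRET = b"server-side-secret"  # hypothetical shared secret

def sign_token(payload: str) -> str:
    # Token = "dr_" + payload + "." + HMAC-SHA256 over the payload
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"dr_{payload}.{sig}"

def validate_token(token: str) -> bool:
    if not token.startswith("dr_"):
        return False
    payload, _, sig = token[3:].rpartition(".")
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    # compare_digest runs in constant time, so response timing
    # does not leak how many signature bytes matched
    return hmac.compare_digest(sig, expected)

token = sign_token("client-42")
```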

Installation

pip install DeepRead.Monkai

With OCR (Azure AI Vision):

pip install "DeepRead.Monkai[ocr]"

For development:

pip install "DeepRead.Monkai[dev]"

Quick Start

1. Get an Access Token

Access tokens are issued by the Monkai team. To request one, write to contato@monkai.com.br.

export DEEPREAD_API_TOKEN="dr_seu_token_fornecido_pela_monkai"
export OPENAI_API_KEY="sk-..."

2. Process a Document

import os
from deepread import DeepRead, Question, QuestionConfig
from pydantic import BaseModel, Field

class ExtractionResponse(BaseModel):
    valor: str = Field(description="Valor extraido")
    unidade: str = Field(default="", description="Unidade de medida")
    confianca: float = Field(default=1.0, ge=0, le=1)

question = Question(
    config=QuestionConfig(id="quantidade", name="Extracao de Quantidade"),
    system_prompt="Voce e um especialista em extracao de dados de documentos.",
    user_prompt="Analise o texto e extraia a quantidade mencionada.\n\nTexto:\n{texto}",
    keywords=["quantidade", "litros", "volume", "total"],
    response_model=ExtractionResponse
)

dr = DeepRead(
    api_token=os.getenv("DEEPREAD_API_TOKEN"),
    openai_api_key=os.getenv("OPENAI_API_KEY"),
    model="gpt-5.1",
    verbose=True
)

dr.add_question(question)
result = dr.process("documento.pdf")

print(f"Resposta: {result.get_answer('quantidade')}")
print(f"Tokens: {result.total_metrics.tokens}")
print(f"Custo: ${result.total_metrics.cost_usd:.4f}")
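
The cost figure printed above is just token counts multiplied by per-token prices. A back-of-the-envelope version of the idea (estimate_cost is hypothetical, and the prices are placeholders, not DeepRead's real pricing table):

```python
def estimate_cost(prompt_tokens: int, completion_tokens: int,
                  price_in_per_1m: float = 2.0,
                  price_out_per_1m: float = 8.0) -> float:
    """USD cost from token counts; prices here are illustrative placeholders."""
    return (prompt_tokens * price_in_per_1m
            + completion_tokens * price_out_per_1m) / 1_000_000

# e.g. 12k prompt tokens and 800 completion tokens
cost = estimate_cost(12_000, 800)
```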

3. Multiple Questions with Page Range

from deepread import PageRange

dr.add_questions([
    Question(
        config=QuestionConfig(id="preco", name="Preco"),
        user_prompt="Extraia o preco: {texto}",
        keywords=["preco", "valor", "R$"],
        page_range=PageRange(start=1, end=5, from_position="start")
    ),
    Question(
        config=QuestionConfig(id="conclusao", name="Conclusao"),
        user_prompt="Extraia a conclusao: {texto}",
        keywords=["conclusao", "resultado"],
        page_range=PageRange(start=1, end=3, from_position="end")
    ),
])

result = dr.process("documento.pdf")
for r in result.results:
    print(f"{r.question_name}: {r.answer}")

4. Document Classification

from deepread import Classification
from typing import Literal

class ClassificacaoDoc(BaseModel):
    classificacao: Literal["APROVADO", "REPROVADO", "REVISAR"]
    justificativa: str
    confianca: float = Field(ge=0, le=1)

classification = Classification(
    system_prompt="Voce e um classificador de documentos.",
    user_prompt="Baseado nos dados extraidos, classifique o documento:\n\n{dados}",
    response_model=ClassificacaoDoc
)

dr.set_classification(classification)
result = dr.process("documento.pdf", classify=True)
print(f"Classificacao: {result.classification}")

5. Batch Processing

from pathlib import Path

docs = list(Path("documentos/").glob("*.pdf"))
results = dr.process_batch(docs, classify=True, max_workers=4)

for r in results:
    print(f"{r.document.filename}: {r.get_answer('preco')}")

6. Async API

import asyncio

async def main():
    dr = DeepRead(
        api_token=os.getenv("DEEPREAD_API_TOKEN"),
        openai_api_key=os.getenv("OPENAI_API_KEY"),
    )
    dr.add_question(question)

    result = await dr.process_async("documento.pdf")
    print(result.get_answer("quantidade"))

    results = await dr.process_batch_async(docs, max_concurrency=5)

asyncio.run(main())
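
The max_concurrency cap used by process_batch_async is the classic asyncio.Semaphore pattern: only N coroutines hold the semaphore at once. A standalone sketch (bounded_gather and fake_process are illustrative stand-ins, not library APIs):

```python
import asyncio

async def bounded_gather(coro_fns, max_concurrency=5):
    """Run coroutine factories with at most max_concurrency in flight."""
    sem = asyncio.Semaphore(max_concurrency)

    async def _run(fn):
        async with sem:          # blocks once max_concurrency are active
            return await fn()

    # gather preserves input order regardless of completion order
    return await asyncio.gather(*(_run(fn) for fn in coro_fns))

async def fake_process(i):
    await asyncio.sleep(0)       # stand-in for a document/API call
    return i * 2

results = asyncio.run(bounded_gather(
    [lambda i=i: fake_process(i) for i in range(4)]))
```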

7. Cache and Resilience

dr = DeepRead(
    api_token=os.getenv("DEEPREAD_API_TOKEN"),
    openai_api_key=os.getenv("OPENAI_API_KEY"),
    enable_cache=True,
    cache_ttl=3600,
    max_retries=3,
    circuit_breaker=True,
    circuit_breaker_threshold=5,
    circuit_breaker_timeout=60,
    streaming=True,
)

result = dr.process("documento.pdf")
print(f"Cache stats: {dr.cache_stats}")
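
Conceptually, an LRU cache with TTL like the one enable_cache turns on can be built from an OrderedDict. This sketch is illustrative only; it is not the library's cache.py:

```python
import time
from collections import OrderedDict

class TTLLRUCache:
    """Minimal LRU cache with per-entry TTL (illustrative sketch)."""

    def __init__(self, maxsize=128, ttl=3600):
        self.maxsize, self.ttl = maxsize, ttl
        self._data = OrderedDict()  # key -> (expires_at, value)
        self.hits = self.misses = 0

    def get(self, key):
        entry = self._data.get(key)
        if entry is None or entry[0] < time.monotonic():
            self._data.pop(key, None)   # drop expired entry, if any
            self.misses += 1
            return None
        self._data.move_to_end(key)     # mark as most recently used
        self.hits += 1
        return entry[1]

    def put(self, key, value):
        self._data[key] = (time.monotonic() + self.ttl, value)
        self._data.move_to_end(key)
        if len(self._data) > self.maxsize:
            self._data.popitem(last=False)  # evict least recently used
```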

8. Multiple Input Types

# Local file path
result = dr.process("documento.pdf")

# Remote URL
result = dr.process("https://exemplo.com/doc.pdf")

# Raw bytes
with open("doc.pdf", "rb") as f:
    result = dr.process(f.read(), filename="doc.pdf")

# File-like object
import io
buffer = io.BytesIO(pdf_bytes)
result = dr.process(buffer, filename="doc.pdf")
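
Accepting paths, URLs, raw bytes, and file-like objects usually comes down to a small normalization step at the entry point. A sketch of the idea (normalize_input is hypothetical, and the URL-download branch is elided):

```python
import io
from pathlib import Path

def normalize_input(document) -> bytes:
    """Coerce path / bytes / file-like input into raw PDF bytes (sketch)."""
    if isinstance(document, (str, Path)):
        if str(document).startswith(("http://", "https://")):
            raise NotImplementedError("download step omitted in this sketch")
        return Path(document).read_bytes()
    if isinstance(document, bytes):
        return document
    if hasattr(document, "read"):
        return document.read()
    raise TypeError(f"unsupported input type: {type(document)!r}")

data = normalize_input(io.BytesIO(b"%PDF-1.7 ..."))
```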

CLI

A machine-friendly CLI for automation and use by AI agents. All output is JSON on stdout, errors go to stderr, and the exit code reflects success (0) or failure (≥1).

Installation

pip install "DeepRead.Monkai[cli]"

Commands

Command                Description
version                Show the installed version
models                 List available LLM models with pricing
schemas                List pre-built schemas and their fields
extract <pdf>          One-shot extraction from a single PDF
run <config> <target>  Run extraction with a YAML config (single or batch)
init <output>          Generate a YAML configuration file

Examples

# Version
deepread version

# List available models
deepread models

# List pre-built schemas
deepread schemas

# One-shot extraction with a pre-built schema
export DEEPREAD_API_TOKEN=dr_...
export OPENAI_API_KEY=sk-...
deepread extract documento.pdf --schema DadosContrato

# One-shot extraction with a custom prompt
deepread extract documento.pdf \
  --prompt "Extraia o valor total do documento: {texto}" \
  --keywords "valor,total,R$" \
  --pages 1-5

# Generate a YAML config for recurring use
deepread init config.yaml --schema DadosContrato

# Run with a config (single document)
deepread run config.yaml documento.pdf -o resultado.json

# Run on a batch (directory)
deepread run config.yaml ./documentos/ -o resultados.csv -f csv

Flags for extract

Flag            Env var             Description
--token         DEEPREAD_API_TOKEN  DeepRead token
--api-key       OPENAI_API_KEY      OpenAI API key
--schema / -s                       Name of a pre-built schema
--prompt / -p                       Custom prompt (must contain {texto})
--model / -m                        Model: fast/balanced/complete/economic, or a model name
--ocr                               OCR mode: off/auto/force
--keywords / -k                     Comma-separated keywords
--pages                             Page range (1-5 or last-3)
--format / -f                       Output format: json/csv/text
--verbose / -v                      Detailed logs on stderr
--quiet / -q                        Suppress stderr

YAML Configuration

To use Azure OpenAI or tune finer-grained parameters, run deepread init to generate a base config with an auth section that resolves ${ENV_VAR} placeholders automatically:

deepread:
  model: balanced
  ocr: off
  max_retries: 3
  enable_cache: false

auth:
  api_token: ${DEEPREAD_API_TOKEN}
  openai_api_key: ${OPENAI_API_KEY}
  # For Azure OpenAI:
  # provider: azure
  # azure_api_key: ${AZURE_API_KEY}
  # azure_endpoint: https://seu-recurso.openai.azure.com
  # azure_deployment: gpt-4o
  # azure_api_version: 2024-02-15-preview

questions:
  - id: extracao
    name: Extração
    prompt: "Extraia as informações principais: {texto}"
    schema: DadosContrato
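
The ${ENV_VAR} resolution in the auth section can be approximated with a one-line regex substitution. resolve_env below is an illustrative sketch of that mechanism, not the CLI's actual loader:

```python
import os
import re

_ENV_PATTERN = re.compile(r"\$\{([A-Z0-9_]+)\}")

def resolve_env(value: str) -> str:
    """Replace ${VAR} placeholders with environment values (sketch).

    Unset variables resolve to an empty string here; a real loader
    might instead raise a configuration error.
    """
    return _ENV_PATTERN.sub(lambda m: os.environ.get(m.group(1), ""), value)

os.environ["DEEPREAD_API_TOKEN"] = "dr_example"
resolved = resolve_env("${DEEPREAD_API_TOKEN}")
```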

Exit codes

Code  Meaning
0     Success
1     Generic failure
2     Configuration error (missing key/token)
3     File not found
4     Execution error (network, API, etc.)
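
Because the exit code is the machine-readable contract, wrappers can dispatch on it directly. This sketch uses a Python one-liner as a stand-in for the deepread command; run_and_classify and EXIT_MEANINGS are hypothetical helpers:

```python
import subprocess
import sys

EXIT_MEANINGS = {0: "success", 1: "generic failure",
                 2: "configuration error", 3: "file not found",
                 4: "execution error"}

def run_and_classify(cmd: list) -> str:
    """Run a CLI command and map its exit code to a human-readable status."""
    proc = subprocess.run(cmd, capture_output=True, text=True)
    return EXIT_MEANINGS.get(proc.returncode, "unknown failure")

# Stand-in for `deepread run ...`: a process that exits with code 3.
status = run_and_classify([sys.executable, "-c", "import sys; sys.exit(3)"])
```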

Azure OpenAI

Via environment variables:

export OPENAI_PROVIDER=azure
export AZURE_API_KEY="sua-chave-azure"
export AZURE_API_ENDPOINT="https://seu-recurso.openai.azure.com"
export AZURE_API_VERSION="2024-02-15-preview"
export AZURE_DEPLOYMENT_NAME="gpt-4o"

Or via constructor parameters:

dr = DeepRead(
    api_token=os.getenv("DEEPREAD_API_TOKEN"),
    provider="azure",
    azure_api_key="sua-chave-azure",
    azure_endpoint="https://seu-recurso.openai.azure.com",
    azure_deployment="gpt-4o",
)
Parameter         OpenAI              Azure OpenAI
provider          "openai" (default)  "azure"
openai_api_key    Required            Not used
azure_api_key     Not used            Required
azure_endpoint    Not used            Required
azure_deployment  Not used            Required
model             Model name          Ignored (the deployment is used)

Available Models

print(DeepRead.available_models())
# {
#     "fast": "gpt-4.1",
#     "balanced": "gpt-5.1",
#     "complete": "gpt-5-2025-08-07",
#     "economic": "gpt-5-mini-2025-08-07"
# }

API Reference

DeepRead

Method  Description
add_question(question)  Add a question
add_questions(questions)  Add multiple questions
remove_question(id)  Remove a question
clear_questions()  Remove all questions
set_classification(config)  Configure classification
process(document)  Process a document (sync)
process_async(document)  Process a document (async)
process_batch(documents, max_workers)  Process a batch (sync, via ThreadPool)
process_batch_async(documents, max_concurrency)  Process a batch (async, via Semaphore)
clear_cache()  Clear the cache
cache_stats  Cache hits/misses/size
available_models()  List available models
create_question(...)  Factory method for Question

DeepRead Constructor

Parameter  Type  Default  Description
api_token  str  -  Authentication token (required)
openai_api_key  str  env  OpenAI API key
model  str  gpt-5.1  LLM model
verbose  bool  False  Detailed logs
max_retries  int  3  Retries for transient errors
enable_cache  bool  False  Enable the LRU cache
cache_ttl  int  3600  Cache TTL in seconds
streaming  bool  False  Lazy mode (saves memory)
circuit_breaker  bool  False  Enable the circuit breaker
circuit_breaker_threshold  int  5  Failures before the circuit opens
circuit_breaker_timeout  int  60  Seconds until recovery
max_file_size_mb  float  50  File size limit
max_pages  int  500  Page limit
provider  str  openai  Provider: openai or azure
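
The max_retries behaviour described above is plain exponential backoff. A generic sketch with jitter follows; with_retries is illustrative, not the library's resilience.py, and the circuit-breaker half is omitted:

```python
import random
import time

def with_retries(fn, max_retries=3, base_delay=1.0):
    """Retry transient failures with exponential backoff plus jitter."""
    for attempt in range(max_retries + 1):
        try:
            return fn()
        except Exception:
            if attempt == max_retries:
                raise                    # out of retries: propagate
            # Delays grow 1x, 2x, 4x, ... of base_delay; the random
            # factor (jitter) avoids synchronized retry storms.
            time.sleep(base_delay * (2 ** attempt) * random.uniform(0, 0.1))

calls = {"n": 0}

def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TimeoutError("transient")
    return "ok"

result = with_retries(flaky)
```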

Question

Field  Type  Description
config  QuestionConfig  Basic configuration (id, name)
system_prompt  str  System prompt
user_prompt  str  Prompt template (use {texto})
keywords  list[str]  Keywords used to filter pages
page_range  PageRange  Page range (optional)
response_model  BaseModel  Pydantic model (optional)
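
Keyword-based page filtering reduces to a case-insensitive substring scan over each page's text. A sketch of the idea (filter_pages is hypothetical, not the library's implementation):

```python
def filter_pages(pages: list, keywords: list) -> list:
    """Return 1-indexed numbers of pages containing any keyword."""
    lowered = [k.lower() for k in keywords]
    return [i for i, text in enumerate(pages, start=1)
            if any(k in text.lower() for k in lowered)]

pages = ["Invoice total: R$ 120,00",
         "Terms and conditions",
         "Total volume: 30 litros"]
hits = filter_pages(pages, ["total", "volume"])
```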

PageRange

Field  Type  Description
start  int  First page (1-indexed)
end  int  Last page (None = through the end)
from_position  str  "start" or "end"
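
The from_position semantics can be made concrete with a small helper. resolve_page_range is an illustrative guess at the mapping (start=1 counted from the end meaning the last page), not the library's code:

```python
def resolve_page_range(total_pages, start, end=None, from_position="start"):
    """Map a 1-indexed PageRange onto concrete page numbers (sketch)."""
    if end is None:
        end = total_pages
    if from_position == "start":
        pages = range(start, min(end, total_pages) + 1)
    else:
        # Counted from the end: position 1 is the last page.
        pages = range(max(total_pages - end + 1, 1),
                      total_pages - start + 2)
    return list(pages)

# 10-page document: first 5 pages vs. last 3 pages
first = resolve_page_range(10, 1, 5, "start")
last = resolve_page_range(10, 1, 3, "end")
```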

ProcessingResult

Field  Type  Description
document  DocumentMetadata  Document metadata
results  list[Result]  Per-question results
classification  dict  Classification (if applicable)
total_metrics  ProcessingMetrics  Aggregate metrics

ProcessingMetrics

Field  Type  Description
time_seconds  float  Processing time
tokens  int  Total tokens
prompt_tokens  int  Prompt tokens
completion_tokens  int  Completion tokens
cost_usd  float  Cost in USD
model  str  Model used

Project Structure

deepread/
├── __init__.py          # Main exports
├── reader.py            # DeepRead class (sync + async)
├── config.py            # Models, pricing, configuration
├── utils.py             # PDF loading, filtering, metadata
├── ocr.py               # Azure AI Vision OCR
├── cache.py             # LRU cache with TTL
├── resilience.py        # Retry + circuit breaker
├── exceptions.py        # Custom exceptions
├── auth/
│   ├── __init__.py
│   ├── token.py         # HMAC-SHA256 token validation
│   └── exceptions.py    # Authentication exceptions
└── models/
    ├── __init__.py
    ├── question.py      # Question, QuestionConfig, PageRange
    ├── result.py        # Result, ProcessingResult, Metrics
    ├── classification.py # Classification
    └── schemas.py       # Example schemas (DadosContrato, etc.)

Documentation

Document  Description
Installation  Installation and configuration guide
Quick Start  Get started in 5 minutes
Authentication  Token system
Questions  Configuring questions and extraction
Classification  Document classification
OCR  Optical character recognition
Schemas  Data models and structures
API Reference  Complete API reference
Examples  Practical examples (01-07)
Certification  Quality certificate

Quality Certification

This project was audited and certified by the Claude AI Quality Seal.

Dimension  Score
Security  8.7/10
Usability  8.2/10
Scalability  7.8/10
Code Quality  8.0/10
Overall  8.18/10

Rating: PROFESSIONAL
Serial: DR-CQA-DE8364E7-116B6022-D40E375D-42BB4E3B

See the full certificate | See the HTML certificate


Support


Developed by Monkai
