Docling

Docling bundles PDF document conversion to JSON and Markdown in an easy, self-contained package.

Features

  • ⚡ Converts any PDF document to JSON or Markdown format, stable and lightning fast
  • 📑 Understands detailed page layout, reading order and recovers table structures
  • 📝 Extracts metadata from the document, such as title, authors, references and language
  • 🔍 Optionally applies OCR (use with scanned PDFs)

Setup

You need Python 3.11 and Poetry. Install Poetry by following its official installation instructions.

Once you have poetry installed, create an environment and install the package:

poetry env use $(which python3.11)
poetry shell
poetry install

Notes:

  • Works on macOS and Linux environments. Windows platforms are currently not tested.

Usage

For basic usage, see the convert.py example module. Run with:

python examples/convert.py

The output of the above command will be written to ./scratch.

Enable or disable pipeline features

You can control whether table structure recognition or OCR is performed via arguments passed to DocumentConverter:

doc_converter = DocumentConverter(
    artifacts_path=artifacts_path,
    pipeline_options=PipelineOptions(
        do_table_structure=False,  # controls whether table structure is recovered
        do_ocr=True,  # controls whether OCR is applied (ignores programmatic content)
    ),
)

Impose limits on the document size

You can limit the file size and the number of pages to be processed per document.

paths = [Path("./test/data/2206.01062.pdf")]

input = DocumentConversionInput.from_paths(
    paths, limits=DocumentLimits(max_num_pages=100, max_file_size=20971520)
)
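The `max_file_size` value above is expressed in bytes; the figure 20971520 is simply 20 MiB written out:

```python
# max_file_size is given in bytes; 20 MiB in binary units:
max_file_size = 20 * 1024 * 1024
print(max_file_size)  # 20971520
```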

Convert from binary PDF streams

You can convert PDFs from a binary stream instead of from the filesystem as follows:

from io import BytesIO

buf = BytesIO(your_binary_stream)
docs = [DocumentStream(filename="my_doc.pdf", stream=buf)]
input = DocumentConversionInput.from_streams(docs)
converted_docs = doc_converter.convert(input)

Limit resource usage

You can limit the number of CPU threads used by docling by setting the environment variable OMP_NUM_THREADS accordingly. The default is 4 CPU threads.
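As a minimal sketch, the same limit can also be applied from Python, provided the variable is set before docling is imported (the value 2 here is just an illustrative choice):

```python
import os

# Must be set before importing docling, so that its OpenMP-backed
# native libraries pick the thread count up at load time.
os.environ["OMP_NUM_THREADS"] = "2"
```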

Contributing

Please read Contributing to Docling for details.

References

If you use Docling in your projects, please consider citing the following:

@software{Docling,
  author = {Deep Search Team},
  month = {7},
  title = {{Docling}},
  url = {https://github.com/DS4SD/docling},
  version = {main},
  year = {2024}
}

License

The Docling codebase is released under the MIT license. For individual model usage, please refer to the model licenses found in the original packages.
