
A library for performing inference using trained models.

Project description

Open-Source Pre-Processing Tools for Unstructured Data

The unstructured-inference repo contains hosted model inference code for layout parsing models. These models are invoked via API as part of the partitioning bricks in the unstructured package.

Requires Python 3.12+.

Installation

Package

pip install unstructured-inference

Detectron2

Detectron2 is required for using models from the layoutparser model zoo but is not automatically installed with this package. For macOS and Linux, build from source with:

pip install 'git+https://github.com/facebookresearch/detectron2.git@57bdb21249d5418c130d54e2ebdc94dda7a4c01a'

Other install options can be found in the Detectron2 installation guide.

Windows is not officially supported by Detectron2, but some users are able to install it anyway. See discussion here for tips on installing Detectron2 on Windows.

Development Setup

This project uses uv for dependency management.

# Clone and install all dependencies (including dev/test/lint groups)
git clone https://github.com/Unstructured-IO/unstructured-inference.git
cd unstructured-inference
make install

Run make help for a full list of available targets.

Getting Started

To get started with the layout parsing model, run the following Python code:

from unstructured_inference.inference.layout import DocumentLayout

layout = DocumentLayout.from_file("sample-docs/loremipsum.pdf")

print(layout.pages[0].elements)

Once the model has detected the layout and OCR'd the document, the text extracted from the first page of the sample document will be displayed. You can convert a given element to a dict by running the .to_dict() method.
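
For example, a minimal sketch that parses the sample document and serializes every element on the first page:

from unstructured_inference.inference.layout import DocumentLayout

# Parse the sample document (layout detection plus OCR where needed)
layout = DocumentLayout.from_file("sample-docs/loremipsum.pdf")

for element in layout.pages[0].elements:
    # .to_dict() returns the element as a plain dictionary
    print(element.to_dict())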

Models

The inference pipeline operates by finding text elements in a document page using a detection model, then extracting the contents of the elements using direct extraction (if available), OCR, and optionally table inference models.

We offer several detection models, including Detectron2 and YOLOX.

Using a non-default model

When doing inference, an alternate model can be used by passing the model object to the ingestion method (e.g. DocumentLayout.from_file) via the detection_model parameter. The get_model function can be used to construct one of our out-of-the-box models from a keyword, e.g.:

from unstructured_inference.models.base import get_model
from unstructured_inference.inference.layout import DocumentLayout

model = get_model("yolox")
layout = DocumentLayout.from_file("sample-docs/layout-parser-paper.pdf", detection_model=model)

Using your own model

Any detection model can be used in the unstructured_inference pipeline by wrapping the model in the UnstructuredObjectDetectionModel class. To integrate with the DocumentLayout class, a subclass of UnstructuredObjectDetectionModel must have a predict method that accepts a PIL.Image.Image and returns a list of LayoutElements, and an initialize method, which loads the model and prepares it for inference.
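
As an illustrative sketch only (the import paths, the LayoutElement.from_coords constructor, and the load_my_detector helper are assumptions rather than documented API), a wrapper might look like this:

from typing import List

from PIL import Image

from unstructured_inference.inference.layout import DocumentLayout
from unstructured_inference.inference.layoutelement import LayoutElement
from unstructured_inference.models.unstructuredmodel import UnstructuredObjectDetectionModel


class MyDetectionModel(UnstructuredObjectDetectionModel):
    def initialize(self, weights_path: str):
        # Load the underlying framework-specific model; load_my_detector is a
        # hypothetical stand-in for your own loading code.
        self.model = load_my_detector(weights_path)

    def predict(self, x: Image.Image) -> List[LayoutElement]:
        # Run the wrapped model and translate each raw detection into a
        # LayoutElement; from_coords is an assumption and may differ between
        # versions of unstructured_inference.
        detections = self.model(x)
        return [
            LayoutElement.from_coords(d.x1, d.y1, d.x2, d.y2, text=None, type=d.label)
            for d in detections
        ]


model = MyDetectionModel()
model.initialize(weights_path="my-detector-weights.pt")  # hypothetical path
layout = DocumentLayout.from_file("sample-docs/layout-parser-paper.pdf", detection_model=model)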

Security Policy

See our security policy for information on how to report security vulnerabilities.

Learn more

Unstructured Community Github: Information about Unstructured.io community projects
Unstructured Github: Unstructured.io open source repositories
Company Website: Unstructured.io product and company info


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

unstructured_inference-1.5.3.tar.gz (46.7 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

unstructured_inference-1.5.3-py3-none-any.whl (53.5 kB)

Uploaded Python 3

File details

Details for the file unstructured_inference-1.5.3.tar.gz.

File metadata

  • Download URL: unstructured_inference-1.5.3.tar.gz
  • Upload date:
  • Size: 46.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for unstructured_inference-1.5.3.tar.gz

  • SHA256: 8c6dab402ece74558edff8b9c117c88fa275925b587db01cd468a2702ba06bc8
  • MD5: 8b5677daacae0c523dfb18069705df07
  • BLAKE2b-256: e3a017407743fe6e0b660eb97cdfdf122780562d911e47fdf11a70a1dc8bc84c

See more details on using hashes here.
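
For example, a quick sanity check of the source distribution's SHA256 digest with the Python standard library (the local file path is an assumption about where you saved the download):

import hashlib

# Published SHA256 digest for the sdist, copied from the table above
expected = "8c6dab402ece74558edff8b9c117c88fa275925b587db01cd468a2702ba06bc8"

# Path to the downloaded archive (adjust to wherever you saved it)
with open("unstructured_inference-1.5.3.tar.gz", "rb") as f:
    actual = hashlib.sha256(f.read()).hexdigest()

print("match" if actual == expected else "MISMATCH")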

Provenance

The following attestation bundles were made for unstructured_inference-1.5.3.tar.gz:

Publisher: release.yml on Unstructured-IO/unstructured-inference

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file unstructured_inference-1.5.3-py3-none-any.whl.

File metadata

File hashes

Hashes for unstructured_inference-1.5.3-py3-none-any.whl

  • SHA256: 91f22061790344877dbc53dbab3e4d2b593fa001f057610fb208736f45171f83
  • MD5: d3de921d82b4b5e1f2249d2ed193e1a6
  • BLAKE2b-256: 9fac9f8e4907a1182bc131a4f588ca2627094b32c8135ae4492cf9a1c25d4718

See more details on using hashes here.

Provenance

The following attestation bundles were made for unstructured_inference-1.5.3-py3-none-any.whl:

Publisher: release.yml on Unstructured-IO/unstructured-inference

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
