A library for performing inference using trained models.

Project description

Open-Source Pre-Processing Tools for Unstructured Data

The unstructured-inference repo contains hosted model inference code for layout parsing models. These models are invoked via API as part of the partitioning bricks in the unstructured package.

Installation

Package

Run pip install unstructured-inference.

Detectron2

Detectron2 is required for using models from the layoutparser model zoo but is not automatically installed with this package. For macOS and Linux, build it from source with:

pip install 'git+https://github.com/facebookresearch/detectron2.git@57bdb21249d5418c130d54e2ebdc94dda7a4c01a'

Other install options can be found in the Detectron2 installation guide.

Windows is not officially supported by Detectron2, but some users are able to install it anyway. See discussion here for tips on installing Detectron2 on Windows.
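After installing, a quick sanity check (not part of the official instructions) is to confirm that the package imports cleanly:

import detectron2
print(detectron2.__version__)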

Repository

To install the repository for development, clone the repo and run make install to install dependencies. Run make help for a full list of install options.
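For example (a sketch, assuming the repository lives at github.com/Unstructured-IO/unstructured-inference):

git clone https://github.com/Unstructured-IO/unstructured-inference.git
cd unstructured-inference
make install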

Getting Started

To get started with the layout parsing model, use the following commands:

from unstructured_inference.inference.layout import DocumentLayout

layout = DocumentLayout.from_file("sample-docs/loremipsum.pdf")
print(layout.pages[0].elements)

Once the model has detected the layout and OCR'd the document, this prints the text extracted from the first page of the sample document. You can convert a given element to a dict by calling its .to_dict() method.
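For example, continuing from the snippet above (assuming the sample page has at least one detected element; the exact keys in the returned dictionary depend on the installed version):

first_element = layout.pages[0].elements[0]
print(first_element.to_dict())  # typically includes the element's coordinates, text, and type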

Models

The inference pipeline operates by finding text elements in a document page using a detection model, then extracting the contents of the elements using direct extraction (if available), OCR, and optionally table inference models.

We offer several detection models including Detectron2 and YOLOX.

Using a non-default model

When doing inference, an alternate model can be used by passing the model object to the ingestion method via the detection_model parameter. The get_model function can be used to construct one of our out-of-the-box models from a keyword, e.g.:

from unstructured_inference.models.base import get_model
from unstructured_inference.inference.layout import DocumentLayout

model = get_model("yolox")
layout = DocumentLayout.from_file("sample-docs/layout-parser-paper.pdf", detection_model=model)

Using your own model

Any detection model can be used in the unstructured_inference pipeline by wrapping the model in the UnstructuredObjectDetectionModel class. To integrate with the DocumentLayout class, a subclass of UnstructuredObjectDetectionModel must have a predict method that accepts a PIL.Image.Image and returns a list of LayoutElements, and an initialize method, which loads the model and prepares it for inference.
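Below is a minimal sketch of such a wrapper. It assumes UnstructuredObjectDetectionModel can be imported from unstructured_inference.models.unstructuredmodel and LayoutElement from unstructured_inference.inference.layoutelement (module paths may vary across versions); load_my_detector and to_layout_element are hypothetical stand-ins for your own loading and conversion code:

from typing import List

from PIL import Image

from unstructured_inference.inference.layoutelement import LayoutElement
from unstructured_inference.models.unstructuredmodel import UnstructuredObjectDetectionModel

class MyDetectionModel(UnstructuredObjectDetectionModel):
    def initialize(self, weights_path: str):
        # Load the underlying detector; load_my_detector is a hypothetical
        # stand-in for your own framework-specific loading code.
        self.model = load_my_detector(weights_path)

    def predict(self, image: Image.Image) -> List[LayoutElement]:
        raw_detections = self.model(image)
        # Convert each raw detection (bounding box, label, score) into a
        # LayoutElement; to_layout_element is a hypothetical helper, since the
        # LayoutElement constructor differs between versions.
        return [to_layout_element(d) for d in raw_detections]

The wrapped model can then be passed to DocumentLayout.from_file in the same way as the built-in models:

model = MyDetectionModel()
model.initialize("path/to/weights")
layout = DocumentLayout.from_file("sample-docs/loremipsum.pdf", detection_model=model)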

Security Policy

See our security policy for information on how to report security vulnerabilities.

Learn more

Section                          Description
Unstructured Community Github    Information about Unstructured.io community projects
Unstructured Github              Unstructured.io open source repositories
Company Website                  Unstructured.io product and company info
