
A library for performing inference using trained models.


Open-Source Pre-Processing Tools for Unstructured Data

The unstructured-inference repo contains hosted model inference code for layout parsing models. These models are invoked via API as part of the partitioning bricks in the unstructured package.

Installation

Package

Run pip install unstructured-inference.

Detectron2

Detectron2 is required for using models from the layoutparser model zoo but is not automatically installed with this package. For macOS and Linux, build from source with:

pip install 'git+https://github.com/facebookresearch/detectron2.git@57bdb21249d5418c130d54e2ebdc94dda7a4c01a'

Other install options can be found in the Detectron2 installation guide.

Windows is not officially supported by Detectron2, but some users are able to install it anyway. See discussion here for tips on installing Detectron2 on Windows.

Repository

To install the repository for development, clone the repo and run make install to install dependencies. Run make help for a full list of install options.

Getting Started

To get started with the layout parsing model, use the following commands:

from unstructured_inference.inference.layout import DocumentLayout

layout = DocumentLayout.from_file("sample-docs/loremipsum.pdf")
print(layout.pages[0].elements)

Once the model has detected the layout and OCR'd the document, the text extracted from the first page of the sample document will be displayed. You can convert a given element to a dict by running the .to_dict() method.
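As a minimal sketch building on the snippet above (the keys in the resulting dict depend on the element type and are only illustrative here):

element = layout.pages[0].elements[0]
element_dict = element.to_dict()  # e.g. text and coordinate fields; exact keys vary by element
print(element_dict)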

Models

The inference pipeline operates by finding text elements in a document page using a detection model, then extracting the contents of the elements using direct extraction (if available), OCR, and optionally table inference models.

We offer several detection models, including Detectron2 and YOLOX.

Using a non-default model

When doing inference, an alternate model can be used by passing the model object to the ingestion method via the detection_model parameter. The get_model function can be used to construct one of our out-of-the-box models from a keyword, e.g.:

from unstructured_inference.models.base import get_model
from unstructured_inference.inference.layout import DocumentLayout

model = get_model("yolox")
layout = DocumentLayout.from_file("sample-docs/layout-parser-paper.pdf", detection_model=model)

Using models from the layoutparser model zoo

The UnstructuredDetectronModel class in unstructured_inference.models.detectron2 uses the faster_rcnn_R_50_FPN_3x model pretrained on DocLayNet, but by using different construction parameters, any model in the layoutparser model zoo can be used. UnstructuredDetectronModel is a light wrapper around the layoutparser Detectron2LayoutModel object, and accepts the same arguments. See layoutparser documentation for details.
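For example, a sketch of pointing the wrapper at a different model zoo model, assuming initialize forwards the same arguments as layoutparser's Detectron2LayoutModel (the config path, label map, and score threshold below are illustrative choices, not package defaults):

from unstructured_inference.models.detectron2 import UnstructuredDetectronModel
from unstructured_inference.inference.layout import DocumentLayout

model = UnstructuredDetectronModel()
# Illustrative model zoo config and PubLayNet label map; adjust for the model you choose.
model.initialize(
    config_path="lp://PubLayNet/faster_rcnn_R_50_FPN_3x/config",
    label_map={0: "Text", 1: "Title", 2: "List", 3: "Table", 4: "Figure"},
    extra_config=["MODEL.ROI_HEADS.SCORE_THRESH_TEST", 0.8],
)
layout = DocumentLayout.from_file("sample-docs/layout-parser-paper.pdf", detection_model=model)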

Using your own model

Any detection model can be used in the unstructured_inference pipeline by wrapping the model in the UnstructuredObjectDetectionModel class. To integrate with the DocumentLayout class, a subclass of UnstructuredObjectDetectionModel must have a predict method that accepts a PIL.Image.Image and returns a list of LayoutElements, and an initialize method, which loads the model and prepares it for inference. A sketch of such a wrapper is shown below.
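The following is only a sketch of the shape such a subclass might take; the import paths are assumptions, and load_my_model, detect, and to_layout_element are hypothetical placeholders for your own code, not part of this package's documented API:

from typing import List
from PIL import Image
from unstructured_inference.models.unstructuredmodel import UnstructuredObjectDetectionModel  # assumed path
from unstructured_inference.inference.layout import LayoutElement  # assumed path

class MyDetectionModel(UnstructuredObjectDetectionModel):
    def initialize(self, model_path: str):
        # Load weights and prepare the underlying model for inference (placeholder loader).
        self.model = load_my_model(model_path)

    def predict(self, image: Image.Image) -> List[LayoutElement]:
        # Run detection, then convert raw regions into LayoutElement objects (placeholder helpers).
        raw_regions = self.model.detect(image)
        return [to_layout_element(region) for region in raw_regions]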

Security Policy

See our security policy for information on how to report security vulnerabilities.

Learn more

Unstructured Community Github: Information about Unstructured.io community projects
Unstructured Github: Unstructured.io open source repositories
Company Website: Unstructured.io product and company info
