ETL with LLM operations.


DocETL: Powering Complex Document Processing Pipelines

Website (Includes Demo) | Documentation | Discord | NotebookLM Podcast (thanks Shabie from our Discord community!) | Paper (coming soon!)

DocETL Figure

DocETL is a tool for creating and executing data processing pipelines, especially suited for complex document processing tasks. It offers a low-code, declarative YAML interface to define LLM-powered operations on complex data.

When to Use DocETL

DocETL is the ideal choice when you're looking to maximize correctness and output quality for complex tasks over a collection of documents or unstructured datasets. You should consider using DocETL if:

  • You want to perform semantic processing on a collection of data
  • You have complex tasks that you want to represent via map-reduce (e.g., map over your documents, then group by the result of your map call & reduce)
  • You're unsure how to best express your task to maximize LLM accuracy
  • You're working with long documents that don't fit into a single prompt or are too lengthy for effective LLM reasoning
  • You have validation criteria and want tasks to automatically retry when the validation fails
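To make the map-reduce pattern above concrete, here is a hypothetical sketch of a DocETL pipeline: a map operation extracts a theme from each document, then a reduce operation groups by that theme and summarizes each group. The operation names, prompt text, and schema fields are illustrative, and the exact configuration keys may differ from the real schema — consult the documentation before adapting it.

```yaml
# Hypothetical DocETL pipeline sketch — field names are illustrative,
# not the authoritative schema.
default_model: gpt-4o-mini

datasets:
  transcripts:
    type: file
    path: transcripts.json

operations:
  - name: extract_theme        # map: one LLM call per document
    type: map
    prompt: |
      What is the main theme of this document? {{ input.text }}
    output:
      schema:
        theme: string
  - name: summarize_by_theme   # reduce: group by the mapped theme
    type: reduce
    reduce_key: theme
    prompt: |
      Summarize these documents, which share a common theme.
    output:
      schema:
        summary: string

pipeline:
  steps:
    - name: theme_analysis
      input: transcripts
      operations:
        - extract_theme
        - summarize_by_theme
  output:
    type: file
    path: summaries.json
```

The key idea is that the group-by is implicit: the reduce operation's `reduce_key` refers to a field produced by the preceding map, so the pipeline stays declarative end to end.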

Installation

DocETL is published on PyPI; see the documentation for full installation instructions:

pip install docetl

Prerequisites

Before installing DocETL, ensure you have Python 3.10 or later installed on your system. You can check your Python version by running:

python --version

Installation Steps (from Source)

  1. Clone the DocETL repository:

     git clone https://github.com/shreyashankar/docetl.git
     cd docetl

  2. Install Poetry (if not already installed):

     pip install poetry

  3. Install the project dependencies:

     poetry install

  4. Set up your OpenAI API key. Create a .env file in the project root and add your key:

     OPENAI_API_KEY=your_api_key_here

     Alternatively, you can set the OPENAI_API_KEY environment variable in your shell.

  5. Run the basic test suite to ensure everything is working (this costs less than $0.01 with OpenAI):

     make tests-basic

That's it! You've successfully installed DocETL and are ready to start processing documents.

For more detailed information on usage and configuration, please refer to our documentation.


Download files

Download the file for your platform.

  • Source distribution: docetl-0.1.5.tar.gz (110.2 kB)
  • Built distribution: docetl-0.1.5-py3-none-any.whl (127.5 kB, Python 3)

File details: docetl-0.1.5.tar.gz

  • Size: 110.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing: Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.6

Hashes for docetl-0.1.5.tar.gz:

  SHA256      11b4ba70a06294cf9034efe8c11e7fb499ab4a55677a10bc8d90c05496993040
  MD5         926bdd6614e09c995b2e13e31e151456
  BLAKE2b-256 697df18f9a2b6c24f5faeebff584f0db942f689c5c45f6dd63153a6a024d36df

File details: docetl-0.1.5-py3-none-any.whl

  • Size: 127.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing: Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.6

Hashes for docetl-0.1.5-py3-none-any.whl:

  SHA256      40cf004dbd97847fe52b68116ec8861b99c6ca57e1e96cebbc0fb6c8f539f05b
  MD5         84dc7e41d53889ae3f538d301b172aab
  BLAKE2b-256 2d8a783d73a260d8bf91e5249bfc49320e3ed1ca492747256aae40ad3bd3546a
