ETL with LLM operations.

DocETL: Powering Complex Document Processing Pipelines

Website (Includes Demo) | Documentation | Discord | NotebookLM Podcast (thanks Shabie from our Discord community!) | Paper (coming soon!)

DocETL Figure

DocETL is a tool for creating and executing data processing pipelines, especially suited for complex document processing tasks. It offers a low-code, declarative YAML interface to define LLM-powered operations on complex data.

When to Use DocETL

DocETL is the ideal choice when you're looking to maximize correctness and output quality for complex tasks over a collection of documents or unstructured datasets. You should consider using DocETL if:

  • You want to perform semantic processing on a collection of data
  • You have complex tasks that you want to represent via map-reduce (e.g., map over your documents, then group by the result of your map call & reduce)
  • You're unsure how to best express your task to maximize LLM accuracy
  • You're working with long documents that don't fit into a single prompt or are too lengthy for effective LLM reasoning
  • You have validation criteria and want tasks to retry automatically when validation fails
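
To make the map-reduce bullet concrete, here is a sketch of what such a pipeline could look like in DocETL's YAML interface. The dataset, operation names, prompts, and field values below are illustrative assumptions, not taken from the project; consult the documentation for the exact schema:

```yaml
# Hypothetical pipeline: map over reviews to extract a theme,
# then group by theme and reduce each group to a summary.
datasets:
  reviews:
    type: file
    path: reviews.json

operations:
  - name: extract_theme
    type: map
    prompt: |
      Identify the main theme of this review: {{ input.text }}
    output:
      schema:
        theme: string

  - name: summarize_by_theme
    type: reduce
    reduce_key: theme
    prompt: |
      Summarize the reviews sharing the theme "{{ inputs[0].theme }}".
    output:
      schema:
        summary: string

pipeline:
  steps:
    - name: analyze_reviews
      input: reviews
      operations:
        - extract_theme
        - summarize_by_theme
  output:
    type: file
    path: summaries.json
```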

Installation

See the documentation for installing from PyPI.

Prerequisites

Before installing DocETL, ensure you have Python 3.10 or later installed on your system. You can check your Python version by running:

python --version

Installation Steps (from Source)

  1. Clone the DocETL repository:

     git clone https://github.com/ucbepic/docetl.git
     cd docetl

  2. Install Poetry (if not already installed):

     pip install poetry

  3. Install the project dependencies:

     poetry install

  4. Set up your OpenAI API key:

     Create a .env file in the project root and add your OpenAI API key:

     OPENAI_API_KEY=your_api_key_here

     Alternatively, you can set the OPENAI_API_KEY environment variable in your shell.

  5. Run the basic test suite to ensure everything is working (this costs less than $0.01 with OpenAI):

     make tests-basic
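
If the test suite fails with an authentication error, a common culprit is the API key not being visible to the process. A minimal sanity check is sketched below; the helper name is ours, not part of DocETL, and note that a key supplied only via a .env file may appear in the environment only after it is loaded:

```python
import os

def has_openai_key(env=None):
    """Return True if a non-empty OPENAI_API_KEY is present in the given
    environment mapping (defaults to the current process environment)."""
    env = os.environ if env is None else env
    return bool(env.get("OPENAI_API_KEY"))

if __name__ == "__main__":
    print("OPENAI_API_KEY set:", has_openai_key())
```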

That's it! You've successfully installed DocETL and are ready to start processing documents.

For more detailed information on usage and configuration, please refer to our documentation.
