Website (Includes Demo) | Documentation | Discord | Paper (coming soon!)
DocETL is a tool for creating and executing data processing pipelines, especially suited for complex document processing tasks. It offers a low-code, declarative YAML interface to define LLM-powered operations on complex data.
DocETL is the ideal choice when you're looking to maximize correctness and output quality for complex tasks over a collection of documents or unstructured datasets. You should consider using DocETL if:
- You want to perform semantic processing on a collection of data
- You have complex tasks that you want to represent via map-reduce (e.g., map over your documents, then group by the mapped results and reduce)
- You're unsure how to best express your task to maximize LLM accuracy
- You're working with long documents that don't fit into a single prompt or are too lengthy for effective LLM reasoning
- You have validation criteria and want operations to retry automatically when validation fails
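To give a flavor of the declarative interface, the map-then-reduce pattern above might be expressed in YAML along these lines. This is a rough sketch: the operation names, prompts, and dataset paths are hypothetical, and the exact configuration schema is described in the documentation.

```yaml
# Hypothetical sketch: extract a theme from each document (map),
# then summarize the documents sharing each theme (reduce).
datasets:
  documents:
    type: file
    path: documents.json  # hypothetical input path

default_model: gpt-4o-mini

operations:
  - name: extract_theme
    type: map
    prompt: |
      What is the main theme of this document? {{ input.text }}
    output:
      schema:
        theme: string

  - name: summarize_by_theme
    type: reduce
    reduce_key: theme
    prompt: |
      Summarize these documents, which share a common theme.
    output:
      schema:
        summary: string

pipeline:
  steps:
    - name: theme_analysis
      input: documents
      operations:
        - extract_theme
        - summarize_by_theme
  output:
    type: file
    path: summaries.json  # hypothetical output path
```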
See the documentation for installing from PyPI.
Before installing DocETL, ensure you have Python 3.10 or later installed on your system. You can check your Python version by running:
```bash
python --version
```
- Clone the DocETL repository:
```bash
git clone https://github.com/shreyashankar/docetl.git
cd docetl
```
- Install Poetry (if not already installed):
```bash
pip install poetry
```
- Install the project dependencies:
```bash
poetry install
```
- Set up your OpenAI API key:
Create a .env file in the project root and add your OpenAI API key:
```bash
OPENAI_API_KEY=your_api_key_here
```
Alternatively, you can set the OPENAI_API_KEY environment variable in your shell.
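For example, in a POSIX shell you could export the key for the current session (the placeholder value below is illustrative; substitute your real key):

```shell
# Illustrative placeholder -- replace with your actual OpenAI API key.
export OPENAI_API_KEY=your_api_key_here
```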
- Run the basic test suite to ensure everything is working (this costs less than $0.01 with OpenAI):
```bash
make tests-basic
```
That's it! You've successfully installed DocETL and are ready to start processing documents.
For more detailed information on usage and configuration, please refer to our documentation.