Data Lake pipelines for Vector DB, RAG & AI. Ingest, process, embed, and run semantic search.
Project description
LakeFlow Backend
FastAPI backend and data pipelines for LakeFlow: ingest, staging, processing, embedding, and semantic search.
Overview
- API: FastAPI app (`lakeflow.main:app`) covering auth, search, embed, pipeline trigger, Qdrant proxy, and system endpoints.
- Data Lake: layered zones under `LAKEFLOW_DATA_BASE_PATH`: `000_inbox` → `100_raw` → `200_staging` → `300_processed` → `400_embeddings` → `500_catalog`.
- Vector store: Qdrant (default collection `lakeflow_chunks`). Embeddings via sentence-transformers (e.g. `all-MiniLM-L6-v2`).
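Since the embedding model is a plain sentence-transformers model, you can reproduce a vector outside the pipeline as a sanity check. A minimal sketch, assuming the standard sentence-transformers API and the default model named above; the sample text is made up:

```python
# Not LakeFlow code: stand-alone check of the default embedding model.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")
vector = model.encode("What does the quarterly report say about revenue?")
print(vector.shape)  # all-MiniLM-L6-v2 yields 384-dimensional vectors
```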
Requirements
- Python ≥ 3.10
- Qdrant (e.g. Docker: `docker compose up -d qdrant`)
- See `requirements.txt` for Python dependencies
Install & run
With Docker (from the LakeFlow repo root, where docker-compose.yml lives):

```bash
docker compose up --build
# API: http://localhost:8011
```
Local dev (from the repo root, change into lakeflow):

```bash
cd lakeflow
python3 -m venv .venv
source .venv/bin/activate   # Windows: .venv\Scripts\activate
pip install -r requirements.txt
pip install -e .
# Create/copy .env (repo root or lakeflow) with LAKEFLOW_DATA_BASE_PATH, QDRANT_HOST, etc.
python -m uvicorn lakeflow.main:app --reload --port 8011
```
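The exact variables depend on your deployment; below is a sketch of a .env using only names that appear in this README, with illustrative values:

```env
# Illustrative values only; variable names are the ones used in this README.
LAKEFLOW_DATA_BASE_PATH=/data/lakeflow   # root of the 000_inbox ... 500_catalog zones
QDRANT_HOST=localhost
LAKEFLOW_MODE=DEV                        # enables the Streamlit Pipeline Runner
```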
- If you get `bad interpreter` (the venv points to the wrong Python): remove `.venv`, run `python3 -m venv .venv` again, then `pip install -r requirements.txt` and `pip install -e .`
- If you get `Address already in use` (port 8011 is taken): free the port and restart the server: `lsof -ti :8011 | xargs kill -9`
- Swagger: http://localhost:8011/docs
- ReDoc: http://localhost:8011/redoc
- Embed API: docs/API_EMBED.md (`POST /search/embed`)
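As a smoke test of the embed endpoint, a request along these lines should return the fields listed in the API summary below. The port and the use of the requests library are assumptions; body and response keys (text, embedding, dim) follow this README, and you may need an Authorization header if your deployment protects the route:

```python
# Not LakeFlow code: a quick client-side smoke test for POST /search/embed.
import requests

resp = requests.post(
    "http://localhost:8011/search/embed",            # port from this README
    json={"text": "LakeFlow turns documents into searchable vectors."},
    timeout=30,
)
resp.raise_for_status()
payload = resp.json()
print(payload["dim"], len(payload["embedding"]))     # field names per this README
```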
Pipeline steps (CLI)
Run from the lakeflow directory (with venv activated and LAKEFLOW_DATA_BASE_PATH set in .env or environment).
| Step | Command | Output |
|---|---|---|
| 0 – Inbox → Raw | `python -m lakeflow.scripts.step0_inbox` | Hash, dedup, catalog |
| 1 – Staging | `python -m lakeflow.scripts.step1_raw` | `pdf_profile.json`, `validation.json` |
| 2 – Processed | `python -m lakeflow.scripts.step2_staging` | `clean_text.txt`, `chunks.json`, `tables.json` |
| 3 – Embeddings | `python -m lakeflow.scripts.step3_processed_files` | `embeddings.npy`, `chunks_meta.json` |
| 4 – Qdrant | `python -m lakeflow.scripts.step3_processed_qdrant` | Points in Qdrant |
Or use the Streamlit UI (Pipeline Runner) when LAKEFLOW_MODE=DEV.
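To run the whole chain unattended, a small driver can call the step modules in order. The module names come from the table above; the use of subprocess and sys.executable is just one way to wire it up, not a LakeFlow-provided CLI:

```python
# Not LakeFlow code: run the documented step modules in order, stopping on failure.
import subprocess
import sys

STEPS = [
    "lakeflow.scripts.step0_inbox",
    "lakeflow.scripts.step1_raw",
    "lakeflow.scripts.step2_staging",
    "lakeflow.scripts.step3_processed_files",
    "lakeflow.scripts.step3_processed_qdrant",
]

for module in STEPS:
    print(f"--> {module}")
    # Inherits the current environment, so LAKEFLOW_DATA_BASE_PATH etc. still apply.
    subprocess.run([sys.executable, "-m", module], check=True)
```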
Main APIs
- POST `/auth/login` – demo login (e.g. `admin`/`admin123`), returns a JWT.
- POST `/search/embed` – body `{"text": "..."}`; returns `vector`, `embedding`, `dim`.
- POST `/search/semantic` – body `{"query": "...", "top_k": 5, "qdrant_url": "...", "collection_name": "..."}`.
- POST `/search/qa` – RAG-style Q&A (semantic search + LLM). Optional.
- POST `/pipeline/run` – run a pipeline step (auth required).
- GET/POST `/qdrant/` – Qdrant collections and points (proxy).
Design notes
- Idempotent pipelines; deterministic UUIDs for Qdrant.
- SQLite without WAL (NAS-friendly).
- No full-file load for large files; streaming where applicable.
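On the deterministic-UUID note: a common way to get stable Qdrant point IDs is to derive them from stable chunk identity with uuid5, so re-ingesting the same file upserts the same points instead of duplicating them. The namespace and key below are illustrative; this README does not show the project's exact scheme:

```python
# Illustrative only: deterministic point IDs via uuid5; not the project's exact scheme.
import uuid

NAMESPACE = uuid.uuid5(uuid.NAMESPACE_URL, "lakeflow")  # illustrative namespace

def point_id(file_sha256: str, chunk_index: int) -> str:
    """Same (file, chunk) always maps to the same UUID, so re-runs upsert in place."""
    return str(uuid.uuid5(NAMESPACE, f"{file_sha256}:{chunk_index}"))

print(point_id("42d7de17f677...", 0))
```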
License
Same as the root repository.
Project details
Download files
Source Distribution
Built Distribution
File details
Details for the file lake_flow_pipeline-0.1.1.tar.gz.
File metadata
- Download URL: lake_flow_pipeline-0.1.1.tar.gz
- Upload date:
- Size: 44.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 42d7de17f6770ab52e1fe691f6153b1d35584a0843440f91c21951fba3cf3f6a |
| MD5 | 53435ddc960a8cf9736d33d72cca3ee5 |
| BLAKE2b-256 | 7702a760ac4a273c87a34b926787c32887d6450a56e5ca7e5facf5b7d33c21cd |
File details
Details for the file lake_flow_pipeline-0.1.1-py3-none-any.whl.
File metadata
- Download URL: lake_flow_pipeline-0.1.1-py3-none-any.whl
- Upload date:
- Size: 65.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 98bda1f7ffd986754e03866d898b1c1a2118734daed2f8f3c4b891059d66d3a6 |
| MD5 | 8bd247cad2d2105f72049b41b27199dd |
| BLAKE2b-256 | 96b0918fe1709c31b91c75034911af73cc844222eda4d8ae162dede385d20aad |