Modular Python tool for profiling files, analyzing directory structures, and inspecting image data
Fast, multi-backend file/directory profiling and data preparation.
pip install filoma
Installation • Documentation • Agentic Analysis • Interactive CLI • Quickstart • Cookbook • Roboflow Demo • Source Code
📖 New to Filoma? Check out the Cookbook for practical, copy-paste recipes for common tasks!
filoma helps you analyze directory trees, inspect file metadata, and prepare your data for exploration. It does this blazingly fast by using the best available backend (Rust, fd, or pure Python) ⚡🍃
Key Features
- 🚀 High-Performance Backends: Automatic selection of Rust, fd, or Python for the best performance.
- 📈 DataFrame Integration: Convert scan results to Polars (or pandas) DataFrames for powerful analysis.
- 📊 Rich Directory Analysis: Get detailed statistics on file counts, extensions, sizes, and more.
- 🔍 Smart File Search: Use regex and glob patterns to find files with FdFinder.
- 🖼️ File/Image Profiling: Extract metadata and statistics from various file formats.
- 🛡️ Dataset Integrity & Quality: Unified integrity checking for snapshots, manifests, and automated quality scans (corruption, duplicates, leakage, class balance). 📖 Data Integrity Guide →
- 🧠 Agentic Analysis: Natural language interface for file discovery, deduplication, and metadata inspection. 📖 Brain Guide →
- 🖥️ Interactive CLI: Beautiful terminal interface for filesystem exploration and DataFrame analysis. 📖 CLI Documentation →
- 🌐 MCP Server: Expose all 21 filesystem tools to any MCP-compatible AI assistant (Claude Desktop, Cline, Cursor, etc.). 📖 MCP Configuration →
🎯 Local AI in 10 seconds:
curl -sL https://raw.githubusercontent.com/kalfasyan/filoma/main/scripts/install.sh | sh

→ Use with Goose + Ollama for fully local filesystem analysis. Learn more →
⚡ Quick Start
filoma provides a unified API for filesystem analysis.
End-to-End Example: Folder → DataFrame → Insights
This is the core Filoma workflow in one place: scan a folder, build a rich dataframe, filter it, and extract quick insights.
import filoma as flm
dataset = "notebooks/Weeds-3"
# 1) Fast scan + high-level summary
analysis = flm.probe(dataset)
analysis.print_summary()
# 2) Build an enriched dataframe (paths, extension, sizes, ownership, timestamps, etc.)
df = flm.probe_to_df(dataset, enrich=True)
# 3) Narrow to image files and inspect distribution
images = df.filter_by_extension(["jpg", "png"])
print(images.extension_counts())
print(images.directory_counts().head(3))
# 4) Get the largest files quickly
largest = images.sort("size_bytes", descending=True).head(5)
print(largest.select(["path", "size_bytes"]))
This flow is typically the fastest way to move from raw folder structure to actionable dataset insight.
1. File & Image Profiling
Extract rich metadata and statistics from any file or image.
import filoma as flm
# Profile any file
info = flm.probe_file("README.md")
print(info)
📄 See Metadata Output
Filo(
path=PosixPath('README.md'),
size=12237,
mode_str='-rw-rw-r--',
owner='user',
modified=datetime.datetime(2025, 12, 30, 22, 45, 53),
is_file=True,
...
)
For images, probe_image automatically extracts shapes, types, and pixel statistics.
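To make "shapes and pixel statistics" concrete, here is a tiny pure-Python sketch of the same idea on an in-memory grayscale image. This is illustrative only; probe_image operates on real image files and returns richer metadata, and summarize_pixels is a made-up name, not filoma's API.

```python
import statistics

def summarize_pixels(pixels):
    """Shape and basic pixel statistics for a 2D grayscale image given as
    a list of rows. Illustrative sketch only; filoma's probe_image reads
    real image files and reports richer metadata."""
    height = len(pixels)
    width = len(pixels[0]) if height else 0
    flat = [value for row in pixels for value in row]
    return {
        "shape": (height, width),
        "min": min(flat),
        "max": max(flat),
        "mean": statistics.fmean(flat),
    }

print(summarize_pixels([[0, 64], [128, 255]]))  # {'shape': (2, 2), 'min': 0, 'max': 255, 'mean': 111.75}
```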
2. Directory Analysis
Scan entire directory trees in milliseconds. filoma automatically picks the fastest available backend (Rust → fd → Python).
# Analyze a directory
analysis = flm.probe('.')
# Print high-level summary
analysis.print_summary()
📂 See Directory Summary Table
Directory Analysis: /project (🦀 Rust (Parallel)) - 0.60s
┏━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━┓
┃ Metric ┃ Value ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━┩
│ Total Files │ 57,225 │
│ Total Folders │ 3,427 │
│ Total Size │ 2,084.90 MB │
│ Average Files per Folder │ 16.70 │
│ Maximum Depth │ 14 │
│ Empty Folders │ 103 │
│ Analysis Time │ 0.60s │
│ Processing Speed │ 102,114 items/sec │
└──────────────────────────┴──────────────────────┘
# Or get a detailed report with extensions and folder stats
analysis.print_report()
📊 See Detailed Directory Report
File Extensions
┏━━━━━━━━━━━━┳━━━━━━━━┳━━━━━━━━━━━━┓
┃ Extension ┃ Count ┃ Percentage ┃
┡━━━━━━━━━━━━╇━━━━━━━━╇━━━━━━━━━━━━┩
│ .py        │    240 │       0.4% │
│ .jpg       │  1,204 │       2.1% │
│ .json      │    431 │       0.8% │
│ .svg       │ 28,674 │      50.1% │
└────────────┴────────┴────────────┘
Common Folder Names
┏━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━┓
┃ Folder Name ┃ Occurrences ┃
┡━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━┩
│ src │ 1 │
│ tests │ 1 │
│ docs │ 1 │
│ notebooks │ 1 │
└───────────────┴─────────────┘
Empty Folders (showing 3 of 103)
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓
┃ Path ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩
│ /project/data/raw/empty_set_A │
│ /project/logs/old/unused │
│ /project/temp/scratch │
└────────────────────────────────────────────┘
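The Rust → fd → Python fallback can be pictured as a simple capability probe at import time. The sketch below is only an illustration of the pattern, not filoma's actual selection code, and the module name filoma_core is a placeholder:

```python
import importlib.util
import shutil

def pick_backend():
    """Return the fastest backend that appears to be available:
    a compiled Rust extension, the fd binary on PATH, or pure Python.
    Illustrative sketch; filoma's real logic may differ."""
    if importlib.util.find_spec("filoma_core") is not None:  # placeholder module name
        return "rust"
    if shutil.which("fd") is not None:
        return "fd"
    return "python"

print(pick_backend())
```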
3. DataFrame Analysis
Convert scan results to Polars DataFrames for advanced analysis.
# Scan and get an enriched filoma.DataFrame (Polars)
df = flm.probe_to_df('src', enrich=True)
# Perform operations
df.filter_by_extension([".py", ".rs"])
df.directory_counts()
📊 See Enriched DataFrame Output
filoma.DataFrame with 2 rows
shape: (2, 18)
┌───────────────────┬───────┬────────┬───────────────┬───┬─────────┬───────┬────────┬────────┐
│ path ┆ depth ┆ parent ┆ name ┆ … ┆ inode ┆ nlink ┆ sha256 ┆ xattrs │
│ --- ┆ --- ┆ --- ┆ --- ┆ ┆ --- ┆ --- ┆ --- ┆ --- │
│ str ┆ i64 ┆ str ┆ str ┆ ┆ i64 ┆ i64 ┆ str ┆ str │
╞═══════════════════╪═══════╪════════╪═══════════════╪═══╪═════════╪═══════╪════════╪════════╡
│ src/async_scan.rs ┆ 1 ┆ src ┆ async_scan.rs ┆ … ┆ 7601121 ┆ 1 ┆ null ┆ {} │
│ src/filoma ┆ 1 ┆ src ┆ filoma ┆ … ┆ 7603126 ┆ 8 ┆ null ┆ {} │
└───────────────────┴───────┴────────┴───────────────┴───┴─────────┴───────┴────────┴────────┘
✨ Enriched columns added: parent, name, stem, suffix, size_bytes, modified_time,
created_time, is_file, is_dir, owner, group, mode_str, inode, nlink, sha256, xattrs, depth
- Seamless Pandas Integration: Just use df.pandas for instant conversion.
- Lazy Loading: import filoma is cheap; heavy dependencies load only when needed.
4. Specialized DataFrame Operations
Filoma's DataFrame extends Polars with filesystem-specific operations for quick filtering and summarization.
# Filter by extensions
df.filter_by_extension([".py", ".rs"])
# Quick frequency analysis
df.extension_counts()
df.directory_counts()
🔍 See Operation Examples
filter_by_extension([".py", ".rs"])
shape: (3, 1)
┌─────────────────────┐
│ path │
│ --- │
│ str │
╞═════════════════════╡
│ src/async_scan.rs │
│ src/lib.rs │
│ src/filoma/dedup.py │
└─────────────────────┘
extension_counts() — groups files by extension and returns counts.
shape: (3, 2)
┌────────────┬─────┐
│ extension ┆ len │
│ --- ┆ --- │
│ str ┆ u32 │
╞════════════╪═════╡
│ .py ┆ 240 │
│ .jpg ┆ 124 │
│ .json ┆ 43 │
└────────────┴─────┘
directory_counts() — summarizes file distribution across parent directories.
shape: (3, 2)
┌────────────┬─────┐
│ parent_dir ┆ len │
│ --- ┆ --- │
│ str ┆ u32 │
╞════════════╪═════╡
│ src/filoma ┆ 12 │
│ tests ┆ 8 │
│ docs ┆ 5 │
└────────────┴─────┘
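Conceptually, both helpers are just group-by-and-count operations. An equivalent standard-library sketch (filoma implements them as Polars group-bys, so this is only conceptual):

```python
from collections import Counter
from pathlib import PurePosixPath

paths = ["src/async_scan.rs", "src/lib.rs", "src/filoma/dedup.py"]

# extension_counts(): tally files per suffix
ext_counts = Counter(PurePosixPath(p).suffix for p in paths)

# directory_counts(): tally files per parent directory
dir_counts = Counter(str(PurePosixPath(p).parent) for p in paths)

print(ext_counts)  # Counter({'.rs': 2, '.py': 1})
print(dir_counts)  # Counter({'src': 2, 'src/filoma': 1})
```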
🗂️ Advanced Topics
Dataset Convenience Class
Use the Dataset class to orchestrate snapshotting, profiling, integrity checks, and AI interactions:
import filoma as flm
ds = flm.Dataset("./my_data")
# Snapshot, Quality Scan, and Deduplication
ds.snap(mode="deep")
ds.run_quality_scan()
ds.dedup()
# Get an enriched DataFrame of the dataset
df = ds.to_dataframe()
print(df.extension_counts())
# Agentic interaction with this specific dataset
ds.get_brain().run("Is there any class imbalance in my dataset?")
Dataset Integrity & Quality
Filoma provides a comprehensive suite for dataset validation (corruption, leaks, balance) and manifest integrity:
from filoma.core.verifier import DatasetVerifier
verifier = DatasetVerifier("./data")
verifier.run_all()
verifier.print_summary()
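At its core, manifest checking means recomputing content hashes and diffing them against recorded values. A minimal stdlib sketch of that idea; DatasetVerifier's actual checks are broader (corruption, leakage, class balance), and verify_manifest here is a hypothetical helper, not filoma's API:

```python
import hashlib
from pathlib import Path

def verify_manifest(root, manifest):
    """Compare each file's current SHA-256 against a recorded digest.
    Returns {relative_path: 'ok' | 'changed' | 'missing'}."""
    results = {}
    for rel, expected in manifest.items():
        path = Path(root) / rel
        if not path.is_file():
            results[rel] = "missing"
        elif hashlib.sha256(path.read_bytes()).hexdigest() == expected:
            results[rel] = "ok"
        else:
            results[rel] = "changed"
    return results
```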
Deduplication
Find duplicate files, images (perceptual hash), or text files.
# Standard find
filoma dedup /path/to/dataset
# Cross-directory find
filoma dedup train/ valid/ --cross-dir
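Exact-duplicate detection reduces to grouping files by a content hash and keeping groups with more than one member. A stdlib sketch of the idea; filoma also offers perceptual hashing for images, which this does not cover:

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def find_exact_duplicates(root):
    """Group files under root by SHA-256 of their contents; return only
    groups with 2+ members. For large trees you would pre-filter by size
    and hash in chunks instead of reading whole files into memory."""
    groups = defaultdict(list)
    for path in sorted(Path(root).rglob("*")):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            groups[digest].append(str(path))
    return {d: ps for d, ps in groups.items() if len(ps) > 1}
```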
Agentic Analysis
Connect a "brain" to your filesystem for natural language interaction:
from filoma.brain import get_agent
agent = get_agent()
await agent.run("Create a dataframe from notebooks/Weeds-3 with enrichment")
await agent.run("Filter by extension: jpg, png")
await agent.run("Summarize dataframe and show top directories")
await agent.run("Sort dataframe by size descending and show top 5")
Or use the interactive chat CLI:
filoma brain chat
# Then ask:
# - create a dataframe from notebooks/Weeds-3
# - filter by extension jpg,png
# - summarize dataframe
# - export dataframe to weeds_images.csv
Advanced Workflow Orchestration
Filoma Brain now includes advanced orchestrator tools for enterprise-grade dataset analysis:
# Run advanced workflow examples
make brain-advanced
# Or in code:
await agent.run("Run a corrupted file audit on /path/to/dataset")
await agent.run("Generate a dataset hygiene report for /path/to/dataset")
await agent.run("Assess the migration readiness of /path/to/dataset")
These tools provide structured, deterministic reports with detailed findings, recommendations, and confidence scores.
Interactive CLI
filoma brain chat
📊 Performance & Benchmarks
Need to compare backend performance? Check out the comprehensive Benchmarks Guide!
Local SSD (1M files):
- 🦀 Rust: 7.3s (136K files/sec)
- ⚡ Async: 11.5s (87K files/sec)
- 🐍 Python: 35.5s (28K files/sec)
Network Storage (200K files, cold cache):
- 🦀 Rust: 2.3s (86K files/sec)
- ⚡ Async: 2.8s (70K files/sec)
- 🐍 Python: 15.1s (13K files/sec)
python benchmarks/benchmark.py --path /your/directory -n 3 --backend profiling
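The items/sec figures above are simply counted entries divided by wall-clock time. You can get a rough baseline for your own tree with nothing but the standard library (this is not filoma's benchmark harness):

```python
import os
import time

def walk_throughput(root):
    """Count files and directories under root with os.walk and report
    wall-clock throughput in items/sec."""
    start = time.perf_counter()
    items = 0
    for _dirpath, dirnames, filenames in os.walk(root):
        items += len(dirnames) + len(filenames)
    elapsed = time.perf_counter() - start
    rate = items / elapsed if elapsed > 0 else float("inf")
    return items, elapsed, rate

items, elapsed, rate = walk_throughput(".")
print(f"{items} items in {elapsed:.3f}s ({rate:,.0f} items/sec)")
```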
License
This work is licensed under a Creative Commons Attribution 4.0 International License.
Contributing
Contributions welcome! Please check the issues for planned features and bug reports.
File details

Details for the file filoma-1.11.16.tar.gz.

File metadata

- Download URL: filoma-1.11.16.tar.gz
- Size: 600.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | d58c64402e53fefb57d9d4fe6534188dcf5e1a15fcbee2a79dd7102a14a32f50 |
| MD5 | 04e478a9e2b336600980a3bb689aa763 |
| BLAKE2b-256 | 929e7fc58cacb2f93293f5a028430b26dacc5b822e74f177154850bb4a41b51d |
Provenance

The following attestation bundles were made for filoma-1.11.16.tar.gz:

Publisher: publish.yml on kalfasyan/filoma

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: filoma-1.11.16.tar.gz
- Subject digest: d58c64402e53fefb57d9d4fe6534188dcf5e1a15fcbee2a79dd7102a14a32f50
- Sigstore transparency entry: 1239332238
- Permalink: kalfasyan/filoma@fc5ca6eba62ffe451b7c3a1f99080ea58ef7fd6e
- Branch / Tag: refs/tags/v1.11.16
- Owner: https://github.com/kalfasyan
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@fc5ca6eba62ffe451b7c3a1f99080ea58ef7fd6e
- Trigger Event: push
File details

Details for the file filoma-1.11.16-cp311-cp311-win_amd64.whl.

File metadata

- Download URL: filoma-1.11.16-cp311-cp311-win_amd64.whl
- Size: 487.7 kB
- Tags: CPython 3.11, Windows x86-64
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | adde1e8f2e2b0c38765c81b442149c64dabbc2a7b68253494e6ceb8086a000d4 |
| MD5 | 6d0d3d650a60e02b6783f9e4e7ad934e |
| BLAKE2b-256 | d3e5b44a93ebf87f17335d1916049f35c9e4dde0f1e55ecf1e8ed0af95adf7c4 |
Provenance

The following attestation bundles were made for filoma-1.11.16-cp311-cp311-win_amd64.whl:

Publisher: publish.yml on kalfasyan/filoma

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: filoma-1.11.16-cp311-cp311-win_amd64.whl
- Subject digest: adde1e8f2e2b0c38765c81b442149c64dabbc2a7b68253494e6ceb8086a000d4
- Sigstore transparency entry: 1239332239
- Permalink: kalfasyan/filoma@fc5ca6eba62ffe451b7c3a1f99080ea58ef7fd6e
- Branch / Tag: refs/tags/v1.11.16
- Owner: https://github.com/kalfasyan
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@fc5ca6eba62ffe451b7c3a1f99080ea58ef7fd6e
- Trigger Event: push
File details

Details for the file filoma-1.11.16-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

- Download URL: filoma-1.11.16-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- Size: 664.4 kB
- Tags: CPython 3.11, manylinux: glibc 2.17+ x86-64
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 030c5fefa17efbd49ebea3c00b8c49d97f630a5bb7df9342861fb8a08e4bc326 |
| MD5 | b1096fdf648ea0afcd4fe825911a2888 |
| BLAKE2b-256 | 51031ef77c30d0747485ed366706f51798a9181cdea0930d5b67c5cd80edb3e6 |
Provenance

The following attestation bundles were made for filoma-1.11.16-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:

Publisher: publish.yml on kalfasyan/filoma

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: filoma-1.11.16-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- Subject digest: 030c5fefa17efbd49ebea3c00b8c49d97f630a5bb7df9342861fb8a08e4bc326
- Sigstore transparency entry: 1239332247
- Permalink: kalfasyan/filoma@fc5ca6eba62ffe451b7c3a1f99080ea58ef7fd6e
- Branch / Tag: refs/tags/v1.11.16
- Owner: https://github.com/kalfasyan
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@fc5ca6eba62ffe451b7c3a1f99080ea58ef7fd6e
- Trigger Event: push
File details

Details for the file filoma-1.11.16-cp311-cp311-macosx_11_0_arm64.whl.

File metadata

- Download URL: filoma-1.11.16-cp311-cp311-macosx_11_0_arm64.whl
- Size: 607.7 kB
- Tags: CPython 3.11, macOS 11.0+ ARM64
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | d328ab0cf72b8e4c8da1a0dc491b4e6c85549d35e3aaa42621e8aff7335ee647 |
| MD5 | e7594951d78624edaf1f768d14abb75f |
| BLAKE2b-256 | 77095ff910e3db9baa7c33d9952db785f8b64a86a18773ab99809d0adf3599ca |
Provenance

The following attestation bundles were made for filoma-1.11.16-cp311-cp311-macosx_11_0_arm64.whl:

Publisher: publish.yml on kalfasyan/filoma

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: filoma-1.11.16-cp311-cp311-macosx_11_0_arm64.whl
- Subject digest: d328ab0cf72b8e4c8da1a0dc491b4e6c85549d35e3aaa42621e8aff7335ee647
- Sigstore transparency entry: 1239332242
- Permalink: kalfasyan/filoma@fc5ca6eba62ffe451b7c3a1f99080ea58ef7fd6e
- Branch / Tag: refs/tags/v1.11.16
- Owner: https://github.com/kalfasyan
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@fc5ca6eba62ffe451b7c3a1f99080ea58ef7fd6e
- Trigger Event: push