
Voluta

A high-performance Python library for searching text patterns using the Aho-Corasick algorithm, built with Rust for blazing-fast processing.

Features

  • Memory-mapped file processing for optimal performance with large files
  • Parallel processing option for multi-core utilization
  • Configurable chunk sizes for memory management and performance tuning
  • Direct byte matching for maximum control and performance
  • Returns full match information (start and end positions)
  • Case insensitive matching
  • Support for overlapping pattern matches
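As a rough mental model of the `(start, end, pattern)` triples the byte-based matchers return (end is exclusive), here is a naive pure-Python reference. This is an illustration only, not Voluta's implementation, which uses a linear-time Rust Aho-Corasick automaton rather than this O(n·m) scan:

```python
def naive_match_bytes(patterns, data, case_insensitive=True):
    """Naive multi-pattern scan producing (start, end, pattern) triples.
    Illustrative reference only -- NOT Voluta's implementation."""
    if case_insensitive:
        data_cmp = data.lower()
        pats = [(p, p.encode().lower()) for p in patterns]
    else:
        data_cmp = data
        pats = [(p, p.encode()) for p in patterns]
    matches = []
    for original, needle in pats:
        start = 0
        while True:
            idx = data_cmp.find(needle, start)
            if idx == -1:
                break
            matches.append((idx, idx + len(needle), original))
            start = idx + 1  # advance by one so overlapping hits are kept
    return sorted(matches)

print(naive_match_bytes(["error", "warning"], b"Error: disk full; WARNING issued"))
# -> [(0, 5, 'error'), (18, 25, 'warning')]
```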

Using in your project

pip install voluta

Usage

Basic usage

import voluta

# Create a TextMatcher with patterns to search for
# Case insensitivity and overlapping matching are enabled by default
matcher = voluta.TextMatcher(["error", "warning", "critical"])

# Match patterns in a file (line-by-line)
# Returns (line_num, start_pos, end_pos, pattern)
matches = matcher.match_file("path/to/large.log")
for line_num, start, end, pattern in matches:
    print(f"Found '{pattern}' on line {line_num}, positions {start}-{end}")

# Using memory-mapped matching (faster for large files)
# Returns (byte_offset, end_offset, pattern)
matches = matcher.match_file_memmap("path/to/large.log", None)  # use default chunk size
for start, end, pattern in matches:
    print(f"Found '{pattern}' at byte positions {start}-{end}")

# Using parallel memory-mapped matching (maximum performance)
# Pass None for chunk size and thread count to use the defaults
matches = matcher.match_file_memmap_parallel("path/to/large.log", None, None)

Advanced usage

# Specify chunk size (in bytes)
chunk_size = 8 * 1024 * 1024  # 8MB
matches = matcher.match_file_memmap("path/to/large.log", chunk_size)

# Specify chunk size and number of threads
chunk_size = 4 * 1024 * 1024  # 4MB
n_threads = 8
matches = matcher.match_file_memmap_parallel("path/to/large.log", chunk_size, n_threads)

# Direct byte matching for maximum performance
with open("path/to/large.log", "rb") as f:
    content = f.read()  # Or load bytes from any source
    matches = matcher.match_bytes(content)
    for start, end, pattern in matches:
        print(f"Found '{pattern}' at positions {start}-{end}")

# Simple example of finding specific text patterns
text = "The fox jumped over the fence. The fox is quick."
matcher = voluta.TextMatcher(["fox", "jump", "quick"])
matches = matcher.match_bytes(text.encode())
for start, end, pattern in matches:
    context = text[max(0, start-5):min(len(text), end+5)]
    print(f"Found '{pattern}' at {start}-{end}: '...{context}...'")

# Finding overlapping patterns
text = "abcdefgh"
# Overlapping matches are enabled by default to find all possible matches
matcher = voluta.TextMatcher(["abcd", "bcde", "cdef"])
matches = matcher.match_bytes(text.encode())
for start, end, pattern in matches:
    print(f"Found '{pattern}' at {start}-{end}")
    
# Disable overlapping matches if needed
matcher = voluta.TextMatcher(["abcd", "bcde", "cdef"], overlapping=False)
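For the "abcdefgh" example above, overlapping mode should report all three patterns even though their occurrences share bytes. A pure-Python check of those expected semantics (an illustration of the behavior, not Voluta's code):

```python
# Enumerate every occurrence of every pattern in "abcdefgh":
# overlapping mode reports all of them, even when they share bytes.
text = "abcdefgh"
patterns = ["abcd", "bcde", "cdef"]
matches = sorted(
    (i, i + len(p), p)
    for p in patterns
    for i in range(len(text) - len(p) + 1)
    if text[i:i + len(p)] == p
)
for start, end, pattern in matches:
    print(f"Found '{pattern}' at {start}-{end}")
# Found 'abcd' at 0-4
# Found 'bcde' at 1-5
# Found 'cdef' at 2-6
```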

# Case sensitivity control
text = "Hello WORLD"
# By default, case insensitivity is enabled
matcher = voluta.TextMatcher(["hello", "world"])  # Will match both Hello and WORLD
# Disable case insensitivity if needed
matcher = voluta.TextMatcher(["hello", "world"], case_insensitive=False)  # Will only match exact case

Installation

Prerequisites

  • Rust (latest stable)
  • Python 3.12
  • uv
  • just

Building from source

# Clone repository
git clone https://github.com/trustshield/voluta.git && cd voluta

# Setup environment
uv venv
source .venv/bin/activate
uv sync --dev

# Build
just build

# Test
just test

Installing the wheel

After building, you can install the wheel in another project:

# The wheel file will be in target/wheels/
pip install /path/to/voluta/target/wheels/voluta-*.whl

# Alternatively, install directly from GitHub
pip install git+https://github.com/trustshield/voluta.git

Performance

The memory-mapped approach is significantly faster than line-by-line processing, especially for large files. For optimal performance:

  • Use match_file_memmap_parallel for multi-core systems
  • For maximum control and performance, use match_bytes with pre-loaded content
  • Test different chunk sizes for your specific hardware (typically 4-16MB works well)
  • For files under 100MB, the performance difference may be less noticeable
  • Note that enabling overlapping matches may impact performance
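A subtlety worth knowing when choosing chunk sizes: any chunked scanner must overlap adjacent chunks by at least len(longest pattern) - 1 bytes, or matches that straddle a chunk boundary are silently lost. Voluta handles this internally; the sketch below just illustrates the idea in plain Python for a single pattern:

```python
def chunked_find(pattern: bytes, data: bytes, chunk_size: int):
    """Scan `data` in chunks, extending each chunk by len(pattern) - 1
    bytes of overlap so matches straddling a boundary are still found.
    Illustrative sketch only -- not Voluta's implementation."""
    overlap = len(pattern) - 1
    found = set()  # a set de-duplicates matches seen in the overlap region
    pos = 0
    while pos < len(data):
        chunk = data[pos:pos + chunk_size + overlap]
        idx = chunk.find(pattern)
        while idx != -1:
            found.add(pos + idx)
            idx = chunk.find(pattern, idx + 1)
        pos += chunk_size
    return sorted(found)

data = b"xxerrorxx" * 3  # "error" occurs at byte offsets 2, 11, 20
print(chunked_find(b"error", data, chunk_size=4))
# -> [2, 11, 20], even though chunk_size=4 is smaller than the pattern
```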

Metrics

On a MacBook Pro M1 Pro with 16GB RAM:

% just stress 1 50 32 8
python tests/benchmark/stress.py --size 1 --patterns 50 --chunk 32 --threads 8
Generating 50 random search patterns...
Generating 1.0GB test file with 50 search patterns...
Progress: 100% complete
Created test file at /var/folders/65/6343wbc565jcmgj3mpvktl880000gp/T/tmpl0uwzhss.txt, size: 1.00GB
Inserted 1024247 pattern instances

Running stress test with 50 patterns:
  - File size: 1.00GB
  - Chunk size: 32MB
  - Threads: 8

Testing memory-mapped matching...
Memory-mapped matching: 1107062 matches in 4.59 seconds
Processing speed: 223.13MB/s

Testing parallel memory-mapped matching...
Parallel memory-mapped matching: 1107062 matches in 0.63 seconds
Processing speed: 1629.94MB/s

Parallel processing is 7.30x faster than single-threaded

Sample matches:
   'b37lBbWUl4u' found at byte positions 790320349-790320360
   'OsoI' found at byte positions 619636284-619636288
   'KGcWelcw6Awl7d4' found at byte positions 952973106-952973121
   'YlvzcXcF' found at byte positions 481316276-481316284
   'BvK' found at byte positions 909977231-909977234

Stress test completed successfully!

Cleaning up temporary test file: /var/folders/65/6343wbc565jcmgj3mpvktl880000gp/T/tmpl0uwzhss.txt

Thanks

This library is a wrapper around BurntSushi/aho-corasick.

License

MIT


