Automatically aligns an out-of-sync subtitle file to its companion video/audio using a deep neural network and forced alignment.
Dependencies
apt-get install ffmpeg espeak libespeak1 libespeak-dev espeak-data
or
brew install ffmpeg espeak
Installation
# Install from PyPI (NumPy must be installed first)
pip install numpy
pip install subaligner
or
# Install from source
git clone git@github.com:baxtree/subaligner.git
cd subaligner
make install && source .venv/bin/activate
Usage
# Single-stage alignment (high-level shift with lower latency)
$ subaligner_1pass -v video.mp4 -s subtitle.srt
# Dual-stage alignment (low-level shift with higher latency)
$ subaligner_2pass -v video.mp4 -s subtitle.srt
The aligned subtitle will be saved as subtitle_aligned.srt. For details on the CLI, run subaligner_1pass --help or subaligner_2pass --help.
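Both tools can also be scripted. Below is a minimal sketch of batch alignment that shells out to the CLI with the same flags shown above (-v for video, -s for subtitle); the helper names and the assumption of same-named .mp4/.srt pairs are illustrative, not part of the project's API.

```python
import subprocess
from pathlib import Path

def build_align_command(video, subtitle, dual_stage=False):
    """Build a subaligner CLI invocation for one video/subtitle pair.

    dual_stage=True selects subaligner_2pass, which trades higher
    latency for finer-grained (low-level) shifts.
    """
    tool = "subaligner_2pass" if dual_stage else "subaligner_1pass"
    return [tool, "-v", str(video), "-s", str(subtitle)]

def align_directory(media_dir, dual_stage=False):
    """Align every .srt file that sits next to a same-named .mp4."""
    for subtitle in Path(media_dir).glob("*.srt"):
        video = subtitle.with_suffix(".mp4")
        if video.exists():
            # Writes the result as <name>_aligned.srt next to the input.
            subprocess.run(build_align_command(video, subtitle, dual_stage),
                           check=True)
```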
Supported Formats
Subtitle: SubRip, TTML and WebVTT
Video: MP4, WebM, Ogg, 3GP, FLV and MOV
Anatomy
Subtitles can be out of sync with their companion audiovisual media files for a variety of reasons, including latency introduced by speech-to-text transcription on live streams, or manual calibration and rectification during post-production.
A model has been trained on synchronised video-and-subtitle pairs and is used to predict shifting offsets and directions under the guidance of a two-stage alignment approach.
First Stage (Global Alignment): shifts the whole subtitle by a single predicted file-wide offset.
Second Stage (Parallelised Individual Alignment): refines individual subtitle segments in parallel with finer-grained shifts.
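The first stage can be illustrated with a plain-Python sketch (not the project's actual implementation): once a single file-wide offset has been predicted, it is applied uniformly to every SubRip timestamp, after which the second stage can refine segments individually.

```python
import re
from datetime import timedelta

# Matches a SubRip timestamp of the form HH:MM:SS,mmm
TIMESTAMP = re.compile(r"(\d{2}):(\d{2}):(\d{2}),(\d{3})")

def shift_timestamp(ts, offset_seconds):
    """Shift one SubRip timestamp by offset_seconds (may be negative)."""
    h, m, s, ms = map(int, TIMESTAMP.match(ts).groups())
    total = (timedelta(hours=h, minutes=m, seconds=s, milliseconds=ms)
             + timedelta(seconds=offset_seconds))
    # Clamp at zero so a large negative offset cannot go before the start.
    total_ms = max(0, int(total.total_seconds() * 1000))
    h, rem = divmod(total_ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

def shift_srt(srt_text, offset_seconds):
    """Apply one global offset to every timestamp in an SRT document."""
    return TIMESTAMP.sub(
        lambda mo: shift_timestamp(mo.group(0), offset_seconds), srt_text)
```

For example, shifting "00:00:01,500 --> 00:00:03,000" by 2.25 seconds yields "00:00:03,750 --> 00:00:05,250".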
Hashes for subaligner-0.0.6rc1-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | b5eaf54d8d8e3719f556bdd8f5a658736acf70d7eb8c0dabd1db45d1b2f399a9
MD5 | b96ba4ade87bbf5afb584aa0468417f3
BLAKE2b-256 | 79cfa65e27cbe46441f46b70a038762d2aef55d454d6024d92008b3eba1708db