E2 TTS - PyTorch (wip)

Implementation of E2 TTS, Embarrassingly Easy Fully Non-Autoregressive Zero-Shot TTS, in PyTorch.
Install
```shell
$ pip install e2-tts-pytorch
```
Usage
```python
import torch

from e2_tts_pytorch import (
    E2TTS,
    DurationPredictor
)

duration_predictor = DurationPredictor(
    transformer = dict(
        dim = 512,
        depth = 2,
    )
)

x = torch.randn(1, 1024, 512)
duration = torch.randn(1,)

loss = duration_predictor(x, target_duration = duration)
loss.backward()

e2tts = E2TTS(
    duration_predictor = duration_predictor,
    transformer = dict(
        dim = 512,
        depth = 4,
        skip_connect_type = 'concat'
    ),
)

loss = e2tts(x)
loss.backward()

sampled = e2tts.sample(x)
```
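For intuition, E2 TTS is trained with a conditional flow-matching style objective: noise is linearly interpolated toward the data and the network regresses the velocity along that path. Below is a minimal, standalone sketch of that objective in plain PyTorch. It is not this package's internal code; the function and the toy model are illustrative only.

```python
import torch
import torch.nn.functional as F

def flow_matching_loss(model, x1):
    # x1: a batch of target features, shape (batch, seq, dim)
    x0 = torch.randn_like(x1)          # noise sample
    t = torch.rand(x1.shape[0], 1, 1)  # random time in [0, 1] per batch item
    xt = (1 - t) * x0 + t * x1         # linear interpolation between noise and data
    target = x1 - x0                   # constant velocity along the straight path
    pred = model(xt, t)                # model predicts the velocity at (xt, t)
    return F.mse_loss(pred, target)

# toy stand-in "model" that ignores its inputs, just to show the call shape
model = lambda xt, t: torch.zeros_like(xt)

loss = flow_matching_loss(model, torch.randn(2, 1024, 512))
```

In a real setup the model would be the transformer above, and sampling integrates the learned velocity field from noise to data.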
Citations
```bibtex
@inproceedings{Eskimez2024E2TE,
    title  = {E2 TTS: Embarrassingly Easy Fully Non-Autoregressive Zero-Shot TTS},
    author = {Sefik Emre Eskimez and Xiaofei Wang and Manthan Thakker and Canrun Li and Chung-Hsien Tsai and Zhen Xiao and Hemin Yang and Zirun Zhu and Min Tang and Xu Tan and Yanqing Liu and Sheng Zhao and Naoyuki Kanda},
    year   = {2024},
    url    = {https://api.semanticscholar.org/CorpusID:270738197}
}
```
Download files

Download the file for your platform.

Source Distribution: e2_tts_pytorch-0.0.6.tar.gz (174.6 kB)

Built Distribution: e2_tts_pytorch-0.0.6-py3-none-any.whl
Hashes for e2_tts_pytorch-0.0.6-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 7d6d05b84e8b13dd5f25fce84ccea1ae647e9a995f7dcc47c1f934bcfb11933d
MD5 | 7dd19cb81fc44be81a747a6c846a23d8
BLAKE2b-256 | 2e7e1e431bf614e0da8247e105367d49a95c18798f2927bdb5ea2b9baec5bc45
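The digests above can be verified against a locally downloaded file. A minimal sketch using only the standard library (the helper name is illustrative, not part of this package):

```python
import hashlib

def sha256_of(path):
    # stream the file in chunks so large archives don't load fully into memory
    h = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(8192), b''):
            h.update(chunk)
    return h.hexdigest()
```

Compare the returned hex digest against the SHA256 row above; pip also checks the digest advertised by the package index when it downloads a distribution.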