parler-tts

Inference and training library for high-quality TTS models.

Results: 42 parler-tts issues (sorted by recently updated)

The new README file guides users who want a locally deployed Gradio UI. Also added a License section at the end of the README.
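For illustration, a minimal sketch of a local Gradio UI of this kind; the model id, description text, and generation defaults here are assumptions, not the repo's official demo:

```py
# Minimal local Gradio UI sketch (illustrative only, not the official demo).
# Assumes parler-tts and gradio are installed; the model id is an assumption.
import gradio as gr
import torch
from parler_tts import ParlerTTSForConditionalGeneration
from transformers import AutoTokenizer

device = "cuda:0" if torch.cuda.is_available() else "cpu"
model = ParlerTTSForConditionalGeneration.from_pretrained(
    "parler-tts/parler_tts_mini_v0.1"
).to(device)
tokenizer = AutoTokenizer.from_pretrained("parler-tts/parler_tts_mini_v0.1")

def tts(text, description):
    # Tokenize the voice description and the prompt, then sample audio tokens.
    input_ids = tokenizer(description, return_tensors="pt").input_ids.to(device)
    prompt_ids = tokenizer(text, return_tensors="pt").input_ids.to(device)
    audio = model.generate(input_ids=input_ids, prompt_input_ids=prompt_ids)
    # Gradio's Audio component accepts a (sampling_rate, numpy_array) tuple.
    return model.config.sampling_rate, audio.cpu().numpy().squeeze()

demo = gr.Interface(
    fn=tts,
    inputs=[gr.Textbox(label="Text"), gr.Textbox(label="Voice description")],
    outputs=gr.Audio(label="Generated speech"),
)

if __name__ == "__main__":
    demo.launch()
```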

Hey @sanchit-gandhi, I like the repo. Excited to see this being worked on. Here's a benchmark of WhisperSpeech: I used your sample script on the exact same text snippet and it...

`Repository` is a deprecated feature in `huggingface_hub`. It is more robust to use the HTTP-based methods from `HfApi` instead. [Here is a doc page explaining why it's better](https://huggingface.co/docs/huggingface_hub/concepts/git_vs_http). Main reasons...
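For instance, an upload that previously went through `Repository` can usually be expressed with `HfApi` in a couple of calls; the repo id and file paths below are placeholders, not values from this repository:

```py
# Sketch: uploading with the HTTP-based HfApi instead of the deprecated Repository class.
from huggingface_hub import HfApi

api = HfApi()

# Upload a single file to an existing model repo.
api.upload_file(
    path_or_fileobj="checkpoint/model.safetensors",
    path_in_repo="model.safetensors",
    repo_id="username/my-parler-tts-finetune",
    commit_message="Upload checkpoint",
)

# Or push an entire output directory in one call.
api.upload_folder(
    folder_path="./output_dir",
    repo_id="username/my-parler-tts-finetune",
    repo_type="model",
)
```

Unlike `Repository`, these calls need no local git clone or git-lfs setup, which is the main robustness argument in the linked doc page.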

Also moved it to bf16, which results in a substantial speedup over fp32.

With newer PyTorch (2.4 nightly) we get bfloat16 support on MPS. I tested this:

```py
from parler_tts import ParlerTTSForConditionalGeneration
from transformers import AutoTokenizer
import soundfile as sf
import torch

device...
```
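Since the snippet above is cut off in the preview, here is a fuller sketch of what such a test roughly looks like; the model id, prompt, and description are assumptions:

```py
# Sketch: running Parler-TTS on Apple Silicon (MPS) in bfloat16. Requires a PyTorch
# build with bf16 MPS support, e.g. the 2.4 nightlies mentioned above.
from parler_tts import ParlerTTSForConditionalGeneration
from transformers import AutoTokenizer
import soundfile as sf
import torch

device = "mps"
torch_dtype = torch.bfloat16

model = ParlerTTSForConditionalGeneration.from_pretrained(
    "parler-tts/parler_tts_mini_v0.1"
).to(device, dtype=torch_dtype)
tokenizer = AutoTokenizer.from_pretrained("parler-tts/parler_tts_mini_v0.1")

prompt = "Hey, how are you doing today?"
description = "A female speaker with a calm, clear voice."

input_ids = tokenizer(description, return_tensors="pt").input_ids.to(device)
prompt_input_ids = tokenizer(prompt, return_tensors="pt").input_ids.to(device)

generation = model.generate(input_ids=input_ids, prompt_input_ids=prompt_input_ids)
# bfloat16 tensors cannot be converted to numpy directly, so cast to float32 first.
audio_arr = generation.to(torch.float32).cpu().numpy().squeeze()
sf.write("parler_tts_out.wav", audio_arr, model.config.sampling_rate)
```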

```
(base) gwen@GwenSeidr:~/2/parler-tts$ virtualenv parler_tts_env
created virtual environment CPython3.10.12.final.0-64 in 328ms
  creator CPython3Posix(dest=/home/gwen/2/parler-tts/parler_tts_env, clear=False, no_vcs_ignore=False, global=False)
  seeder FromAppData(download=False, pip=bundle, setuptools=bundle, wheel=bundle, via=copy, app_data_dir=/home/gwen/.local/share/virtualenv)
    added seed packages: GitPython==3.1.43, Jinja2==3.1.3, Markdown==3.6, MarkupSafe==2.1.5, PyYAML==6.0.1, ...
```

I want to make an app that reads long texts in chunks. For this I need to get the same voice for the same speaker prompt. Now I get...
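A common workaround, sketched below, is to fix the random seed before each generation so that sampling is reproducible for a given description. This is an assumption-level sketch of the workaround, not an official parler-tts feature for speaker consistency across arbitrary texts:

```py
# Sketch: seed the RNGs before each generate() call so repeated runs with the same
# description and prompt produce the same audio within a process.
import torch
from transformers import set_seed

def generate_deterministic(model, tokenizer, text, description, device, seed=42):
    set_seed(seed)  # seeds Python, NumPy, and torch RNGs
    input_ids = tokenizer(description, return_tensors="pt").input_ids.to(device)
    prompt_input_ids = tokenizer(text, return_tensors="pt").input_ids.to(device)
    return model.generate(input_ids=input_ids, prompt_input_ids=prompt_input_ids)
```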

Hi, it would be nice to be able to use this via the `text-to-speech` pipeline. Thanks!
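For comparison, this is how the existing `text-to-speech` pipeline works today for supported models; the model id below is just an example and an assumption, since Parler-TTS itself is not wired into the pipeline:

```py
# Sketch of the existing transformers text-to-speech pipeline, shown with a model
# the pipeline already supports ("suno/bark-small" is used purely as an example).
from transformers import pipeline
import soundfile as sf

pipe = pipeline("text-to-speech", model="suno/bark-small")
out = pipe("Hey, how are you doing today?")
sf.write("out.wav", out["audio"].squeeze(), out["sampling_rate"])
```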

Hi, congrats on the release!! Is long-form synthesis planned? Thank you!

I recently stumbled upon your project and I'm excited about its potential. I'm wondering if there are any plans to add French language support in the future.