
silero_tts will not load if I am not connected to the internet

Open · RandomInternetPreson opened this issue 2 years ago · 2 comments

Describe the bug

I used to be able to use this extension offline, but now I can't load it unless I am online. If I am online, the extension loads just fine. The actual language model is saved on my machine in the .cache directory: C:\Users\myself\.cache\torch\hub\snakers4_silero-models_master\src\silero\model

The model file is named v3_en.pt. It is cached on my machine, and when I load the extension with an internet connection the miniconda console says it is using the cached model, so I don't know why I NEED to be connected to the internet for it to work.
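For reference, a quick way to confirm that the file really is in the torch hub cache (a minimal sketch, assuming the default cache layout shown above and the v3_en model id from this report):

import torch
from pathlib import Path

# torch.hub.get_dir() returns the hub cache directory (by default ~/.cache/torch/hub)
hub_dir = Path(torch.hub.get_dir())
model_file = hub_dir / 'snakers4_silero-models_master' / 'src' / 'silero' / 'model' / 'v3_en.pt'
print(model_file, 'exists:', model_file.is_file())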

Is there an existing issue for this?

  • [x] I have searched the existing issues

Reproduction

Run this (change your install location as necessary) with and without an internet connection.

cd F:\OoBaboogaMarch17\text-generation-webui
conda activate textgen
python .\server.py --auto-devices --gptq-bits 4 --cai-chat --gptq-model-type LLaMa --extension silero_tts

Screenshot

I'm including two screenshots, one when I am connected to the internet, and one when I am not connected to the internet.

[Screenshot: with internet connection]
[Screenshot: no internet connection]

Logs

See screenshots

System Info

Windows 10, RTX 4090, i9-13900, native Windows (not WSL)

RandomInternetPreson avatar Mar 23 '23 16:03 RandomInternetPreson

This ain't a bug, it's just raising an exception because it can't connect to the torch hub. It tries to connect either way, regardless of the local cache.

However, it is possible to check whether the cached .pt exists and substitute the loader.

Brawlence avatar Mar 25 '23 16:03 Brawlence

Thank you Brawlence, this kind soul on Reddit has solved my issue and provided code. Update: there are two versions of the change. Mine is the first one and the original Reddit change is below that. I couldn't get it to work without the comma, but they got it to work without the comma, so 🤷‍♂️ try theirs first and if it doesn't work try mine I guess.

https://old.reddit.com/r/Oobabooga/comments/11zsw5s/anyone_know_how_to_load_the_silero_tts_extension/jdmvocy/

def load_model(): needs to be changed like this:

def load_model():
    cache_path = 'C:/Users/Myself/.cache/torch/hub/snakers4_silero-models_master/'
    model_path = cache_path + "src/silero/model/" + params['model_id'] + ".pt"
    if Path(model_path).is_file():
        model, example_text = torch.hub.load(repo_or_dir=cache_path, model='silero_tts', language=params['language'], speaker=params['model_id'], source='local', path=model_path, force_reload=True)
    else:
        model, example_text = torch.hub.load(repo_or_dir='snakers4/silero-models', model='silero_tts', language=params['language'], speaker=params['model_id']), model.to(params['device'])
    return model

def load_model():
    cache_path = 'C:/Users/USER/.cache/torch/hub/snakers4_silero-models_master/'
    model_path = cache_path + "src/silero/model/" + params['model_id'] + ".pt"
    if Path(model_path).is_file():
        model, example_text = torch.hub.load(repo_or_dir=cache_path, model='silero_tts', language=params['language'], speaker=params['model_id'], source='local', path=model_path, force_reload=True)
    else:
        model, example_text = torch.hub.load(repo_or_dir='snakers4/silero-models', model='silero_tts', language=params['language'], speaker=params['model_id'])
    model.to(params['device'])
    return model
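For what it's worth, the hard-coded C:/Users/... path could be avoided by asking torch where its hub cache lives. A minimal sketch along the same lines (untested; it assumes the same params dict and the torch/pathlib imports already present in the extension's script.py):

def load_model():
    # Ask torch for the hub cache dir (default ~/.cache/torch/hub) instead of hard-coding the user path
    cache_path = Path(torch.hub.get_dir()) / 'snakers4_silero-models_master'
    model_path = cache_path / 'src' / 'silero' / 'model' / (params['model_id'] + '.pt')
    if model_path.is_file():
        # Load from the local checkout, so no network request is made
        model, example_text = torch.hub.load(repo_or_dir=str(cache_path), model='silero_tts', language=params['language'], speaker=params['model_id'], source='local', path=str(model_path), force_reload=True)
    else:
        # First run: download from the hub as before
        model, example_text = torch.hub.load(repo_or_dir='snakers4/silero-models', model='silero_tts', language=params['language'], speaker=params['model_id'])
    model.to(params['device'])
    return model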

RandomInternetPreson avatar Mar 25 '23 16:03 RandomInternetPreson

Can someone submit a PR with this change?

oobabooga avatar Mar 29 '23 02:03 oobabooga

@oobabooga done #628

Brawlence avatar Mar 29 '23 06:03 Brawlence