Solunex
More information about your system would be necessary for an appropriate answer, for example how much VRAM you have, and an NVIDIA card is required for GPU inference. If you don't have a...
And you should have a way to get the error. There are a lot of things that can go wrong with neural network inference. Try searching for "Python remote error...
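As a starting point, a short snippet like the one below can collect the relevant details in one place before you post them. This is just a generic sketch that assumes PyTorch is installed; it is not MeloTTS-specific.

```python
# Quick environment report to include when asking for help.
# Generic sketch: only assumes PyTorch is installed.
import torch

print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())

if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        # total_memory is in bytes; convert to GiB for readability
        print(f"GPU {i}: {props.name}, {props.total_memory / 1024**3:.1f} GiB VRAM")
else:
    print("No NVIDIA GPU detected; GPU inference/training will not work.")
```

Running your failing command inside a terminal (not a notebook) and copying the full traceback is usually the fastest way to get the actual error message.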
Hello @s-tweed, I have made a command for you: python -m torch.distributed.run --nproc_per_node=1 --master_port=10902 --master_addr=localhost train.py --c config.json --model Model-1
It should work fine; I tested it on Windows and Linux (Docker).
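If it helps to see what each flag does, here is the same launch expressed as a small Python wrapper. This is only a sketch around the command above; the script name, config path, and model name are taken directly from the post and may need adjusting for your setup.

```python
# Sketch: build and run the same single-GPU launch command via subprocess.
# Flag comments refer to torch.distributed.run; paths/names come from the command above.
import subprocess
import sys

cmd = [
    sys.executable, "-m", "torch.distributed.run",
    "--nproc_per_node=1",       # one process per node = one GPU on this machine
    "--master_port=10902",      # any free port works; change it if 10902 is taken
    "--master_addr=localhost",  # single-node training, so the master is this machine
    "train.py",
    "--c", "config.json",       # config file passed to train.py (as in the post)
    "--model", "Model-1",       # model name passed to train.py (as in the post)
]
subprocess.run(cmd, check=True)
```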
This might help you: https://github.com/myshell-ai/MeloTTS/blob/main/docs/training.md You have to change the configs accordingly.
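The exact keys to change depend on the MeloTTS version, but the usual candidates are the dataset file lists and the batch size. Below is a minimal sketch of editing config.json programmatically; the key names ("train", "batch_size", "data", "training_files") and the example path are assumptions based on typical VITS-style configs, so check them against your own file.

```python
# Sketch: tweak a copy of config.json before training.
# Key names and the example path are assumptions; verify them against your config.json.
import json

with open("config.json", "r", encoding="utf-8") as f:
    cfg = json.load(f)

# Example adjustments; replace with values that match your setup and dataset.
cfg.setdefault("train", {})["batch_size"] = 4  # lower this if you run out of VRAM
cfg.setdefault("data", {})["training_files"] = "data/example/train.list"  # hypothetical path

with open("config_custom.json", "w", encoding="utf-8") as f:
    json.dump(cfg, f, indent=2, ensure_ascii=False)

print("Wrote config_custom.json")
```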