Chengxi Li
My scenario is a bit unusual. In a normal setup, exposing a port via docker or using port forwarding works fine, but in my environment the exposed port is mapped to a hashed path: xxxx:pp becomes xxxx/hhhhhh. As a result, a link like http://xxxx/table is rejected; only links beginning with http://xxxx/hhhhhh get a response. After manually changing the URL to http://xxxx/hhhhhh/table I do receive the table page, but the paths of its CSS, JS, and visualization files still begin with http://xxxx/, so this doesn't actually help. I briefly read the relevant fitlog code, and it may be an issue with the Flask deployment configuration.
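For reference, one common way to handle this kind of reverse-proxy path prefix in a Flask/WSGI app is a small prefix middleware. This is only a generic sketch, not fitlog's actual code; the prefix `/hhhhhh` stands in for the hashed path, and `PrefixMiddleware` is a hypothetical name:

```python
# Minimal WSGI middleware sketch (not fitlog's actual code): record the proxy
# path prefix in SCRIPT_NAME so the app generates URLs (including static CSS/JS
# paths) under that prefix instead of under "/".
class PrefixMiddleware:
    def __init__(self, app, prefix):
        self.app = app                    # the wrapped WSGI application
        self.prefix = prefix.rstrip("/")  # e.g. "/hhhhhh"

    def __call__(self, environ, start_response):
        path = environ.get("PATH_INFO", "")
        if path.startswith(self.prefix):
            # Strip the prefix from PATH_INFO and move it into SCRIPT_NAME,
            # so url_for() and static file paths include the prefix.
            environ["PATH_INFO"] = path[len(self.prefix):] or "/"
            environ["SCRIPT_NAME"] = self.prefix
            return self.app(environ, start_response)
        start_response("404 Not Found", [("Content-Type", "text/plain")])
        return [b"URL does not start with the expected prefix."]
```

With Flask this would be applied as `app.wsgi_app = PrefixMiddleware(app.wsgi_app, "/hhhhhh")` before the server starts.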
Check this closed issue and my reply. We will upload this phone_set file to the repo later. https://github.com/MoonInTheRiver/DiffSinger/issues/15#issuecomment-1041207925
> How to binarize the new testset using the existing phone_set.json rather than generating a new phone_set? See this line in binarize.py https://github.com/MoonInTheRiver/DiffSinger/blob/69ccf417e82834b6f4fc474e046f3bbaa79b3827/data_gen/singing/binarize.py#L104 Change the "reset_phone_dict" in your config file...
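Concretely, the config change would look something like this (a sketch only; where the key goes depends on your config file layout):

```yaml
# In the .yaml config passed to binarize.py. Setting this to false should make
# the binarizer reuse the existing phone_set.json instead of rebuilding the
# phoneme dictionary from the new testset.
reset_phone_dict: false
```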
Could you provide more details? Did you place the fft-singer, diffsinger, and vocoder ckpts in the checkpoints folder, and did you correctly specify EXP_NAME?
I doubled the per-GPU batch size using two GPUs, and after fine-tuning the learning rate I was able to reproduce the results on ACE2005.
Everything else uses the original parameters from the ace05.sh script provided in the repo.
This is the config.yaml left by the training logs. I ran it with 2 V100 32G GPUs.

```yaml
accumulate_grad_batches: 1
adam_epsilon: 1.0e-08
amp_backend: native
amp_level: O2
auto_lr_find: false
auto_scale_batch_size: false
...
```
When using our configs on your dataset, please do check the "binary_data_dir" in hparams to make sure it points to your binarized data directory, because the phoneme dictionary text file...
Sorry I may have misunderstood your issue. If you want to infer from our pretrained ckpt, please make sure your phoneme dictionary is exactly the same as ours because some...
If you want to use a custom phoneme dictionary, please follow our guidance and re-run the training.