Alexey Korepanov
@hulucky1102 You can look here for an example of how to run inference with the streaming version correctly: https://github.com/grazder/DeepFilterNet/blob/1097015d53ced78fb234e7d7071a5dd4446e3952/torchDF/test_torchdf.py#L70 Also, check that you export with the `always_apply_all_stages=True` parameter: https://github.com/grazder/DeepFilterNet/blob/1097015d53ced78fb234e7d7071a5dd4446e3952/torchDF/model_onnx_export.py#L232
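The key point of streaming inference is that the model is called one frame at a time and its recurrent states must be threaded between calls. A minimal sketch of that loop (the `model_fn` callable and frame size are illustrative assumptions, not the torchDF API):

```python
import numpy as np

def stream_process(model_fn, audio, frame_size):
    """Run a stateful model frame by frame, carrying states between calls.

    model_fn(frame, states) -> (enhanced_frame, new_states) is a stand-in
    for one step of the exported streaming model.
    """
    states = None
    out = []
    for i in range(0, len(audio) - frame_size + 1, frame_size):
        frame = audio[i:i + frame_size]
        enhanced, states = model_fn(frame, states)
        out.append(enhanced)
    return np.concatenate(out) if out else np.zeros(0)

# Identity "model" just to demonstrate the calling convention.
def dummy_model(frame, states):
    return frame, states

audio = np.arange(10.0)
out = stream_process(dummy_model, audio, frame_size=2)
```

Feeding the whole clip in one call instead of looping like this is the most common way to get wrong results from the streaming export.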
Is it happening only with ONNX inference, or with torch inference of the streaming version too?
Hello, we need to store two future frames for a single frame prediction, so `self.df_order - 1` is correct. You can also find it here: https://github.com/Rikorose/DeepFilterNet/blob/f2445da10ce7760ac41d272ce4699200333a6e32/libDF/src/tract.rs#L586
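To make the buffering concrete, here is a sketch of a lookahead buffer that emits a prediction for frame `t` only once its two future frames have arrived; the `DF_ORDER = 5` value and the buffer layout are illustrative assumptions, not the actual tract.rs implementation:

```python
from collections import deque

DF_ORDER = 5   # assumed deep-filtering order for this sketch
LOOKAHEAD = 2  # two future frames are needed per prediction

def delayed_predict(frames):
    """Yield each frame once LOOKAHEAD future frames are buffered.

    The buffer holds df_order - 1 frames: the frame being predicted
    plus its future context.
    """
    buf = deque(maxlen=DF_ORDER - 1)
    for f in frames:
        buf.append(f)
        if len(buf) > LOOKAHEAD:
            # buf[-(LOOKAHEAD + 1)] now has LOOKAHEAD future frames after it
            yield buf[-(LOOKAHEAD + 1)]

out = list(delayed_predict(range(6)))
```

This also shows why the streaming model has an algorithmic latency of two frames: the last two input frames produce no output until more audio arrives.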
@FisherDom
> will there be any problems when converting the onnx model to tf model

I think that you can face problems with RFFT / IRFFT, I don't know a...
@FisherDom Hello! I've tried to quantize with ONNX here - https://github.com/grazder/DeepFilterNet/blob/torchDF-temp/torchDF/model_onnx_export.py But it didn't give me any improvement; it seems the old ops became faster, but because of many quantize / dequantize...
Same problem
https://github.com/pytorch/audio/issues/3731 Related?
@pallaswept Do you have full list of dependencies?
Seems like `./configure` should check everything, but it says that everything is OK:

```
------------------------------------------------------------------------
rnnoise 0.2: Automatic configuration OK.

  Assertions ................... no
  Hidden visibility ............ yes
  API code examples...
```