SincNet
Hamming window simplification
https://github.com/mravanelli/SincNet/blob/master/dnn_models.py#L106-L108 :
```python
# self.window_ = torch.hamming_window(self.kernel_size)
n_lin = torch.linspace(0, (self.kernel_size / 2) - 1, steps=int(self.kernel_size / 2))  # computing only half of the window
self.window_ = 0.54 - 0.46 * torch.cos(2 * math.pi * n_lin / self.kernel_size)
```
Could it be replaced by `self.window_ = torch.hamming_window(kernel_size)[:kernel_size // 2]`? Or should it be `self.window_ = torch.hamming_window(kernel_size, periodic=False)[:kernel_size // 2]`?
There is some divergence. Is it only due to numerical precision, or maybe some boundary effect of `torch.linspace`? (For reference, see the PyTorch source code for `torch.hamming_window`.) A quick comparison:
```python
import math
import torch

kernel_size = 251

# Half-window as computed in SincNet
window1 = 0.54 - 0.46 * torch.cos(
    2 * math.pi
    * torch.linspace(0, kernel_size / 2 - 1, steps=int(kernel_size / 2))
    / kernel_size
)
# Proposed replacement
window2 = torch.hamming_window(kernel_size)[:kernel_size // 2]

print((window1 - window2).abs().max())
# tensor(0.0034)
```
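One possible source of the gap (an illustrative sketch, building on the repro above): for odd `kernel_size`, `torch.linspace(0, kernel_size / 2 - 1, steps=int(kernel_size / 2))` ends at the fractional value `kernel_size / 2 - 1` (124.5 for 251), so its 125 samples are spaced `124.5 / 124` apart rather than landing on the integer indices `0, 1, ..., kernel_size // 2 - 1` that `torch.hamming_window` evaluates:

```python
import math
import torch

kernel_size = 251  # odd, as in the SincNet default

# SincNet's sample points: 125 values from 0 to 124.5 (non-integer spacing)
n_lin = torch.linspace(0, kernel_size / 2 - 1, steps=int(kernel_size / 2))
# Integer indices used by torch.hamming_window
n_int = torch.arange(kernel_size // 2, dtype=n_lin.dtype)

print(n_lin[-1])                    # tensor(124.5000), not 124
print((n_lin - n_int).abs().max())  # tensor(0.5000), at the last sample

# With integer indices, the hand-written formula matches the default
# (periodic) Hamming window up to float rounding
window_manual = 0.54 - 0.46 * torch.cos(2 * math.pi * n_int / kernel_size)
window_torch = torch.hamming_window(kernel_size)[:kernel_size // 2]
print((window_manual - window_torch).abs().max())
```

So the divergence looks like a boundary effect of the `linspace` endpoint for odd kernel sizes, not just floating-point noise.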