keras-tcn
fully convolutional layer
Why was the one-dimensional fully convolutional layer removed in this version? The previous version had one, and the original paper used a fully convolutional layer to ensure that no time steps in the sequence are lost. Can you explain?
@HouKun-github hey could it be linked to https://github.com/philipperemy/keras-tcn/pull/133?
Yes, you are right. The 1-D fully convolutional layer is gone because we found that removing it boosted performance (or at least did not hurt it, cf. Occam's razor). I agree that we no longer match the original paper 100% because of this change.
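For context, "fully convolutional" in the original TCN paper means each layer is a convolution padded so that the output has the same length as the input, so no time steps are dropped. A minimal NumPy sketch of a causal 1-D convolution with this length-preserving property (`causal_conv1d` is an illustrative helper, not keras-tcn's actual implementation):

```python
import numpy as np

def causal_conv1d(x, kernel):
    """Causal 1-D convolution that preserves sequence length.

    x: (time,) input sequence; kernel: (k,) filter weights.
    Left-pads with k-1 zeros so output[t] depends only on x[:t+1]
    and len(output) == len(x) -- the length-preserving property
    the fully convolutional design relies on.
    """
    k = len(kernel)
    padded = np.concatenate([np.zeros(k - 1), x])
    # output[t] = sum_i kernel[i] * x[t - i]  (zero-padded at t < 0)
    return np.array([padded[t:t + k] @ kernel[::-1] for t in range(len(x))])

x = np.arange(5, dtype=float)               # [0, 1, 2, 3, 4]
y = causal_conv1d(x, np.array([1.0, 1.0]))  # running sum of adjacent pairs
print(y)  # [0. 1. 3. 5. 7.] -- 5 outputs for 5 inputs, nothing dropped
```

Removing such a layer changes the architecture but, as noted above, was found not to hurt accuracy in practice.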