Philippe Rémy
@lminer done: https://github.com/philipperemy/keras-tcn/commit/07e85e3f6a1c836af942956eb299841588dac77d
I'll close this as I don't think we need to do any dev on that one. Re-open if I'm wrong.
@HouKun-github hey, could it be linked to https://github.com/philipperemy/keras-tcn/pull/133?
Yes, you are right. The 1-D fully convolutional layer is gone; I think it is because we found that removing it boosted performance (or did not deteriorate the...
@daniel-v-e I am not sure how hard it would be to implement. I have forgotten a bit how `Bidirectional` works. Does it concatenate the outputs at each step from two RNNs...
@daniel-v-e Bidirectional support has been implemented. You can check it here: https://github.com/philipperemy/keras-tcn/commit/a412190a56dc6ebe67f16d5c63a49ac3d4eb1ba7. Ref: https://keras.io/examples/nlp/bidirectional_lstm_imdb/
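For reference, this is roughly what the wrapper in that Keras example does: it runs a forward copy and a backward copy (`go_backwards=True`) of the wrapped layer and, with the default `merge_mode="concat"`, concatenates their outputs on the last axis, doubling the feature dimension. A small self-contained sketch (not code from the repo):

```python
import tensorflow as tf

# (batch, time, features) input; the referenced example uses an Embedding first,
# which is skipped here to keep the sketch minimal.
inputs = tf.keras.Input(shape=(None, 16))
# Forward + backward LSTM, outputs concatenated per step -> (batch, time, 128).
x = tf.keras.layers.Bidirectional(
    tf.keras.layers.LSTM(64, return_sequences=True)
)(inputs)
# Second bidirectional layer without return_sequences -> (batch, 128).
outputs = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64))(x)
model = tf.keras.Model(inputs, outputs)
model.summary()
```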
Pushed in 3.5.0.
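As a minimal usage sketch, assuming the 3.5.0 `TCN` layer exposes the attributes (`go_backwards`, `return_sequences`, `return_state`) that `keras.layers.Bidirectional` needs to build the reversed copy; see the linked commit for the actual API:

```python
from tensorflow import keras
from tcn import TCN  # keras-tcn >= 3.5.0 assumed

i = keras.Input(shape=(None, 2))
# Assumption: TCN can be wrapped like an RNN; outputs of the forward and
# backward copies are concatenated by the wrapper (default merge_mode).
o = keras.layers.Bidirectional(TCN(nb_filters=64, return_sequences=False))(i)
o = keras.layers.Dense(1)(o)
model = keras.Model(i, o)
model.compile(optimizer='adam', loss='mse')
```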
@murphycrosby have you had a chance to try it on a more recent version of TF?
@mimxrt I am not sure exactly why it does not work. This should work in theory:

```python
from tensorflow import keras as K  # assuming K is the tf.keras namespace

# batch_size is defined elsewhere in the user's code
m_in = K.Input(shape=(None, 2), batch_size=batch_size, ragged=True)
```

Ref: https://github.com/tensorflow/tensorflow/issues/27170
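To make the idea concrete, here is a hypothetical, self-contained version of that pattern: a ragged `Input` fed with a `tf.RaggedTensor` of variable-length sequences. The `LSTM` is only a stand-in downstream layer and `batch_size` gets a placeholder value; depending on the TF/Keras version this may still hit the issue mentioned in the next comment.

```python
import tensorflow as tf

batch_size = 4  # placeholder; in the original snippet it comes from user code

m_in = tf.keras.Input(shape=(None, 2), batch_size=batch_size, ragged=True)
m_out = tf.keras.layers.LSTM(8)(m_in)  # Keras RNN layers accept ragged inputs
model = tf.keras.Model(m_in, m_out)

# Four sequences of different lengths, each step having 2 features.
x = tf.ragged.constant(
    [[[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]],
     [[1.0, 1.1]],
     [[2.0, 2.1], [2.2, 2.3]],
     [[3.0, 3.1], [3.2, 3.3], [3.4, 3.5], [3.6, 3.7]]],
    ragged_rank=1,
)
print(model(x).shape)  # (4, 8)
```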
@mimxrt yes, it does not work for me either. But it seems like it's a Keras issue here.