[ENH] Add AEDRNNNetwork
Implements an auto-encoder network based on Dilated Recurrent Neural Networks.
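For context, a minimal usage sketch is below. It assumes the network follows aeon's usual `build_network(input_shape)` API and that the constructor accepts the `temporal_latent_space` flag discussed later in this thread; the return values and everything else are assumptions, not the final API.

```python
# Minimal usage sketch, assuming aeon's standard network API.
# temporal_latent_space is the flag discussed in this thread; the
# (encoder, decoder) return values are an assumption for illustration.
from aeon.networks import AEDRNNNetwork

network = AEDRNNNetwork(temporal_latent_space=True)
encoder, decoder = network.build_network(input_shape=(100, 2))
print(encoder.output_shape, decoder.output_shape)
```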
Thank you for contributing to aeon
I have added the following labels to this PR based on the title: [ $\color{#FEF1BE}{\textsf{enhancement}}$ ]. I have added the following labels to this PR based on the changes made: [ $\color{#379E11}{\textsf{networks}}$ ]. Feel free to change these if they do not properly represent the PR.
The Checks tab will show the status of our automated tests. You can click on individual test runs in the tab or "Details" in the panel below to see more information if there is a failure.
If our pre-commit code quality check fails, any trivial fixes will automatically be pushed to your PR unless it is a draft.
Don't hesitate to ask questions on the aeon Slack channel if you have any.
@MatthewMiddlehurst Do the tests look okay?
Coverage reports are being uploaded now, see the checks.
Report for the network is here: https://app.codecov.io/gh/aeon-toolkit/aeon/pull/1577/blob/aeon/networks/_ae_drnn.py
Can we recheck the coverage? @MatthewMiddlehurst
You can navigate to it in the actions tab.
Now it's 95% coverage, which is okay I guess? @MatthewMiddlehurst
It would be good to cover the partials if possible, but I'm not going to demand 100%. Keep in mind testing is more than coverage; there's probably not a lot to do here since it's a single method with general testing also, but we do want to make sure that the output is correct and that errors are properly raised where possible.
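A hedged sketch of what such tests could look like, assuming the network exposes aeon's usual `build_network` API; the `n_layers` parameter and the `ValueError` in the second test are illustrative assumptions, not the network's confirmed parameters or behaviour:

```python
# Hedged testing sketch; assumes aeon's usual build_network API.
# `n_layers` and the ValueError below are illustrative assumptions.
import pytest
from aeon.networks import AEDRNNNetwork

def test_output_shape():
    encoder, decoder = AEDRNNNetwork().build_network(input_shape=(100, 2))
    # the reconstruction should match the (timesteps, channels) input
    assert decoder.output_shape[1:] == (100, 2)

def test_invalid_param_raises():
    with pytest.raises(ValueError):
        AEDRNNNetwork(n_layers=0).build_network(input_shape=(100, 2))
```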
Looks good now? @hadifawaz1999
You seem to be dilating all of the encoder's layers now, why the change? @aadya940
Oh, by the way @hadifawaz1999, I forgot to tell you that the dilation layers are placed as they currently are because this is the configuration that works with `temporal_latent_space = True`. This is a little different from the paper implementation, because they don't use dilation in the decoder, but we decided to use it. I hope it makes sense :))
If we aim for symmetrical `_TensorDilation` layer configurations for the encoder and decoder, the output shape of the decoder deviates from what it should be, e.g. instead of (100, 2), the output shape would turn out to be (50, 2). That's why I've built the network in such a way that we don't have this issue.
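To make the shape deviation concrete, here is a small sketch; plain strided slicing stands in for `_TensorDilation`, on the assumption that the dilation layer keeps every `dilation`-th timestep, which is a common way dilated RNNs are implemented:

```python
import tensorflow as tf

# Stand-in for _TensorDilation: assume it keeps every `dilation`-th
# timestep, a common implementation of dilated RNNs.
x = tf.zeros((8, 100, 2))   # (batch, timesteps, channels)
dilated = x[:, ::2, :]      # dilation rate 2 halves the time axis
print(dilated.shape)        # (8, 50, 2) -- not the (8, 100, 2) we need back
```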
Aaahh okay, okay, I see what you did there :) I understand why you did the condition now. Okay, keep it like that then, thanks for the explanation.
You should merge main to remove the pre-commit fail @aadya940
#1962