
[ENH] Add AEDRNNNetwork

Open aadya940 opened this issue 1 year ago • 13 comments

Implements a network based on Dilated Recurrent Neural Networks.
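As a rough illustration of the idea (a hedged sketch, not aeon's actual implementation): a dilated recurrent layer with rate `d` only connects hidden states that are `d` timesteps apart, and one simple way to realize that is to split the sequence into `d` interleaved sub-sequences, each of which an ordinary RNN can then process. The helper name below is hypothetical:

```python
import numpy as np

def split_dilated(x, rate):
    """Split a (timesteps, channels) array into `rate` interleaved parts.

    Hypothetical helper: each part contains every `rate`-th timestep,
    so a plain RNN run over a part sees connections `rate` steps apart.
    """
    return [x[offset::rate] for offset in range(rate)]

x = np.arange(12.0).reshape(6, 2)   # 6 timesteps, 2 channels
subs = split_dilated(x, rate=2)     # two sub-sequences, each of shape (3, 2)
```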

aadya940 avatar May 27 '24 15:05 aadya940

Thank you for contributing to aeon

I have added the following labels to this PR based on the title: [ $\color{#FEF1BE}{\textsf{enhancement}}$ ]. I have added the following labels to this PR based on the changes made: [ $\color{#379E11}{\textsf{networks}}$ ]. Feel free to change these if they do not properly represent the PR.

The Checks tab will show the status of our automated tests. You can click on individual test runs in the tab or "Details" in the panel below to see more information if there is a failure.

If our pre-commit code quality check fails, any trivial fixes will automatically be pushed to your PR unless it is a draft.

Don't hesitate to ask questions on the aeon Slack channel if you have any.

aeon-actions-bot[bot] avatar May 27 '24 15:05 aeon-actions-bot[bot]

@MatthewMiddlehurst Do the tests look okay?

aadya940 avatar Jun 03 '24 08:06 aadya940

Coverage reports are being uploaded now, see the checks.

Report for the network is here: https://app.codecov.io/gh/aeon-toolkit/aeon/pull/1577/blob/aeon/networks/_ae_drnn.py

MatthewMiddlehurst avatar Jun 04 '24 10:06 MatthewMiddlehurst

Can we recheck the coverage? @MatthewMiddlehurst

aadya940 avatar Jun 04 '24 11:06 aadya940

You can navigate to it in the actions tab.

MatthewMiddlehurst avatar Jun 04 '24 14:06 MatthewMiddlehurst

Now it's at 95% coverage, which is okay I guess? @MatthewMiddlehurst

aadya940 avatar Jun 05 '24 07:06 aadya940

It would be good to cover the partials if possible, but I'm not going to demand 100%. Keep in mind that testing is more than coverage. There's probably not a lot to do here since it's a single method and there is general testing as well, but we do want to make sure that the output is correct and that errors are properly raised where possible.

MatthewMiddlehurst avatar Jun 05 '24 13:06 MatthewMiddlehurst

Looks good now? @hadifawaz1999

aadya940 avatar Jul 29 '24 08:07 aadya940

> Looks good now? @hadifawaz1999

You seem to be dilating all of the encoder's layers now. Why the change? @aadya940

hadifawaz1999 avatar Jul 30 '24 07:07 hadifawaz1999

Oh, by the way @hadifawaz1999, I forgot to tell you that the dilation layers are placed as they currently are because this is the configuration that works with temporal_latent_space = True. This is a little different from the paper implementation, because they don't use dilation in the decoder, but we decided to use it. I hope it makes sense :))

If we aim for symmetrical _TensorDilation layer configurations for the encoder and decoder, the output shape of the decoder deviates from what it should be, e.g. instead of (100, 2), the output shape turns out to be (50, 2). That's why I've built the network in a way that avoids this issue.
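The shape mismatch can be sketched numerically. Below is a toy stand-in for _TensorDilation that keeps every other timestep, paired with a hypothetical nearest-neighbour upsampling step in the decoder; the function names and rates are illustrative, not the actual layers:

```python
import numpy as np

def dilate(x, rate=2):
    """Toy stand-in for _TensorDilation: keep every `rate`-th timestep."""
    return x[::rate]

def upsample(x, rate=2):
    """Hypothetical decoder upsampling: repeat each timestep `rate` times."""
    return np.repeat(x, rate, axis=0)

x = np.zeros((100, 2))            # (timesteps, channels)
z = dilate(x)                     # encoder output: (50, 2)
out_asym = upsample(z)            # decoder WITHOUT dilation: (100, 2), matches input
out_sym = upsample(dilate(z))     # symmetric dilation in the decoder: (50, 2), mismatch
```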

aadya940 avatar Jul 30 '24 10:07 aadya940

> Oh by the way @hadifawaz1999, I forgot to tell you that dilation layers are placed as they currently are because this is the configuration that works with temporal_latent_space = True, this is a little different from the paper implementation because they don't use dilation in the decoder but we decided to use it. I hope it makes sense :))
>
> If we aim for symmetrical _TensorDilation layer configurations for encoder and decoder, the output shape of the decoder deviates from what it should be eg: Instead of (100, 2), the output shape would turn out (50, 2). Thats why I've built the network in such a way that we don't have this issue.

Aaahh okay, I see what you did there :) I understand why you added the condition now. Okay, keep it like that then, thanks for the explanation.

hadifawaz1999 avatar Jul 30 '24 12:07 hadifawaz1999

You should merge main to remove the pre-commit failure @aadya940

hadifawaz1999 avatar Aug 02 '24 19:08 hadifawaz1999

#1962

aadya940 avatar Aug 13 '24 10:08 aadya940