
RNN default sequence length

klemenjak opened this issue on Aug 27 '20 · 3 comments

Hi,

not an issue, just a question. I noticed that the default sequence length for RNN is 19, whereas it's 99 for all the other models. Is that on purpose? If so, why?

class RNN(Disaggregator):

    def __init__(self, params):
        """
        Parameters to be specified for the model
        """

        self.MODEL_NAME = "RNN"
        self.models = OrderedDict()
        self.chunk_wise_training = params.get('chunk_wise_training', False)
        self.sequence_length = params.get('sequence_length', 19)

RNN works fine for me, I'm just curious why 19 is set as the default.
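
For what it's worth, overriding it through the params dict works as expected. A minimal sketch (assuming the usual nilmtk-contrib import path; the print is only there to illustrate the attribute):

from nilmtk_contrib.disaggregate import RNN

# Passing sequence_length explicitly overrides the default of 19;
# the remaining parameters keep the defaults set in __init__.
rnn = RNN({'sequence_length': 99})
print(rnn.sequence_length)  # -> 99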

best, C

klemenjak · Aug 27 '20 14:08

Apparently the commit that changed that was squashed with a bunch of other things: https://github.com/nilmtk/nilmtk-contrib/commit/36d4ec201ac7c77a3b5662fcc09548e0bfd53eee#diff-5292385d52af1868a9aa740238f71e71L37

@Rithwikksvr, do you remember that?

The original RNN paper uses longer sequences, though the exact length varies by appliance.

PMeira · Aug 27 '20 14:08

@PMeira @klemenjak

I don't remember setting the sequence length to 19.

There's no particular reason to choose 19 as the sequence length. It must have been a mistake!

Rithwikksvr · Aug 27 '20 15:08

@Rithwikksvr Thanks for the quick reply!

At a glance, the other changes in that commit look good. Let's keep this open until that is checked and fixed. I have a clean-up draft at https://github.com/nilmtk/nilmtk-contrib/pull/32 and could add a commit there.

PMeira · Aug 27 '20 21:08