seq2seq
Bucketed model with shared weights
Is there a way to share the internal layers' weights of a Seq2Seq model (preferably AttentionSeq2Seq) so it can be trained with buckets, or is that supposed to be implemented by the user? :)
@farizrahman4u I've just managed to:
- create a model with explicitly specified input_length and output_length for one bucket
- save its weights
- create a model for another bucket (again setting its input and output lengths explicitly)
- load the first model's weights into it
- fit some data -- and nothing crashed :)
What is the meaning of input_length and output_length if they don't affect the topology? And is the approach described above OK for training bucketed models?
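The reason the weight transfer above works can be illustrated with a minimal NumPy sketch (this is hypothetical illustration code, not the seq2seq library's internals): a recurrent layer's weight matrices depend only on the input and hidden dimensions, never on the sequence length, so the same weights apply to any bucket.

```python
import numpy as np

def init_rnn_weights(input_dim, hidden_dim, seed=0):
    # Weight shapes depend only on input_dim and hidden_dim --
    # not on the sequence length. This is why weights saved from
    # one bucket's model load cleanly into another bucket's model.
    rng = np.random.default_rng(seed)
    return {
        "W_x": rng.normal(size=(input_dim, hidden_dim)),
        "W_h": rng.normal(size=(hidden_dim, hidden_dim)),
        "b": np.zeros(hidden_dim),
    }

def run_rnn(weights, x):
    # x has shape (timesteps, input_dim); timesteps may vary per bucket.
    h = np.zeros(weights["b"].shape)
    for x_t in x:
        h = np.tanh(x_t @ weights["W_x"] + h @ weights["W_h"] + weights["b"])
    return h

w = init_rnn_weights(input_dim=4, hidden_dim=8)
short_seq = np.ones((5, 4))    # a bucket with input_length=5
long_seq = np.ones((12, 4))    # a bucket with input_length=12

# The very same weight dict handles both lengths.
h_short = run_rnn(w, short_seq)
h_long = run_rnn(w, long_seq)
```

Under this view, input_length and output_length mainly fix the static unrolled graph for a given model instance, while the learned parameters stay length-independent.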