
Questions about Shared Parameters

Open nemoHy opened this issue 3 years ago • 3 comments

Hi! Congratulations! I have a question: why do you share parameters between those sub-networks? Is there any motivation beyond reducing the number of parameters, such as theory, explanations, or experiments supporting it? I would appreciate a reply as soon as you can.

nemoHy commented Nov 30 '21 11:11

Thank you for your reply! @JihyongOh Actually, I was wondering whether parameters can be shared among those sub-networks. The results show that this method is useful, so I think there should be a reason why those parameters can be shared. Maybe this method could also be applied to other similar tasks.

nemoHy commented Dec 01 '21 11:12

@nemoHy To clarify, the three sub-networks (BiFlownet, TFlownet, Refinement Block) in Fig. 4 do not share parameters with each other, but each of them can be shared across scale levels as in Fig. 3. You can also verify in the provided PyTorch code that those three sub-networks are independent (not shared).
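
For readers unfamiliar with the distinction, here is a minimal sketch of what sharing one sub-network across scale levels means in PyTorch. This is not the actual XVFI code; the module, its layers, and all names (`TinyFlowNet`, `SharedAcrossScales`) are illustrative. The point is that a single module instance, and therefore a single set of weights, is called once per scale level:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyFlowNet(nn.Module):
    """Stand-in for one sub-network (hypothetical; not XVFI's BiFlownet)."""
    def __init__(self, ch=16):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(6, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, 2, 3, padding=1),  # 2-channel flow-like output
        )

    def forward(self, x):
        return self.body(x)

class SharedAcrossScales(nn.Module):
    def __init__(self, num_scales=3):
        super().__init__()
        self.num_scales = num_scales
        # One instance: its parameters are reused at every scale level.
        self.flow_net = TinyFlowNet()

    def forward(self, frame0, frame1):
        flows = []
        for s in reversed(range(self.num_scales)):  # coarse to fine
            scale = 1 / (2 ** s)
            f0 = F.interpolate(frame0, scale_factor=scale,
                               mode='bilinear', align_corners=False)
            f1 = F.interpolate(frame1, scale_factor=scale,
                               mode='bilinear', align_corners=False)
            # The same self.flow_net (same weights) runs at each scale.
            flows.append(self.flow_net(torch.cat([f0, f1], dim=1)))
        return flows

model = SharedAcrossScales()
out = model(torch.randn(1, 3, 64, 64), torch.randn(1, 3, 64, 64))
print([f.shape for f in out])  # flow maps at 16x16, 32x32, 64x64
```

By contrast, "not shared" between sub-networks simply means three separate module instances (e.g., `self.biflownet`, `self.tflownet`, `self.refine`), each with its own parameters.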

JihyongOh commented Dec 02 '21 01:12

Thank you for your reply again! Sorry for my unclear wording. @hjSim @JihyongOh I understand that the three sub-networks (BiFlownet, TFlownet, Refinement Block) in Fig. 4 are not shared with each other; my question is about the sharing across scale levels. I know it saves parameters and works well in practice, but is there any theoretical explanation for why the sub-networks can be shared across scales and still work so well?

nemoHy commented Dec 02 '21 03:12