
Support Enum Modules

Gadersd opened this issue

Feature description

Support deriving Module for enums to enable code such as

#[derive(Module, Debug, Clone)]
enum Layer<B: Backend> {
    Slow(LayerBig<B>), 
    Fast(LayerSmall<B>), 
}

Feature motivation

This feature would make it easy to switch between layer variants and enable storing a vector of layers of different types.
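For illustration, a minimal sketch (not from the issue) of what this would allow, assuming hypothetical LayerBig/LayerSmall modules that each expose a forward(Tensor<B, 3>) -> Tensor<B, 3> method: the enum dispatches its forward pass with a match, and a Vec<Layer<B>> can hold a mix of both variants.

use burn::tensor::{backend::Backend, Tensor};

impl<B: Backend> Layer<B> {
    fn forward(&self, x: Tensor<B, 3>) -> Tensor<B, 3> {
        // Dispatch to whichever variant this layer holds.
        match self {
            Layer::Slow(layer) => layer.forward(x),
            Layer::Fast(layer) => layer.forward(x),
        }
    }
}

fn forward_sequential<B: Backend>(layers: &[Layer<B>], input: Tensor<B, 3>) -> Tensor<B, 3> {
    // Thread the activation through each layer in order.
    layers.iter().fold(input, |x, layer| layer.forward(x))
}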

Gadersd avatar Aug 29 '23 19:08 Gadersd

Yesss!

nathanielsimard avatar Aug 29 '23 19:08 nathanielsimard

Related ticket: https://github.com/burn-rs/burn/issues/583. Someone shared how to use an enum for a sequential forward pass.

antimora avatar Sep 08 '23 03:09 antimora

Someone filed a ticket regarding this: https://github.com/Tracel-AI/burn/issues/983

antimora avatar Nov 21 '23 15:11 antimora

Hey folks,

My case is that I have a PyTorch model with Sequential, and without enum support in the Module derive, it would mean implementing Module by hand just to support a Vec<CustomEnum>. If I understand correctly, that means implementing Module for the enum itself, then Record for the module's record struct and Item for an item struct. I had a look, and none of this seems trivial.

It seems that loading something like the following is not really straightforward at the moment.

import torch 

class CustomLayer(torch.nn.Module):
    def __init__(self, channels, val):
        super().__init__()
        # Register as a buffer so the tensor ends up in state_dict();
        # a plain attribute would not be saved by torch.save below.
        self.register_buffer("something", torch.ones(1, channels, 1) * val)

    def forward(self, x):
        return x * self.something
        

class Net(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.block = torch.nn.Sequential(
            torch.nn.Conv1d(1, 32, 7),
            torch.nn.Tanh(),
            torch.nn.ConvTranspose1d(32, 22, 8),
            CustomLayer(22, 100),
        )
    def forward(self, x):
        return self.block(x)


model = Net()
res = model(torch.ones(1,1,400))

torch.save(model.state_dict(), "dummy.pt")

I haven't tried to see whether the PyTorchRecorder would actually allow this if I implemented Module for an enum manually, because I haven't yet grasped how to do that, as mentioned above. I appreciate that my point here is focused in part on the PyTorchRecorder capability, but it seems to depend on Module derive for enums being implemented just to test it.

Are there any examples in the codebase I'm missing that can be used to guide an implementation of something like this from scratch?

finnkauski avatar Feb 11 '24 11:02 finnkauski

@finnkauski, based on your example, I believe it would be optimal to encapsulate each module within a struct. The usage of Sequential in your example seems to focus on automatically generating a forward pass rather than facilitating the addition of a dynamic number of different modules in a list. If you intend to load weights from a PyTorch recorder, a straightforward approach would be to remap block.seq.0 to block.conv1d, and so forth.
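As a rough illustration of that suggestion, here is a sketch of the Sequential above rewritten as a plain struct module. The field names, the choice of applying tanh as a tensor op, and the storage of the CustomLayer scale as a Param are assumptions, not code from the thread.

use burn::module::{Module, Param};
use burn::nn::conv::{Conv1d, ConvTranspose1d};
use burn::tensor::{backend::Backend, Tensor};

#[derive(Module, Debug)]
struct Net<B: Backend> {
    conv: Conv1d<B>,
    conv_t: ConvTranspose1d<B>,
    // The CustomLayer scale, stored directly as a parameter of Net.
    something: Param<Tensor<B, 3>>,
}

impl<B: Backend> Net<B> {
    fn forward(&self, x: Tensor<B, 3>) -> Tensor<B, 3> {
        let x = self.conv.forward(x);
        let x = x.tanh();
        let x = self.conv_t.forward(x);
        x * self.something.val()
    }
}

When loading dummy.pt, the PyTorch keys produced by Sequential (block.0.*, block.2.*, and so on) would then be remapped onto conv.*, conv_t.*, something, for example with burn-import's LoadArgs::with_key_remap; the exact key names and recorder call depend on the burn version.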

nathanielsimard avatar Feb 12 '24 15:02 nathanielsimard

This is just a toy example, though; in the real model there are dynamic factors at play that dictate the structure and how many of certain layers there are, based on, say, the config values.
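For context, a hypothetical sketch of that kind of config-driven construction (LayerBig::new, LayerSmall::new, and the threshold are made up), which is where a Vec of enum modules pays off over a fixed struct:

use burn::tensor::backend::Backend;

fn build_layers<B: Backend>(channel_sizes: &[usize], device: &B::Device) -> Vec<Layer<B>> {
    channel_sizes
        .iter()
        .map(|&channels| {
            // Pick a variant per entry based on the config value.
            if channels > 256 {
                Layer::Slow(LayerBig::new(channels, device))
            } else {
                Layer::Fast(LayerSmall::new(channels, device))
            }
        })
        .collect()
}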

finnkauski avatar Feb 12 '24 19:02 finnkauski

Closed with #1337

If you're looking for named enum support, see #1343

laggui avatar Feb 22 '24 12:02 laggui