Support Enum Modules
Feature description
Support deriving Module for enums to enable code such as
#[derive(Module, Debug, Clone)]
enum Layer<B: Backend> {
    Slow(LayerBig<B>),
    Fast(LayerSmall<B>),
}
Feature motivation
This feature would make it easy to switch between layer variants and enable storing a vector of layers of different types.
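For illustration, here is a rough sketch of what this could enable, assuming the derive behaves the same way for enums as it does for structs. LayerBig and LayerSmall are hypothetical modules (wrapped around Linear here only to keep the snippet self-contained), and the forward signatures are my own assumption rather than anything prescribed by Burn:

use burn::module::Module;
use burn::nn::Linear;
use burn::tensor::{backend::Backend, Tensor};

// Two hypothetical layer types that happen to share a forward signature.
#[derive(Module, Debug)]
struct LayerBig<B: Backend> {
    inner: Linear<B>,
}

#[derive(Module, Debug)]
struct LayerSmall<B: Backend> {
    inner: Linear<B>,
}

impl<B: Backend> LayerBig<B> {
    fn forward(&self, x: Tensor<B, 2>) -> Tensor<B, 2> {
        self.inner.forward(x)
    }
}

impl<B: Backend> LayerSmall<B> {
    fn forward(&self, x: Tensor<B, 2>) -> Tensor<B, 2> {
        self.inner.forward(x)
    }
}

// The requested feature: derive Module directly on an enum of layers.
#[derive(Module, Debug)]
enum Layer<B: Backend> {
    Slow(LayerBig<B>),
    Fast(LayerSmall<B>),
}

impl<B: Backend> Layer<B> {
    // Dispatch to whichever variant is stored.
    fn forward(&self, x: Tensor<B, 2>) -> Tensor<B, 2> {
        match self {
            Layer::Slow(m) => m.forward(x),
            Layer::Fast(m) => m.forward(x),
        }
    }
}

// A heterogeneous "sequential" then becomes a plain Vec of the enum.
#[derive(Module, Debug)]
struct Net<B: Backend> {
    layers: Vec<Layer<B>>,
}

impl<B: Backend> Net<B> {
    fn forward(&self, mut x: Tensor<B, 2>) -> Tensor<B, 2> {
        for layer in &self.layers {
            x = layer.forward(x);
        }
        x
    }
}

The match in Layer::forward is the manual dispatch piece: forward is not part of the Module trait, so the enum would still define its own forward and route to the active variant.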
Yesss!
Related ticket: https://github.com/burn-rs/burn/issues/583, where someone shared how to use an enum for a sequential forward pass.
Someone filed a ticket regarding this: https://github.com/Tracel-AI/burn/issues/983
Hey folks,
My case is that I have a PyTorch model built with Sequential. Without enum support in the Module derive, porting it would mean implementing Module by hand just to support a Vec<CustomEnum>, which, if I understand correctly, means implementing Module for the enum itself, then Record for the module's record struct and Item for an item struct. I had a look, and none of this seems trivial.
It seems that loading something like this is not really trivial at the moment:
import torch

class CustomLayer(torch.nn.Module):
    def __init__(self, channels, val):
        super().__init__()
        self.something = torch.ones(1, channels, 1) * val

    def forward(self, x):
        return x * self.something

class Net(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.block = torch.nn.Sequential(
            *[torch.nn.Conv1d(1, 32, 7), torch.nn.Tanh(), torch.nn.ConvTranspose1d(32, 22, 8), CustomLayer(22, 100)]
        )

    def forward(self, x):
        return self.block(x)

model = Net()
res = model(torch.ones(1, 1, 400))
torch.save(model.state_dict(), "dummy.pt")
I haven't tried whether the PyTorchRecorder would actually allow this if I implemented Module for an enum manually, because, as mentioned above, I haven't yet grasped how to do that. I appreciate that this point is partly about the PyTorchRecorder's capability, but testing it seems to depend on the Module derive for enums being implemented.
Are there any examples in the codebase I'm missing that can be used to guide an implementation of something like this from scratch?
@finnkauski, based on your example, I believe it would be optimal to encapsulate each module within a struct. The usage of Sequential in your example seems to focus on automatically generating a forward pass rather than facilitating the addition of a dynamic number of different modules in a list. If you intend to load weights from a PyTorch recorder, a straightforward approach would be to remap block.seq.0 to block.conv1d, and so forth.
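To make the struct-encapsulation suggestion concrete, a rough sketch of the Burn side might look like the code below. The field names, the use of Param for the custom layer's tensor, and the key-remapping calls in the trailing comment are assumptions on my part (the exact burn-import LoadArgs / PyTorchFileRecorder usage can differ between versions), not verified code. Note also that in the Python snippet above, something is a plain attribute rather than a registered parameter or buffer, so it would not end up in the saved state dict in any case.

use burn::module::{Module, Param};
use burn::nn::conv::{Conv1d, ConvTranspose1d};
use burn::tensor::{activation::tanh, backend::Backend, Tensor};

// One named field per Sequential entry; Tanh has no parameters,
// so it stays a plain function call inside forward.
#[derive(Module, Debug)]
struct Net<B: Backend> {
    conv1d: Conv1d<B>,
    conv_transpose1d: ConvTranspose1d<B>,
    // Counterpart of CustomLayer's scaling tensor.
    something: Param<Tensor<B, 3>>,
}

impl<B: Backend> Net<B> {
    fn forward(&self, x: Tensor<B, 3>) -> Tensor<B, 3> {
        let x = self.conv1d.forward(x);
        let x = tanh(x);
        let x = self.conv_transpose1d.forward(x);
        x * self.something.val()
    }
}

// Loading "dummy.pt" would then roughly amount to remapping the
// Sequential indices onto the named fields, e.g. (API details may
// differ between burn-import versions):
//
//   let args = LoadArgs::new("dummy.pt".into())
//       .with_key_remap(r"block\.0\.(.+)", "conv1d.$1")
//       .with_key_remap(r"block\.2\.(.+)", "conv_transpose1d.$1");
//   let record = PyTorchFileRecorder::<FullPrecisionSettings>::default()
//       .load(args, &device)?;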
To be clear, this is a toy example; in the real model there are dynamic factors that dictate the structure and how many of certain layers there are, based on, say, config values.
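For what it's worth, once enum modules are supported, that kind of config-driven structure could look roughly like the sketch below. Block, the width threshold, and the LinearConfig::new(..).init(&device) call are illustrative assumptions (the init signature has changed between Burn versions):

use burn::module::Module;
use burn::nn::{Linear, LinearConfig};
use burn::tensor::backend::Backend;

// Hypothetical enum of layer kinds; the mix is decided by config at init time.
#[derive(Module, Debug)]
enum Block<B: Backend> {
    Wide(Linear<B>),
    Narrow(Linear<B>),
}

// Hypothetical config-driven construction: how many blocks there are, and
// which variant each one is, depends on runtime configuration values.
fn init_blocks<B: Backend>(widths: &[usize], device: &B::Device) -> Vec<Block<B>> {
    widths
        .iter()
        .map(|&w| {
            if w > 128 {
                Block::Wide(LinearConfig::new(w, w).init(device))
            } else {
                Block::Narrow(LinearConfig::new(w, w).init(device))
            }
        })
        .collect()
}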
Closed with #1337
If you're looking for named enum support, see #1343