LoRA
T(w) problem
I'm confused about why T(w) is defined as an inner function here: https://github.com/microsoft/LoRA/blob/dc5d1744fa9430edda10bc233a9efc65e9239f50/loralib/layers.py#L128
When I run torch.jit.script on this module, it reports the following error:

```
torch.jit.frontend.UnsupportedNodeError: function definitions aren't supported:
    def forward(self, x: torch.Tensor):
        def T(w):
        ~~~ <--- HERE
            return w.transpose(0, 1) if self.fan_in_fan_out else w
        if self.r > 0 and not self.merged:
```
Is there any way to fix it? Writing T(w) as a class method works for me.
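For reference, here is a minimal, self-contained sketch of that workaround (my own toy classes, not the actual loralib layer; the LoRA-specific branches are omitted). The nested-def version reproduces the UnsupportedNodeError, while moving T into a method lets torch.jit.script compile the module:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class NestedT(nn.Linear):
    """Mimics the linked pattern: T(w) defined inside forward (toy example, not the real LoRA layer)."""

    def __init__(self, in_features: int, out_features: int, fan_in_fan_out: bool = False):
        super().__init__(in_features, out_features)
        self.fan_in_fan_out = fan_in_fan_out

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        def T(w):  # nested def -> UnsupportedNodeError under torch.jit.script
            return w.transpose(0, 1) if self.fan_in_fan_out else w
        return F.linear(x, T(self.weight), bias=self.bias)


class MethodT(nn.Linear):
    """Same logic, but T is a method, which TorchScript can compile."""

    def __init__(self, in_features: int, out_features: int, fan_in_fan_out: bool = False):
        super().__init__(in_features, out_features)
        self.fan_in_fan_out = fan_in_fan_out

    def T(self, w: torch.Tensor) -> torch.Tensor:
        # Transpose only when the wrapped layer stores weights as (in_features, out_features),
        # mirroring the fan_in_fan_out handling in the linked code.
        return w.transpose(0, 1) if self.fan_in_fan_out else w

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return F.linear(x, self.T(self.weight), bias=self.bias)


try:
    torch.jit.script(NestedT(4, 8))
except Exception as e:
    print(type(e).__name__)  # UnsupportedNodeError

scripted = torch.jit.script(MethodT(4, 8))
print(scripted(torch.randn(2, 4)).shape)  # torch.Size([2, 8])
```

Since forward is the only caller of T, TorchScript compiles the method along with forward; a module-level free function would presumably work just as well.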