
Is this if statement necessary?

Open darthjaja6 opened this issue 10 months ago • 0 comments

https://github.com/hkproj/pytorch-lora/blob/a2bdeadabd4ffcaf99b61d31579b7c8abe3f1af4/lora.ipynb#L459

When I do

for name, param in net.named_parameters():
  print(name)

I get this output:

linear1.bias
linear1.parametrizations.weight.original
linear2.bias
linear2.parametrizations.weight.original
linear3.bias
linear3.parametrizations.weight.original

So it seems that the parametrization doesn't add any named parameters with "lora" in the path anyway. I wrote a simple experiment:

import torch
import torch.nn as nn
import torch.nn.utils.parametrize as parametrize

class WeightParametrization(nn.Module):
    def __init__(self):
        super().__init__()

    def forward(self, weight):
        return weight * 2  # a simple example; in practice this could be a more complex transform

class MyLinear(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(10, 5)

    def forward(self, x):
        return self.linear(x)

model = MyLinear()
print("Before parametrization:")
for name, param in model.named_parameters():
    print(name, param.shape)

# Register the parametrization
parametrization = WeightParametrization()
parametrize.register_parametrization(model.linear, 'weight', parametrization)

print("\nAfter parametrization:")
for name, param in model.named_parameters():
    print(name, param.shape)  # observe whether the weight parameters change

So it seems that after registering a parametrization, named_parameters doesn't actually gain any new parameters. What do you think? Did I miss something?
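For comparison, here is a hedged sketch of the other case: if the parametrization module itself owns `nn.Parameter` objects (as the LoRA parametrization in the notebook does with its A and B matrices), those parameters should appear in `named_parameters()` under `parametrizations.weight.0.<name>`. The `AddDelta` class and its `delta` parameter below are hypothetical names, not from the repo:

```python
import torch
import torch.nn as nn
import torch.nn.utils.parametrize as parametrize

# Hypothetical parametrization that, unlike the doubling example above,
# owns a trainable parameter of its own (a simplified LoRA-style delta).
class AddDelta(nn.Module):
    def __init__(self, out_features, in_features):
        super().__init__()
        self.delta = nn.Parameter(torch.zeros(out_features, in_features))

    def forward(self, weight):
        return weight + self.delta

linear = nn.Linear(10, 5)
parametrize.register_parametrization(linear, 'weight', AddDelta(5, 10))

for name, _ in linear.named_parameters():
    print(name)
```

If my understanding is right, this prints `bias`, `parametrizations.weight.original`, and `parametrizations.weight.0.delta`, i.e. the new parameter does show up, because it belongs to the registered parametrization module.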

darthjaja6 avatar Apr 24 '24 22:04 darthjaja6