out_proj.weight typo fix
There is a typo in the initialize_parameters() function: out_proj.weight should be out_proj_weight.
https://github.com/openai/CLIP/blob/dcba3cb2e2827b402d2701e7e1c7d9fed8a20ef1/clip/model.py#L321
What makes you think so?
@99991 Because earlier in the file it appears as a parameter, defined here: https://github.com/openai/CLIP/blob/dcba3cb2e2827b402d2701e7e1c7d9fed8a20ef1/clip/model.py#L85
The line from your first message initializes block.attn.out_proj.weight for each block in self.transformer.resblocks. There, block.attn is an nn.MultiheadAttention, which defines an out_proj submodule, so out_proj.weight is correct:
- https://github.com/openai/CLIP/blob/dcba3cb2e2827b402d2701e7e1c7d9fed8a20ef1/clip/model.py#L321
- https://github.com/openai/CLIP/blob/dcba3cb2e2827b402d2701e7e1c7d9fed8a20ef1/clip/model.py#L319
- https://github.com/openai/CLIP/blob/dcba3cb2e2827b402d2701e7e1c7d9fed8a20ef1/clip/model.py#L282
- https://github.com/openai/CLIP/blob/dcba3cb2e2827b402d2701e7e1c7d9fed8a20ef1/clip/model.py#L200
- https://github.com/openai/CLIP/blob/dcba3cb2e2827b402d2701e7e1c7d9fed8a20ef1/clip/model.py#L175
- https://github.com/pytorch/pytorch/blob/3f77002b968c093b2b668fed24d85f6a365d6b3c/torch/nn/modules/activation.py#L1102
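For reference, a minimal sketch (dimensions chosen for illustration, not CLIP's actual configuration) showing that nn.MultiheadAttention exposes its output projection as out_proj, so out_proj.weight can be initialized exactly as initialize_parameters() does:

```python
import torch.nn as nn

# nn.MultiheadAttention stores its output projection as the `out_proj`
# submodule (a Linear layer), so `attn.out_proj.weight` exists and can
# be initialized directly, as CLIP does for its transformer blocks.
attn = nn.MultiheadAttention(embed_dim=512, num_heads=8)
nn.init.normal_(attn.out_proj.weight, std=0.02)
print(attn.out_proj.weight.shape)  # torch.Size([512, 512])
```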
But in your last message, you highlight self.c_proj.weight, which is only passed as the out_proj_weight keyword argument of F.multi_head_attention_forward. self.c_proj is not a member of self.transformer.resblocks; it is a member of AttentionPool2d, which is only used in ModifiedResNet, i.e. self.visual of CLIP:
- https://github.com/openai/CLIP/blob/dcba3cb2e2827b402d2701e7e1c7d9fed8a20ef1/clip/model.py#L85
- https://github.com/openai/CLIP/blob/dcba3cb2e2827b402d2701e7e1c7d9fed8a20ef1/clip/model.py#L58C7-L58C22
- https://github.com/openai/CLIP/blob/dcba3cb2e2827b402d2701e7e1c7d9fed8a20ef1/clip/model.py#L127
- https://github.com/openai/CLIP/blob/dcba3cb2e2827b402d2701e7e1c7d9fed8a20ef1/clip/model.py#L264
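For contrast, a minimal sketch mirroring AttentionPool2d.forward (with made-up dimensions and hypothetical layer names): out_proj_weight is just a keyword argument of the functional API, to which the c_proj weight is passed; it is not an attribute of the transformer residual blocks:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

embed_dim, num_heads, seq_len, batch = 64, 4, 50, 2

# Separate projection layers, as in AttentionPool2d (names for illustration).
q_proj = nn.Linear(embed_dim, embed_dim)
k_proj = nn.Linear(embed_dim, embed_dim)
v_proj = nn.Linear(embed_dim, embed_dim)
c_proj = nn.Linear(embed_dim, embed_dim)  # plays the role of self.c_proj

x = torch.randn(seq_len, batch, embed_dim)
out, _ = F.multi_head_attention_forward(
    query=x[:1], key=x, value=x,
    embed_dim_to_check=embed_dim,
    num_heads=num_heads,
    q_proj_weight=q_proj.weight,
    k_proj_weight=k_proj.weight,
    v_proj_weight=v_proj.weight,
    in_proj_weight=None,
    in_proj_bias=torch.cat([q_proj.bias, k_proj.bias, v_proj.bias]),
    bias_k=None,
    bias_v=None,
    add_zero_attn=False,
    dropout_p=0,
    out_proj_weight=c_proj.weight,  # keyword argument of the functional API
    out_proj_bias=c_proj.bias,
    use_separate_proj_weight=True,
    need_weights=False,
)
print(out.shape)  # torch.Size([1, 2, 64])
```

So the attribute access block.attn.out_proj.weight in initialize_parameters() and the out_proj_weight keyword in AttentionPool2d refer to two different things, and neither is a typo.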