Video-ChatGPT
Is the linear layer initialized from LLaVA's linear layer?
Is the linear layer initialized from LLaVA's linear layer? I noticed that the `pretrain_mm_mlp_adapter` parameter is not set in the training script. Does this mean the linear layer is not initialized from LLaVA's weights?
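For context, in LLaVA-style code the `pretrain_mm_mlp_adapter` flag typically points to a checkpoint whose state dict is loaded into the vision-to-LLM projection layer; when the flag is unset, the layer keeps PyTorch's default random initialization. A minimal sketch of that pattern (the `build_projector` helper, the in-memory checkpoint, and the 1024/4096 dimensions are all illustrative assumptions, not the repo's actual code):

```python
import io

import torch
import torch.nn as nn

# Hypothetical shapes: vision feature dim -> LLM hidden dim.
VISION_DIM, HIDDEN_DIM = 1024, 4096

def build_projector(pretrain_mm_mlp_adapter=None):
    """Build the linear projector; copy pretrained weights only when a
    checkpoint is provided, otherwise keep the default random init."""
    proj = nn.Linear(VISION_DIM, HIDDEN_DIM)
    if pretrain_mm_mlp_adapter is not None:
        state = torch.load(pretrain_mm_mlp_adapter, map_location="cpu")
        proj.load_state_dict(state)
    return proj

# Simulate a LLaVA-pretrained adapter checkpoint, held in memory here.
pretrained = nn.Linear(VISION_DIM, HIDDEN_DIM)
buf = io.BytesIO()
torch.save(pretrained.state_dict(), buf)
buf.seek(0)

loaded = build_projector(buf)   # flag set: weights copied from the checkpoint
fresh = build_projector(None)   # flag unset: independent random init

print(torch.equal(loaded.weight, pretrained.weight))  # True
print(torch.equal(fresh.weight, pretrained.weight))   # False
```

So if the script never passes `pretrain_mm_mlp_adapter`, the projector would start from random weights rather than from LLaVA's pretrained adapter, which is presumably the behavior the question is asking about.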