LLaVA-NeXT
non-meta parameter vs meta parameter
When loading the model I get the following warning:

for vision_model.post_layernorm.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass assign=True to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)

How can this problem be solved?