CogVLM
[CogVLM-chat-v1.1] LM weights differ from vicuna-7b-v1.5
While CogVLM is trained, the LM weights are frozen.
From my observation, however, the LM weights of CogVLM are different from Vicuna's (see the comparison sketch below):
- Vicuna: https://huggingface.co/lmsys/vicuna-7b-v1.5/tree/main
- CogVLM: cogvlm-chat-v1.1 (both from HF or SAT)
Can I ask why this is, or what the proper source of the language model is?
- CogVLM-Chat-v1.1 (SAT)
- Vicuna-7B-v1.5
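For reference, here is a minimal sketch of how the overlapping LM parameters could be diffed, assuming the HF checkpoints `THUDM/cogvlm-chat-hf` and `lmsys/vicuna-7b-v1.5` (the model IDs and the assumption that some parameter names match one-to-one are mine, not confirmed by the repo; CogVLM renames its language-expert weights, so only parameters with identical names and shapes are compared):

```python
import torch
from transformers import AutoModelForCausalLM

# Load both checkpoints in fp16 to keep memory manageable.
vicuna = AutoModelForCausalLM.from_pretrained(
    "lmsys/vicuna-7b-v1.5", torch_dtype=torch.float16
)
cogvlm = AutoModelForCausalLM.from_pretrained(
    "THUDM/cogvlm-chat-hf", torch_dtype=torch.float16, trust_remote_code=True
)

vicuna_params = dict(vicuna.named_parameters())
cogvlm_params = dict(cogvlm.named_parameters())

shared = 0
with torch.no_grad():
    for name, v_param in vicuna_params.items():
        c_param = cogvlm_params.get(name)
        if c_param is None or c_param.shape != v_param.shape:
            # Vision/adapter weights and renamed language-expert weights
            # exist only on the CogVLM side; skip them.
            continue
        shared += 1
        max_diff = (c_param.float() - v_param.float()).abs().max().item()
        print(f"{name}: max abs diff = {max_diff:.6f}")

print(f"compared {shared} parameters with matching names/shapes")
```

If the LM were truly frozen from Vicuna-7B-v1.5, the max absolute differences for the shared parameters would be expected to be zero (or within fp16 conversion noise).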