Question: Where is the setting to freeze the backbone LLM, as in LLaVA?
# LLaVA
if model_args.freeze_backbone:
    model.model.requires_grad_(False)
In the LTU code, there is no such flag; a comment just notes that the LLM is already frozen:
# for audio params, lora always trainable, llama always frozen
for name, param in model.named_parameters():
    if trainable_params == 'all':
        if "audio" in name:
            param.requires_grad = True
            # print(f"Parameter: {name}, requires_grad: {param.requires_grad}")
    if trainable_params == 'proj':
        if "audio_proj" in name:
            param.requires_grad = True
            # print(f"Parameter: {name}, requires_grad: {param.requires_grad}")
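Assuming LTU follows the usual peft/LoRA setup (which the comment "lora always trainable, llama always frozen" suggests), an explicit freeze flag is unnecessary: wrapping the model with get_peft_model already sets requires_grad=False on every base weight, and the loop above only re-enables the audio parameters afterwards. Below is a minimal, self-contained sketch of that pattern; the Toy module and its q_proj / audio_proj layers are hypothetical stand-ins, not actual LTU code.

# Sketch: LoRA wrapping freezes all base weights implicitly.
# Assumes the Hugging Face `peft` library; `Toy`, `q_proj`, and
# `audio_proj` are made-up names for illustration only.
import torch.nn as nn
from peft import LoraConfig, get_peft_model

class Toy(nn.Module):
    def __init__(self):
        super().__init__()
        self.q_proj = nn.Linear(16, 16)      # stands in for an LLM weight
        self.audio_proj = nn.Linear(16, 16)  # stands in for an audio param

    def forward(self, x):
        return self.q_proj(x) + self.audio_proj(x)

# get_peft_model freezes ALL base parameters, then adds trainable LoRA adapters.
model = get_peft_model(Toy(), LoraConfig(target_modules=["q_proj"]))

# Mirror the LTU loop: re-enable only the audio parameters.
for name, param in model.named_parameters():
    if "audio" in name:
        param.requires_grad = True

for name, param in model.named_parameters():
    print(f"{name}: requires_grad={param.requires_grad}")

Running this prints requires_grad=True only for the lora_A/lora_B adapters and the audio projection; the base q_proj weight stays frozen. In other words, the backbone is frozen as a side effect of the LoRA wrapping rather than through a dedicated setting.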