r0

Results: 7 comments of r0

Do you think #536 would help here?

Hi @sanchit-gandhi and @connor-henderson, I saw the PR, but I was wondering if we also integrated `always_use_initial_prompt` and `condition_on_previous_text` into the API? If not, is there any active work going...
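For context, a minimal sketch of how the two related options look in the reference openai-whisper package; `always_use_initial_prompt` comes from other Whisper implementations and is not part of that package, and the audio file name is a placeholder:

```python
import whisper

# Sketch using the reference openai-whisper package, whose transcribe()
# exposes `initial_prompt` and `condition_on_previous_text`.
model = whisper.load_model("base")
result = model.transcribe(
    "audio.wav",                      # placeholder input file
    initial_prompt="Glossary: EnCodec, PEFT, LoRA",  # bias decoding toward domain terms
    condition_on_previous_text=True,  # feed prior output as context for later windows
)
print(result["text"])
```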

Okay, in case someone is not able to find it: you'll have to manually download the EnCodec weights from torch hub:

```python
import torch
url = 'https://dl.fbaipublicfiles.com/encodec/v0/encodec_24khz-d7cc33bc.th'
state = torch.hub.load_state_dict_from_url(url, map_location='cpu', ...
```
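For anyone following along, a completed sketch of the truncated call above; `check_hash=True` is an assumption here, based on the checkpoint filename carrying a hash prefix that torch.hub can verify:

```python
import torch

url = 'https://dl.fbaipublicfiles.com/encodec/v0/encodec_24khz-d7cc33bc.th'
# check_hash=True is an assumption: torch.hub checks the hash prefix in the
# filename (d7cc33bc) against the downloaded file.
state = torch.hub.load_state_dict_from_url(url, map_location='cpu', check_hash=True)
# `state` is an ordinary state_dict that can then be loaded into an EnCodec model.
```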

@imohitmayank I would also suggest setting the `ensure_weight_tying` flag to True in `LoraConfig` if you add the embedding layer to `modules_to_save`. This would keep the weight tying consistent and mark...

@imohitmayank Can you try the `ensure_weight_tying` flag with `modules_to_save`? Instead of passing `trainable_tokens`, can you please try passing the `embed_tokens` layer to `modules_to_save`?
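Putting the two suggestions above together, a minimal sketch of the intended setup; the model id, rank, and target modules are placeholders (any causal LM with tied input/output embeddings and an embedding module named `embed_tokens`), and `ensure_weight_tying` assumes a PEFT version that exposes this flag:

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Hypothetical model id: pick a causal LM with tied input/output embeddings
# and Llama-style module naming (`embed_tokens`).
model = AutoModelForCausalLM.from_pretrained("your-org/your-tied-lm")

config = LoraConfig(
    r=8,                                  # placeholder rank
    target_modules=["q_proj", "v_proj"],  # placeholder attention projections
    modules_to_save=["embed_tokens"],     # fully train the embedding layer instead of trainable_tokens
    ensure_weight_tying=True,             # keep the tied lm_head/embedding weights consistent
)
peft_model = get_peft_model(model, config)
peft_model.print_trainable_parameters()
```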

@imohitmayank Yes, you are correct. I am not sure what should be done here from the PEFT side; @BenjaminBossan would be the right person for that. But as far as we...