nlp_course
Fix GPU loading for multi-GPU in LoRA, loading to CPU in week 10 + minor typing fixes and typos
Fixed an issue in the PEFT homework: when custom LoRA layers are added to the network during multi-GPU training, the new low-rank tensors are placed on a device different from the full weight tensors, which raised an exception.
Also fixed the typos I could find and added annotations in functions where they may concern users.
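The device-mismatch fix described above can be sketched as follows. This is a hypothetical minimal example, not the actual homework code: the class name `LoRALinear` and the layout of the low-rank factors are assumptions. The key point is creating the low-rank parameters on the same device (and dtype) as the wrapped full weight, so that sharding a model across multiple GPUs does not mix devices.

```python
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    """Hypothetical LoRA wrapper around an existing nn.Linear."""

    def __init__(self, base: nn.Linear, rank: int = 8):
        super().__init__()
        self.base = base
        # Take device/dtype from the full weight tensor, so the new
        # low-rank tensors land on the same GPU as the layer they adapt.
        dev, dtype = base.weight.device, base.weight.dtype
        self.lora_a = nn.Parameter(
            torch.empty(rank, base.in_features, device=dev, dtype=dtype)
        )
        self.lora_b = nn.Parameter(
            torch.zeros(base.out_features, rank, device=dev, dtype=dtype)
        )
        # Standard LoRA init: A is random, B is zero, so the adapter
        # starts as a no-op and training begins from the base weights.
        nn.init.normal_(self.lora_a, std=0.02)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + x @ self.lora_a.T @ self.lora_b.T
```

Without the explicit `device=dev` arguments, `nn.Parameter(torch.empty(...))` defaults to CPU, and the forward pass fails with a cross-device matmul error once the base layer lives on a GPU.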
Thank you for the proposed changes. To make this PR suitable for testing and review, please re-submit it in a clean and focused manner:
- one PR per lesson
- no unneeded merge commits
- no metadata changes