Results: 2 issues of Rohit Kr Singh
Hi, below are the hyperparameters defined under [finetune/adapter.py](https://github.com/Lightning-AI/lit-gpt/blob/main/finetune/adapter.py)

```
# Hyperparameters
learning_rate = 3e-3
batch_size = 64 / devices
micro_batch_size = 4
gradient_accumulation_iters = batch_size // micro_batch_size
assert gradient_accumulation_iters >...
```
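A minimal sketch of how those hyperparameters combine (assuming a single-device run with `devices = 1`; the variable names mirror the snippet above, but the effective-batch arithmetic is my own illustration, not code from the repo):

```python
# Illustration: how the adapter.py hyperparameters interact.
devices = 1  # assumed single-GPU run
learning_rate = 3e-3
batch_size = 64 / devices  # note: true division yields a float (64.0)
micro_batch_size = 4
# floor division of a float still yields a float (16.0)
gradient_accumulation_iters = int(batch_size // micro_batch_size)
assert gradient_accumulation_iters > 0

# Gradients are accumulated over this many micro-batches per optimizer
# step, so the effective global batch size works out to:
effective_batch = micro_batch_size * gradient_accumulation_iters * devices
print(gradient_accumulation_iters, effective_batch)  # 16 64
```

With these defaults, each optimizer step sees 16 micro-batches of 4 samples, i.e. an effective batch of 64 regardless of device count.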
Hi Team, I'm fine-tuning a falcon-7B model using LoRA. I can see that PR #118, "Use FSDP Everywhere", was opened 3 weeks ago. The implementation is done for adapter...