ludwig
Fix to resume adapter training from existing adapter weights
Minor changes to enable continuing training of a previously trained adapter. This should resolve #3833.
@arnavgarg1 I have made these changes to resume training from an already trained adapter (using the safetensors weights rather than the training checkpoint), and they work for me. Two questions for you: will this take care of the TODO mentioned here? If not, does this PR fit the current architectural design ethos of the project? If the PR is acceptable, I will go ahead and add any tests required for it.
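For context, here is a minimal sketch of what "resuming from the safetensors weights" looks like with the Hugging Face PEFT library directly. This is not Ludwig's internal API, just an illustration of the underlying mechanism; the base model name and adapter path are placeholders. The key point is loading the saved adapter with `is_trainable=True`, so training continues from the existing weights instead of initializing a fresh adapter.

```python
# Minimal sketch (assumes Hugging Face transformers + peft, not Ludwig's
# internal API). Model name and adapter path are placeholders.
from transformers import AutoModelForCausalLM
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("facebook/opt-350m")

# Load previously trained adapter weights (adapter_model.safetensors +
# adapter_config.json). is_trainable=True keeps the adapter weights
# trainable so a new training run resumes from them, rather than
# starting a fresh adapter from the base checkpoint.
model = PeftModel.from_pretrained(base, "path/to/saved_adapter", is_trainable=True)

model.print_trainable_parameters()  # only the adapter weights remain trainable
```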
Unit Test Results
5 files -1   5 suites -1   11m 32s :stopwatch: -5m 35s
12 tests ±0   7 :heavy_check_mark: -2   5 :zzz: +2   0 :x: ±0
48 runs -12   23 :heavy_check_mark: -19   25 :zzz: +7   0 :x: ±0
Results for commit 6fc53d63. ± Comparison against base commit b6df7151.
This pull request skips 2 tests.
tests.regression_tests.benchmark.test_model_performance ‑ test_performance[ames_housing.gbm.yaml]
tests.regression_tests.benchmark.test_model_performance ‑ test_performance[mercedes_benz_greener.gbm.yaml]