nntrainer
[LoRA] add scaling parameter to LoRA
- This commit adds a `scaling` parameter to LoRA (fc).
- In the original paper, `alpha` (int) is adopted as a parameter from which the scaling factor is derived internally, i.e., scaling = alpha / rank.
- However, this commit takes `scaling` directly as a hyper-parameter.
- The updates are summarized as follows (a sketch of where the factor enters the computation follows this list):
  - `common_properties.h`: add LoraScaling as a parameter.
  - `fc_layer.cpp`: update the forwarding / calcGradient / calcDerivative functions to apply the scaling factor in the LoRA computation.
  - `fc_layer.h`: update to take LoraScaling as fc_props.
  - `node_exporter.cpp/h`: add LoRA scaling as a parameter in the tf.export format of the fc layer.
- [TODO] update the tflite format to support the updated fc_layer params.
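
The following is a minimal sketch, not nntrainer's actual `fc_layer` code, of where the scaling factor enters a LoRA-augmented fully-connected forward pass; the `matmul` helper and `lora_fc_forward` signature are hypothetical and only illustrate the math, y = xW + scaling * (xAB):

```cpp
#include <cstddef>
#include <vector>

// Hypothetical dense helper: out[i][j] = sum_k in[i][k] * w[k][j].
static std::vector<std::vector<float>>
matmul(const std::vector<std::vector<float>> &in,
       const std::vector<std::vector<float>> &w) {
  std::vector<std::vector<float>> out(in.size(),
                                      std::vector<float>(w[0].size(), 0.0f));
  for (size_t i = 0; i < in.size(); ++i)
    for (size_t k = 0; k < w.size(); ++k)
      for (size_t j = 0; j < w[0].size(); ++j)
        out[i][j] += in[i][k] * w[k][j];
  return out;
}

// y = x * W + scaling * (x * A * B).
// In the original paper scaling = alpha / rank is derived internally;
// this PR instead takes `scaling` directly as a hyper-parameter.
std::vector<std::vector<float>>
lora_fc_forward(const std::vector<std::vector<float>> &x,
                const std::vector<std::vector<float>> &W, // in_dim x out_dim
                const std::vector<std::vector<float>> &A, // in_dim x rank
                const std::vector<std::vector<float>> &B, // rank x out_dim
                float scaling) {
  auto base = matmul(x, W);                // frozen base projection
  auto lora = matmul(matmul(x, A), B);     // low-rank update path
  for (size_t i = 0; i < base.size(); ++i)
    for (size_t j = 0; j < base[0].size(); ++j)
      base[i][j] += scaling * lora[i][j];  // scaled LoRA contribution
  return base;
}
```

Taking `scaling` directly (rather than `alpha`) makes the effective magnitude of the low-rank update explicit and independent of the chosen rank.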
Self evaluation:
- Build test: [X]Passed [ ]Failed [ ]Skipped
- Run test: [X]Passed [ ]Failed [ ]Skipped
:memo: TAOS-CI Version: 1.5.20200925. Thank you for submitting PR #2539. Please follow the 1 commit/1 PR (one commit per PR) policy to get comments quickly from reviewers. Your PR must pass all verification processes of cibot before the review process by reviewers can start. If you are a new member joining this project, please read the manuals in the documentation folder and the wiki page. To monitor the progress status of your PR in more detail, visit http://ci.nnstreamer.ai/.
:octocat: cibot: @EunjuYang, The last line of a text file must have a newline character. Please append a newline at the end of nntrainer/layers/fc_layer.cpp.
:octocat: cibot: @EunjuYang, The build check could not be completed because one of the checkers did not finish. To find out the reason, please go to http://ci.nnstreamer.ai/nntrainer/ci/repo-workers/pr-checker/2539-202404110923550.41548609733582-1875c6da380617529ec42ea672e5f33a61f15a72/.