nntrainer
NNTrainer is a software framework for training neural network models on devices.
Commit 1: [bn layer] refactoring bn layer
- Refactor the bn layer
- Add a common_function directory

Signed-off-by: hyeonseok lee

Commit 2: [lstmcell] move lstmcell core to common_function
- Move lstmcell_core.* to...
- Implement multi head attention finalizeCommon and finalize. Signed-off-by: hyeonseok lee
- Implement multi head attention calcDerivative and calcGradient; attention mask support is still needed. Signed-off-by: hyeonseok lee
- Implement multi head attention calcCommonDerivative. Signed-off-by: hyeonseok lee
- Implement multi head attention forwarding. Signed-off-by: hyeonseok lee
- Add layer/model unit tests for multi head attention. Signed-off-by: hyeonseok lee
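The forwarding step mentioned above is, in the standard formulation, scaled dot-product attention computed per head. A minimal NumPy sketch, assuming the usual split/merge of the model dimension across heads (input/output projections and the attention mask are omitted; all names are illustrative, not nntrainer's implementation):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(q, k, v, num_heads):
    """Scaled dot-product multi head attention forward pass.

    q, k, v: arrays of shape (seq_len, d_model); d_model must be
    divisible by num_heads.
    """
    seq_len, d_model = q.shape
    d_head = d_model // num_heads

    # Split the model dimension into (num_heads, d_head), heads first.
    def split(x):
        return x.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    qh, kh, vh = split(q), split(k), split(v)

    # Per-head attention weights: softmax(Q K^T / sqrt(d_head)).
    scores = qh @ kh.transpose(0, 2, 1) / np.sqrt(d_head)
    weights = softmax(scores, axis=-1)   # (num_heads, seq_len, seq_len)
    out = weights @ vh                   # (num_heads, seq_len, d_head)

    # Merge the heads back into the model dimension.
    return out.transpose(1, 0, 2).reshape(seq_len, d_model)
```

With a single-token sequence the softmax weight is 1 for every head, so the output equals `v`, which makes a convenient sanity check for the split/merge bookkeeping.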
[CAPI] Add attention, mol attention and multi head attention to CAPI
- Add ML_TRAIN_LAYER_TYPE_ATTENTION
- Add ML_TRAIN_LAYER_TYPE_MOL_ATTENTION
- Add ML_TRAIN_LAYER_TYPE_MULTI_HEAD_ATTENTION

**Self evaluation:**
1. Build test: [X] Passed [ ] Failed [ ] Skipped...
- Add the newly implemented layer enums to nntrainer-api-common.h. Signed-off-by: hyeonseok lee
When exporting ```NNTrainer``` models to ```TensorFlow Lite```, tensor data must be reordered; implemented based on #1892.
- [x] Implement Flatten reordering
- [x] Test various FC cases
- [x] Make unit...
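A likely reason Flatten needs reordering on export, assuming the mismatch is between a channel-first (NCHW) layout and TensorFlow Lite's channel-last (NHWC) layout: flattening the same tensor in the two layouts yields differently ordered vectors, so a fully connected layer that follows Flatten needs its input weights permuted with the same index map. A small NumPy sketch of that permutation (shapes are illustrative):

```python
import numpy as np

# One sample with hypothetical shape: 2 channels, 3 rows, 4 columns.
c, h, w = 2, 3, 4
x = np.arange(c * h * w).reshape(c, h, w)     # channel-first (CHW) tensor

flat_chw = x.reshape(-1)                       # order a CHW Flatten produces
flat_hwc = x.transpose(1, 2, 0).reshape(-1)    # order an HWC Flatten produces

# perm[i] is the CHW flat index of the element at HWC flat position i;
# the FC weights after Flatten must be reordered with this same map.
perm = np.arange(c * h * w).reshape(c, h, w).transpose(1, 2, 0).reshape(-1)
assert np.array_equal(flat_chw[perm], flat_hwc)
```

The same index map, applied once to the FC layer's weight rows at export time, makes the converted model consume the HWC-flattened input correctly.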
This adds two features:
- Provide a PROFILE_MEM_ANNOTATE macro
- Print the average and maximum memory usage

Signed-off-by: Jiho Chu
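The intent of the two features above, annotating memory usage at points of interest and summarizing it as an average and a maximum, can be sketched as follows. This is a toy model only; the class and method names are illustrative and not nntrainer's profiler API:

```python
class MemProfiler:
    """Collects annotated memory-usage samples and reports the
    average and maximum, mirroring what PROFILE_MEM_ANNOTATE plus a
    summary printout would provide (illustrative, not nntrainer code)."""

    def __init__(self):
        self.samples = []

    def annotate(self, tag, bytes_used):
        # A PROFILE_MEM_ANNOTATE(tag)-style macro would record one
        # sample like this at the annotated point in the code.
        self.samples.append((tag, bytes_used))

    def report(self):
        sizes = [size for _, size in self.samples]
        return {"average": sum(sizes) / len(sizes), "maximum": max(sizes)}


profiler = MemProfiler()
profiler.annotate("forward", 100)
profiler.annotate("backward", 300)
print(profiler.report())   # average and peak over all annotated samples
```

Keeping only the running sum, count, and maximum (rather than every sample) would give the same report with constant memory overhead, which matters for a profiler that must not distort what it measures.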