nntrainer
NNTrainer is a software framework for training neural network models on devices.
We need an identity layer (one that does nothing and flows everything through). This is specifically to solve the issue addressed here: https://github.com/nnstreamer/nntrainer/commit/e76a186c8b4a7bea1d8d30ffd9f208cfd2ec78e8#r62007856 ``` A -> outputs ->(a0, a1) B -> outputs...
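A minimal sketch of what such an identity layer could look like. The `Tensor` alias and the `forwarding`/`calcDerivative` method names below are simplified stand-ins chosen for illustration, not nntrainer's actual layer API:

```cpp
#include <vector>

// Illustrative stand-in for nntrainer's tensor type; real layers
// operate on nntrainer::Tensor through a run context instead.
using Tensor = std::vector<float>;

// Hypothetical minimal layer interface for this sketch.
struct SketchLayer {
  virtual ~SketchLayer() = default;
  virtual Tensor forwarding(const Tensor &input) = 0;
  virtual Tensor calcDerivative(const Tensor &incoming) = 0;
};

// Identity layer: does nothing and flows everything through unchanged,
// in both the forward pass and the backward pass.
struct IdentityLayer : SketchLayer {
  Tensor forwarding(const Tensor &input) override { return input; }
  Tensor calcDerivative(const Tensor &incoming) override { return incoming; }
};
```

Because the layer is a pure pass-through, it can be inserted between any two connections (e.g. to split or rename outputs) without changing the computed values or gradients.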
1. Check out delayed items from #1033. 2. Related to #1516: officially add support for `ml_train_set_compile_properties()`, `ml_train_set_run_properties()` to overcome the var_args limitation. Documentation: 1. ini inference default behavior https://github.com/nnstreamer/nntrainer/pull/1711#discussion_r755199205
Support for an independent learning rate scheduler interface is needed. - [x] create learning rate scheduler interface - must be easy to support custom scheduler #1776 - [x] support existing...
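A hedged sketch of what such an independent scheduler interface could look like: the only contract is mapping an iteration count to a learning rate, which keeps custom schedulers easy to write. The class and method names here are assumptions for illustration, not the interface that was eventually merged:

```cpp
#include <cmath>

// Hypothetical scheduler interface: decoupled from the optimizer, so a
// custom scheduler only has to implement one method.
struct LearningRateScheduler {
  virtual ~LearningRateScheduler() = default;
  virtual double getLearningRate(unsigned int iteration) const = 0;
};

// Example custom scheduler: exponential decay applied every
// `decay_steps` iterations.
class ExponentialDecay : public LearningRateScheduler {
public:
  ExponentialDecay(double initial_lr, double decay_rate,
                   unsigned int decay_steps)
    : initial_lr_(initial_lr), decay_rate_(decay_rate),
      decay_steps_(decay_steps) {}

  double getLearningRate(unsigned int iteration) const override {
    return initial_lr_ *
           std::pow(decay_rate_, static_cast<double>(iteration / decay_steps_));
  }

private:
  double initial_lr_;
  double decay_rate_;
  unsigned int decay_steps_;
};
```

An optimizer that queries `getLearningRate(iteration)` each step supports any scheduler, built-in or user-defined, without knowing its decay policy.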
Sharing a given label with the loss layers of the model is not fully supported. - when only a single label is given, it is shared with all the labels...
A layer plugin developer carries some burden to create and use the layer right away. Let's assume there is a custom layer: ```cpp class MyLayer : public nntrainer::Layer { /** define...
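One way to reduce that burden is a name-to-factory registry, so an app can create the custom layer by name immediately after registering it, just like a built-in layer. This is a hypothetical sketch with a simplified `Layer` base class; nntrainer's real plugin path goes through its app context rather than a standalone registry like this:

```cpp
#include <functional>
#include <map>
#include <memory>
#include <string>

// Hypothetical stand-in for a layer base class.
struct Layer {
  virtual ~Layer() = default;
  virtual std::string getType() const = 0;
};

// Minimal registry: maps a layer type name to a factory function.
class LayerRegistry {
public:
  using Factory = std::function<std::unique_ptr<Layer>()>;

  void registerFactory(const std::string &type, Factory factory) {
    factories_[type] = std::move(factory);
  }

  // Returns nullptr when the type name was never registered.
  std::unique_ptr<Layer> create(const std::string &type) const {
    auto it = factories_.find(type);
    return it == factories_.end() ? nullptr : it->second();
  }

private:
  std::map<std::string, Factory> factories_;
};

// The custom layer registers once, then is usable by name.
struct MyLayer : Layer {
  std::string getType() const override { return "my_layer"; }
};
```

With this shape, the plugin developer's only obligations are implementing the layer and registering one factory; model descriptions can then refer to `"my_layer"` by name.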
docs and example how to use custom optimizer to the public
https://github.com/nnstreamer/api/blob/95faa11b553861a42cedb4cae441f87a9ec353ce/c/src/ml-api-inference-single.c#L833-L853
```C
if (nnfw == ML_NNFW_TYPE_NNTR_INF) {
  if (!in_tensors_info || !out_tensors_info) {
    if (!in_tensors_info) {
      ml_tensors_info_h in_info;

      status = ml_tensors_info_create (&in_info); ////////// 1.
      if (status != ML_ERROR_NONE) {
        goto error;
...
```
A model that uses the reparameterization trick (e.g., a variational autoencoder) does not take a label but still calculates a loss. Whether something is a loss should not be determined by `requireLabel()` but solely by `outconnection ==...
For now, checking whether the tensor is contiguous or not is disabled, since `add_strided` does not support broadcast. Enable this check after broadcast operation is supported, and then replace add_i...
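A sketch of what broadcast support in a strided add could look like, shown for the simple case of adding a `[cols]` vector to a `[rows x cols]` tensor. The shapes, strides, and function name are illustrative, not nntrainer's actual `add_strided` signature:

```cpp
#include <cstddef>
#include <vector>

// Broadcast-aware add of a [rows x cols] tensor and a [cols] vector.
// The trick: give the broadcast operand a row stride of 0, so every row
// re-reads the same data. Strided kernels generalize this by setting the
// stride to 0 on any size-1 (broadcast) dimension.
std::vector<float> add_broadcast(const std::vector<float> &a,
                                 const std::vector<float> &b,
                                 std::size_t rows, std::size_t cols) {
  std::vector<float> out(rows * cols);
  const std::size_t a_row_stride = cols; // contiguous rows
  const std::size_t b_row_stride = 0;    // broadcast: repeat b per row
  for (std::size_t r = 0; r < rows; ++r)
    for (std::size_t c = 0; c < cols; ++c)
      out[r * cols + c] = a[r * a_row_stride + c] + b[r * b_row_stride + c];
  return out;
}
```

Once the strided kernel handles zero strides like this, the contiguity check can be re-enabled, because non-contiguous and broadcast inputs go through the same addressing logic.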