add new preprocessing: PerImageStandardization
Currently, we support only the `DivideBy255` preprocessor. However, internally, we have already prepared `PerImageStandardization`. @yasumura-lm san mentioned that this preprocessing method is sometimes very effective. Hence, we want to support `PerImageStandardization`.
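For reference, here is a minimal NumPy sketch of the two methods (the actual blueoil implementations may differ in detail); the stddev clamp follows the convention of `tf.image.per_image_standardization`:

```python
import numpy as np

def divide_by_255(image):
    # Scale pixel values from [0, 255] down to [0.0, 1.0].
    return image.astype(np.float32) / 255.0

def per_image_standardization(image):
    # Normalize each image to zero mean and unit variance. The stddev
    # is clamped at 1/sqrt(N), as tf.image.per_image_standardization
    # does, so constant images do not cause a division by zero.
    image = image.astype(np.float32)
    mean = image.mean()
    stddev = max(image.std(), 1.0 / np.sqrt(image.size))
    return (image - mean) / stddev
```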
FYI, `PerImageStandardization` may also be necessary for the MNIST dataset.
About the MNIST dataset:
When I used `DivideBy255`, the accuracy on the test data was very unstable.
But when I used `PerImageStandardization`, the test results were good (stable and high accuracy).
Let me summarize what we have to do.
- Allow users to choose whether or not to apply activation quantization at the first layer of training in the init command (`./blueoil.sh init`).
  - If they want to apply it -> do the quantization in the same way we have done so far.
  - If they do not want to apply it -> change the preprocessor from `DivideBy255` to `PerImageStandardization` and do not apply activation quantization in the first layer (see the sketch after this list).
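A rough sketch of the branching the init command would need; the helper and key names below are hypothetical, not the actual blueoil config template:

```python
def build_preprocessing_config(quantize_first_layer: bool) -> dict:
    # Hypothetical helper: key names are illustrative only, not the
    # real blueoil config template.
    if quantize_first_layer:
        # Current behavior: keep DivideBy255 and quantize as before.
        return {
            "PRE_PROCESSOR": "DivideBy255",
            "QUANTIZE_FIRST_LAYER_ACTIVATION": True,
        }
    # Otherwise switch the preprocessor and skip first-layer
    # activation quantization.
    return {
        "PRE_PROCESSOR": "PerImageStandardization",
        "QUANTIZE_FIRST_LAYER_ACTIVATION": False,
    }
```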
@lm-jira Thanks. There are some small mistakes about quantization.
> * Allow users to choose to apply activation quantization at first layer of training or not in init command. `./blueoil.sh init`

Users should choose whether to apply quantization at the first layer including both activations and weights, not only activation quantization.

> If they do not want to apply -> change the preprocessor method from DivideBy255 to PerImageStandardization and -> not apply activation quantization in the first layer.

If they do not want to apply it:
- Do not apply activation quantization in the first layer; more concretely, change the preprocessor method from `DivideBy255` to `PerImageStandardization`.
- Do not apply weight quantization in the first layer; more concretely, change the classification network so that users can choose whether weight quantization is applied at the first layer (see the sketch after this list).
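One possible shape for that network change, as a sketch with hypothetical names (the real blueoil network classes are likely organized differently):

```python
class QuantizedClassifierSketch:
    """Hypothetical sketch: a classification network whose first
    layer can opt out of weight quantization."""

    def __init__(self, weight_quantizer, quantize_first_layer_weights=True):
        # weight_quantizer: a callable mapping float weights to
        # quantized weights.
        self.weight_quantizer = weight_quantizer
        self.quantize_first_layer_weights = quantize_first_layer_weights

    def maybe_quantize(self, weights, is_first_layer):
        # Skip quantization only for the first layer, and only when
        # the user opted out at init time.
        if is_first_layer and not self.quantize_first_layer_weights:
            return weights
        return self.weight_quantizer(weights)
```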
`PerImageStandardization` is still not possible to apply via the `blueoil init` command, so I think it's better to keep this issue open for now.