neogyk

13 issues by neogyk

Many modern neural compression architectures use arithmetic encoder/decoder functions (or related entropy coders such as ANS). This can yield a higher compression rate. The arithmetic encoder module encodes the stream of...

Add decorator functions for code profiling, energy consumption, and CO2 emission estimation.
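
A minimal sketch of what such a decorator could look like, covering only the profiling part; the `profiled` name, the use of `time.perf_counter`, and the `tracemalloc` memory report are illustrative choices, not existing Baler APIs:

```python
import functools
import time
import tracemalloc


def profiled(func):
    """Hypothetical decorator: reports wall time and peak memory of a call."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        tracemalloc.start()
        start = time.perf_counter()
        try:
            return func(*args, **kwargs)
        finally:
            elapsed = time.perf_counter() - start
            _, peak = tracemalloc.get_traced_memory()
            tracemalloc.stop()
            print(f"{func.__name__}: {elapsed:.3f} s, peak memory {peak / 1e6:.1f} MB")
    return wrapper


@profiled
def train_one_epoch():
    ...  # training code would go here
```

Energy and CO2 estimation could be wrapped in the same decorator style, for example around a tracker such as CodeCarbon's `EmissionsTracker`.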

Using half precision speeds up DNN training, as it allocates less memory.
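
A minimal sketch of mixed-precision training with PyTorch's automatic mixed precision; the toy model, data, and hyperparameters are placeholders, not Baler's actual setup:

```python
import torch
from torch import nn

model = nn.Sequential(nn.Linear(64, 8), nn.ReLU(), nn.Linear(8, 64)).cuda()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
scaler = torch.cuda.amp.GradScaler()   # rescales the loss to avoid fp16 underflow

data = torch.randn(1024, 64)
loader = torch.utils.data.DataLoader(torch.utils.data.TensorDataset(data), batch_size=128)

for (batch,) in loader:
    batch = batch.cuda()
    optimizer.zero_grad()
    with torch.cuda.amp.autocast():    # forward pass runs in half precision where safe
        loss = loss_fn(model(batch), batch)
    scaler.scale(loss).backward()      # backward pass on the scaled loss
    scaler.step(optimizer)             # unscales gradients, then steps the optimizer
    scaler.update()
```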

MLflow integration for experiment tracking

The [TopTag Dataset](https://zenodo.org/records/2603256) contains the 4-momenta of up to 200 constituents per jet for top-quark and QCD dijets. The example contains the whole pipeline for training on and compressing this dataset.
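
A rough sketch of loading the dataset into tensors, assuming it is distributed as pandas-readable HDF5; the file name `train.h5`, the HDF key `"table"`, and the column-name pattern are assumptions, not guaranteed by the Zenodo record:

```python
import pandas as pd
import torch

# Assumed layout: one row per jet, constituent columns E_0, PX_0, PY_0, PZ_0, ...
df = pd.read_hdf("train.h5", key="table")          # file name and key are assumptions
features = df.filter(regex=r"^(E|PX|PY|PZ)_").to_numpy(dtype="float32")
tensor = torch.from_numpy(features)                # shape: (n_jets, 4 * n_constituents)

dataset = torch.utils.data.TensorDataset(tensor)
loader = torch.utils.data.DataLoader(dataset, batch_size=256, shuffle=True)
```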

Provided tracking of the experiment using MLflow. It will store the train and test loss values, regularization, and learning rate. Also, the config file parameters will be stored to...
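
A minimal sketch of what such logging could look like with the MLflow Python API; the parameter and metric names, the run name, and `train_one_epoch` are illustrative placeholders, not Baler's actual config keys or functions:

```python
import mlflow

with mlflow.start_run(run_name="baler-training"):      # run name is illustrative
    # Hypothetical config values; Baler's real config keys may differ.
    mlflow.log_params({"learning_rate": 1e-3, "reg_param": 1e-4, "epochs": 100})

    for epoch in range(100):
        train_loss, test_loss = train_one_epoch()       # placeholder training function
        mlflow.log_metric("train_loss", train_loss, step=epoch)
        mlflow.log_metric("test_loss", test_loss, step=epoch)

    mlflow.log_artifact("config.json")                  # store the config file itself
```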

PyTorch Lightning offers a compact and efficient way to train neural networks. It contains predefined constructs (Trainer, Tuner) that speed up code understanding and maintenance. Moreover, a lot...
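
A minimal sketch of the Trainer-based workflow, assuming a simple autoencoder-style LightningModule; the module, data, and hyperparameters below are placeholders, not Baler's actual model:

```python
import torch
from torch import nn
import lightning.pytorch as pl


class TinyAutoencoder(pl.LightningModule):
    """Placeholder autoencoder to illustrate the Lightning workflow."""

    def __init__(self):
        super().__init__()
        self.encoder = nn.Linear(64, 8)
        self.decoder = nn.Linear(8, 64)

    def training_step(self, batch, batch_idx):
        (x,) = batch
        loss = nn.functional.mse_loss(self.decoder(self.encoder(x)), x)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)


data = torch.utils.data.TensorDataset(torch.randn(1024, 64))
loader = torch.utils.data.DataLoader(data, batch_size=128)

trainer = pl.Trainer(max_epochs=5, accelerator="auto")
trainer.fit(TinyAutoencoder(), loader)
```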

#### Problem formulation: Currently, all the plots produced by Baler are stored in the experiment directory. As the size of the validation set or the number of examples increases, the...

Add support for reading and loading an external predefined torch.utils.data.Dataset.
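
A minimal sketch of what such an external dataset could look like on the user side; the `ExternalDataset` class and the `.npy` input are hypothetical examples of a user-provided `torch.utils.data.Dataset`:

```python
import numpy as np
import torch
from torch.utils.data import Dataset, DataLoader


class ExternalDataset(Dataset):
    """Hypothetical user-defined dataset that the pipeline could consume directly."""

    def __init__(self, path: str):
        self.data = np.load(path).astype("float32")   # assumes a simple .npy array

    def __len__(self):
        return len(self.data)

    def __getitem__(self, idx):
        return torch.from_numpy(self.data[idx])


# Usage: pass the user's Dataset instance to the training pipeline instead of a raw file path.
loader = DataLoader(ExternalDataset("data.npy"), batch_size=256, shuffle=True)
```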