Repositories tagged with the batch-normalization topic:
tensorflow-mnist-MLP-batch_normalization-weight_initializers
MNIST classification using a Multi-Layer Perceptron (MLP) with 2 hidden layers. Several weight initializers and batch normalization are implemented.
AmusingPythonCodes
Interesting Python code for tackling simple machine/deep learning tasks
batchnorm_prune
TensorFlow code for "Rethinking the Smaller-Norm-Less-Informative Assumption in Channel Pruning of Convolution Layers"
SDPoint
Stochastic Downsampling for Cost-Adjustable Inference and Improved Regularization in Convolutional Networks
LoL-Match-Prediction
Win probability predictions for League of Legends matches using neural networks
DeepLearning_from_scratch
A deep learning framework for CNNs and LSTMs built from scratch using NumPy.
Audio-Classification-Using-Wavelet-Transform
Classifying audio using the wavelet transform and deep learning
NeuralNetwork
Neural network implementations in NumPy and Keras, with batch normalization, dropout, L2 regularization, and optimizers
Sandwich-Batch-Normalization
[WACV 2022] "Sandwich Batch Normalization: A Drop-In Replacement for Feature Distribution Heterogeneity" by Xinyu Gong, Wuyang Chen, Tianlong Chen and Zhangyang Wang
Training-BatchNorm-and-Only-BatchNorm
Experiments with the ideas presented in https://arxiv.org/abs/2003.00152 by Frankle et al.
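Several of the repositories above implement batch normalization from scratch in NumPy. As a rough sketch of the core operation they share (the function name, shapes, and values below are illustrative, not taken from any listed repo), the training-time forward pass normalizes each feature over the batch and then applies a learnable scale and shift:

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Batch normalization forward pass (training mode).

    x: array of shape (batch, features); gamma, beta: shape (features,).
    """
    mu = x.mean(axis=0)                     # per-feature mean over the batch
    var = x.var(axis=0)                     # per-feature variance over the batch
    x_hat = (x - mu) / np.sqrt(var + eps)   # zero mean, unit variance per feature
    return gamma * x_hat + beta             # learnable scale and shift

# Illustrative usage: a batch of 64 samples with 10 features,
# deliberately offset and scaled away from (0, 1).
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=3.0, size=(64, 10))
out = batch_norm_forward(x, gamma=np.ones(10), beta=np.zeros(10))
```

After the call, each column of `out` has approximately zero mean and unit variance. At inference time, implementations typically replace the batch statistics with running averages accumulated during training, since single samples (or small batches) give noisy estimates.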