
Batch Normalization

Open · pevnak opened this issue 8 years ago · 1 comment

I have implemented batch normalization for matrices, with a running-average estimate of the mean and variance, as follows:

function batch_normalization(x::Tensor{T}, k, decay::T) where {T}
    # Running statistics, excluded from gradient updates.
    pop_mean = Variable(zeros(T, k), trainable=false)
    pop_var = Variable(ones(T, k), trainable=false)
    if decay < 1
        # Batch statistics over the first (observation) dimension;
        # batch_var is the standard deviation plus a small epsilon for stability.
        batch_mean = mean(x, 1)
        batch_var = sqrt(mean((x .- batch_mean).^2, 1)) + constant(T(1e-6))
        # Update the running statistics with an exponential moving average.
        pop_mean = assign(pop_mean, decay * pop_mean + (1 - decay) * batch_mean)
        pop_var = assign(pop_var, decay * pop_var + (1 - decay) * batch_var)
    end
    (x .- pop_mean) ./ pop_var
end

The idea for freezing the mean and variance is to recreate the model with decay set to 1. Alternatively, decay could be implemented as a Tensor, which would allow freezing the statistics by feeding decay = 1 without rebuilding the model.
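A minimal sketch of that Tensor-based variant, assuming decay is supplied through a scalar placeholder (the name batch_normalization_feed and the placeholder wiring are illustrative, not part of the package):

using TensorFlow

function batch_normalization_feed(x::Tensor{T}, k, decay::Tensor) where {T}
    pop_mean = Variable(zeros(T, k), trainable=false)
    pop_var = Variable(ones(T, k), trainable=false)
    batch_mean = mean(x, 1)
    batch_var = sqrt(mean((x .- batch_mean).^2, 1)) + constant(T(1e-6))
    # When decay is fed as 1, both assignments reproduce the previous values,
    # so the running statistics stop moving.
    pop_mean = assign(pop_mean, decay .* pop_mean + (constant(T(1)) - decay) .* batch_mean)
    pop_var = assign(pop_var, decay .* pop_var + (constant(T(1)) - decay) .* batch_var)
    (x .- pop_mean) ./ pop_var
end

# usage sketch:
# decay = placeholder(Float32)                          # scalar fed at run time
# y = batch_normalization_feed(x, k, decay)
# run(sess, y, Dict(x => batch, decay => 0.99f0))       # training: update statistics
# run(sess, y, Dict(x => batch, decay => 1.0f0))        # inference: statistics frozen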

Does anyone know of a better approach? Is there interest in a PR for this? Please let me know.

pevnak avatar Sep 12 '17 12:09 pevnak

This package aims to match the Python API as closely as possible. So if that function exists in the Python API with the same semantics, then a PR would be very welcome.

malmaud avatar Sep 21 '17 18:09 malmaud
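For reference, the closest Python-side primitive is tf.nn.batch_normalization(x, mean, variance, offset, scale, variance_epsilon), which applies normalization given precomputed statistics; the running-average bookkeeping lives in higher-level wrappers such as tf.layers.batch_normalization. A minimal sketch of what a TensorFlow.jl port mirroring that signature might look like (this function does not currently exist in the package):

# Hypothetical port of tf.nn.batch_normalization; not part of TensorFlow.jl.
function batch_normalization(x, mean, variance, offset, scale, variance_epsilon)
    # Computes scale * (x - mean) / sqrt(variance + eps) + offset,
    # rewritten as x * inv + (offset - mean * inv).
    inv = scale ./ sqrt(variance + variance_epsilon)
    x .* inv .+ (offset .- mean .* inv)
end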