Active-Learning-Bayesian-Convolutional-Neural-Networks
Concerns about Deterministic Bald (Softmax_Bald?)
While reading the paper "Deep Bayesian Active Learning with Image Data", I became interested in the results of Figure 2. Specifically, I wanted to replicate the BALD vs. Deterministic BALD comparison.
I followed the file-naming logic, which led me to the code in `Softmax_Bald_Q10_N1000.py`. So my first question: is this the code behind the Deterministic BALD results, given that it uses `predict()` instead of `stochastic_predict()`?
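For context, here is a minimal sketch of the distinction I have in mind, written against a modern tf.keras model (the repo itself is on an older Keras API, and `stochastic_predict` is presumably its helper for MC-dropout forward passes, so the names and calls below are illustrative only):

```python
import numpy as np
import tensorflow as tf

def deterministic_softmax(model, x):
    """One forward pass with dropout disabled, as model.predict() does."""
    return model(x, training=False).numpy()

def mc_dropout_softmax(model, x, T=10):
    """T forward passes with dropout kept active at test time (MC dropout),
    which is what I understand stochastic_predict() to be doing."""
    return np.stack([model(x, training=True).numpy() for _ in range(T)], axis=0)
```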
Assuming I got that right, I wondered how the average entropy could be calculated when there is only a single instance of the predictions.
Looking at the code, for `softmax_iterations = 1` the values of `G_X = Entropy_Average_Pi` and `F_X = Average_Entropy` should be equal, because no averaging operation is actually involved. However, when I ran the code, the values of `U_X = G_X - F_X` were, in fact, not zeroed out as they should have been.
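To make that concern concrete, here is a small numpy sketch (not the repo's code; shapes, names, and the log base are illustrative) of how `G_X`, `F_X` and `U_X` relate when there is only a single softmax pass:

```python
import numpy as np

rng = np.random.default_rng(0)
n_pool, nb_classes = 5, 10
p = rng.dirichlet(np.ones(nb_classes), size=n_pool)  # one softmax pass over the pool

# G_X: entropy of the averaged prediction. Averaging over a single pass
# just returns the pass itself.
avg_pi = p  # (sum over 1 iteration) / 1
G_X = -np.sum(avg_pi * np.log2(avg_pi), axis=1)

# F_X: average of the per-pass entropies. Again, this is just the entropy
# of the single pass.
F_X = -np.sum(p * np.log2(p), axis=1)

U_X = G_X - F_X
print(U_X)  # exactly zero, since both paths perform the identical computation
```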
Eventually, it turned out that the empty arrays created before the loop, namely `score_All` and `All_Entropy_Softmax`, had the default `dtype=np.float64`, while the `softmax_score` returned by `model.predict()` was of type `np.float32`. Because of this, the two entropy terms are effectively computed at different precisions, and subtracting them (or any results derived from them) leaves a small but non-zero residual.
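Here is a self-contained illustration of how that dtype mix can leave such a residual; the exact expressions in the repo may differ, so treat this as an assumption about the mechanism rather than a trace of the actual code:

```python
import numpy as np

rng = np.random.default_rng(0)
p32 = rng.dirichlet(np.ones(10), size=5).astype(np.float32)  # float32 softmax_score

# G_X path: the float64 zeros array upcasts the accumulated predictions,
# so the entropy is evaluated in float64.
score_all = np.zeros_like(p32, dtype=np.float64) + p32
G_X = -np.sum(score_all * np.log2(score_all), axis=1)

# F_X path: the per-pass entropy is evaluated in float32 first and only then
# accumulated into the float64 array.
entropy_32 = -np.sum(p32 * np.log2(p32), axis=1)
F_X = np.zeros(p32.shape[0], dtype=np.float64) + entropy_32

print(G_X - F_X)  # tiny but non-zero residuals, on the order of float32 rounding error
```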
To verify this, it's enough to specify the `dtype` parameters explicitly:
score_All = np.zeros(shape=(X_Pool_Dropout.shape[0], nb_classes), dtype=np.float32)
All_Entropy_Softmax = np.zeros(shape=X_Pool_Dropout.shape[0], dtype=np.float32)
Or removing the loop altogether, since it only runs for a single iteration anyway.
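As a quick sanity check (same illustrative setup as above, not the repo's exact code): with both accumulators created as `np.float32`, every intermediate stays in float32 and the residual vanishes:

```python
import numpy as np

rng = np.random.default_rng(0)
p32 = rng.dirichlet(np.ones(10), size=5).astype(np.float32)  # float32 predictions

score_all = np.zeros_like(p32, dtype=np.float32) + p32
G_X = -np.sum(score_all * np.log2(score_all), axis=1)

all_entropy = np.zeros(p32.shape[0], dtype=np.float32)
F_X = all_entropy + (-np.sum(p32 * np.log2(p32), axis=1))

print(np.all(G_X - F_X == 0))  # True: both paths now run the identical float32 computation
```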