Damien Lancry
I think it's a BCNN because of the dropout layers
Dropout has been commonly used during training as a regularization technique to avoid overfitting since 2012. However, it is not common to use dropout at test time; dropout layers are usually...
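To make that concrete, here is a minimal sketch (my own example, not the code from this repo) of keeping dropout active at test time with tf.keras, i.e. the Monte Carlo dropout idea: repeated stochastic forward passes whose spread acts as a rough uncertainty estimate.

```
import numpy as np
import tensorflow as tf

# Small classifier with a dropout layer between the hidden and output layers.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(10, activation="softmax"),
])

x = np.random.rand(1, 20).astype("float32")

# Standard inference: dropout is disabled, the output is deterministic.
deterministic = model(x, training=False)

# MC dropout: passing training=True keeps dropout active, so repeated
# forward passes differ; their mean/spread approximate the predictive
# distribution and its uncertainty.
mc_samples = np.stack([model(x, training=True).numpy() for _ in range(50)])
mean_prediction = mc_samples.mean(axis=0)
predictive_std = mc_samples.std(axis=0)
```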
I think so yeah
It is a method they implemented themselves; it is not implemented in the standard Keras available with pip, which is why they included their own version of Keras in this...
`python Dropout_Bald_Q10_N1000_Paper.py`
I am having the same issue. Did you manage to fix your problem? @darkaero-xx EDIT: I managed to fix my issue by mounting the jars in a volume in another...
In uncertainty sampling, we rank the data points by informativeness as measured by an acquisition function such as entropy or margin. In case there are several data points that are...
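For illustration, a hedged sketch of such a ranking with entropy and margin as acquisition functions; `probs` is assumed to be an (n_samples, n_classes) array of predicted class probabilities from the current model, and the names are my own, not from the library being discussed.

```
import numpy as np

def entropy_scores(probs, eps=1e-12):
    # Higher entropy -> the model is less certain -> more informative.
    return -np.sum(probs * np.log(probs + eps), axis=1)

def margin_scores(probs):
    # Smaller gap between the top two class probabilities -> more informative,
    # so negate it to keep "higher score = more informative".
    sorted_probs = np.sort(probs, axis=1)
    return -(sorted_probs[:, -1] - sorted_probs[:, -2])

probs = np.array([[0.90, 0.05, 0.05],
                  [0.40, 0.35, 0.25],
                  [0.34, 0.33, 0.33]])

# Rank pool indices from most to least informative and query the top one.
ranking = np.argsort(-entropy_scores(probs))
query_index = ranking[0]
```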
I think it comes from the function `select_cold_start_instance`, in which there is a reshape:
```
return best_coldstart_instance_index, X[best_coldstart_instance_index].reshape(1, -1)
```
Try using `X_pool.values`? Otherwise, there should be a check...
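To illustrate the suggestion (assuming `X_pool` is a pandas DataFrame, which I haven't verified for this issue), converting to the underlying NumPy array with `.values` makes the positional indexing and reshape behave as intended:

```
import numpy as np
import pandas as pd

X_pool = pd.DataFrame(np.random.rand(5, 3), columns=["a", "b", "c"])
best_coldstart_instance_index = 2

# X_pool[best_coldstart_instance_index] would be treated as a column lookup
# and raise a KeyError; indexing the array form selects the row as intended.
X = X_pool.values
instance = X[best_coldstart_instance_index].reshape(1, -1)  # shape (1, 3)
```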
scikit-learn's transformations are learned, but every transformation I can think of is learned in an unsupervised fashion (PCA, LDA, SVD, TF-IDF, ...). In that case they don't introduce bias,...
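As a small illustration (my own example, not from the thread): an unsupervised transformation such as PCA is fit on the training features only, never sees the labels, and is then applied unchanged to new data.

```
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split

X = np.random.rand(200, 10)
X_train, X_test = train_test_split(X, test_size=0.25, random_state=0)

# The projection is learned from the data, but no labels are involved.
pca = PCA(n_components=3)
pca.fit(X_train)
X_train_low = pca.transform(X_train)
X_test_low = pca.transform(X_test)  # the same learned projection is reused
```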