alpha_mix_active_learning
The official implementation of the Active Learning by Feature Mixing (ALFA-Mix) paper
Hi authors, thank you for your great work! I have a doubt about the final part of the query function:

```python
if len(selected_idxs) < n:
    remained = n - len(selected_idxs)
    ...
```
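For reference, here is a minimal sketch of how that kind of padding step is commonly written, assuming the remaining slots are simply filled at random from the not-yet-selected unlabeled pool; the helper and variable names are illustrative, not taken from the repository.

```python
import numpy as np

def pad_selection(selected_idxs, unlabeled_idxs, n, rng=None):
    """Top the query batch up to n points with random picks from the pool.
    Hypothetical helper; names are illustrative, not from the repository."""
    rng = rng if rng is not None else np.random.default_rng()
    selected = list(selected_idxs)
    if len(selected) < n:
        remained = n - len(selected)
        # Pool = unlabeled indices that have not been selected yet.
        pool = np.setdiff1d(np.asarray(unlabeled_idxs), np.asarray(selected))
        selected.extend(rng.choice(pool, size=remained, replace=False).tolist())
    return np.asarray(selected)
```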
Hi @AminParvaneh, thank you for sharing your very interesting work! Just want to pick your brains/check my understanding. Assuming we want to come up with an uncertainty measure/score, aka, measure...
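For concreteness, one standard generic uncertainty score is the entropy of the softmax output; the sketch below shows only that common baseline, not the ALFA-Mix query criterion.

```python
import torch
import torch.nn.functional as F

def predictive_entropy(logits: torch.Tensor) -> torch.Tensor:
    """Generic uncertainty score: entropy of the softmax distribution.
    Higher entropy means the model is less certain about that sample."""
    probs = F.softmax(logits, dim=1)
    return -(probs * torch.log(probs.clamp_min(1e-12))).sum(dim=1)
```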
```python
loss = F.cross_entropy(out, pred_1.to(self.device))
grads = torch.autograd.grad(loss, var_emb)[0].data.cpu()
```

In my understanding, both **out** and **pred_1** come from the same forward pass of the unlabeled samples through the model, but the...
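My reading of this pattern (which may be off): **pred_1** is the model's own hard prediction used as a pseudo-label, while **out** is recomputed from an embedding tensor that requires gradients, so the cross-entropy gradient with respect to the embedding is generally non-zero even though no true labels are used. A self-contained reconstruction of that pattern, with a dummy linear head standing in for the real model:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
classifier = torch.nn.Linear(16, 10)          # stand-in for the model's classification head
emb = torch.randn(8, 16)                      # embeddings of unlabeled samples

# Pseudo-labels: the model's own hard predictions (no true labels involved).
with torch.no_grad():
    pred_1 = classifier(emb).argmax(dim=1)

# Re-run the forward pass through a leaf tensor so we can differentiate
# the loss with respect to the embedding itself.
var_emb = emb.clone().requires_grad_(True)
out = classifier(var_emb)

loss = F.cross_entropy(out, pred_1)
grads = torch.autograd.grad(loss, var_emb)[0]  # non-zero: softmax probabilities are < 1
print(grads.shape)                             # torch.Size([8, 16])
```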
Dear author,

On line 23 in alpha_mix_sampling.py:

```python
ulb_probs, org_ulb_embedding = self.predict_prob_embed(self.X[idxs_unlabeled], self.Y[idxs_unlabeled])
```

By default, we can obtain labels for unlabeled data in the code, but in reality, we should not...
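If the worry is whether those labels can leak into the query, one sanity check is that an inference routine of this shape only needs the inputs for its forward pass. A sketch under that assumption (this is not the repository's actual `predict_prob_embed`, and it assumes the model returns a `(logits, embedding)` pair):

```python
import torch

@torch.no_grad()
def predict_prob_embed(model, loader, device="cpu"):
    """Forward-pass-only inference: softmax probabilities plus embeddings.
    Sketch; assumes the model returns (logits, embedding)."""
    model.eval()
    probs, embs = [], []
    for batch in loader:
        x = batch[0]                          # any labels in the batch are ignored here
        logits, emb = model(x.to(device))
        probs.append(torch.softmax(logits, dim=1).cpu())
        embs.append(emb.cpu())
    return torch.cat(probs), torch.cat(embs)
```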
Thank you for your exciting work. However, when I use your code it produces endless warnings from OpenBLAS, like the following. They appear after finding candidates and before 'Number of samples...
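Not a fix from the authors, but one workaround that often silences OpenBLAS thread warnings is to cap the BLAS/OpenMP thread counts before the heavy numeric imports; the environment-variable names below are the standard ones, and the values are guesses to tune for your machine.

```python
# Workaround sketch (not from the authors): cap BLAS/OpenMP thread counts
# before importing numpy/torch/sklearn so OpenBLAS stops spawning extra threads.
import os
os.environ.setdefault("OMP_NUM_THREADS", "4")        # value is a guess; tune for your CPU
os.environ.setdefault("OPENBLAS_NUM_THREADS", "4")
os.environ.setdefault("MKL_NUM_THREADS", "4")

import numpy as np  # imported only after the limits are set
```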
I got the reported results for the MNIST dataset, but when I use the CIFAR100 dataset with these parameters:

```
--data_name CIFAR100 --data_dir your_data_directory --log_dir your_log_directory \
--n_init_lb 1000 --n_query 1000...
```
Dear authors,

```
python main.py \
    --data_name CIFAR100 --data_dir dataset --log_dir log_output \
    --n_init_lb 1000 --n_query 1000 --n_round 10 --learning_rate 0.001 --n_epoch 50 --model vit_small \
    --strategy All --alpha_opt --alpha_closed_form_approx --alpha_cap...
```
Hi, could you let me know how I can find the value of `self.args`? I couldn't find a similarly named variable in the rest of the files. For example, if you mention...
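In many active-learning codebases of this shape, the argparse namespace built in main.py is simply handed to the strategy's constructor and stored as `self.args`; here is a hedged sketch of that pattern (the class and flag defaults are illustrative, not copied from the repository).

```python
import argparse

class Strategy:
    """Illustrative only: many AL codebases pass the argparse namespace
    into the strategy and keep it as self.args."""
    def __init__(self, X, Y, args):
        self.X, self.Y = X, Y
        self.args = args                 # the parsed command-line namespace

parser = argparse.ArgumentParser()
parser.add_argument("--n_query", type=int, default=1000)     # flag names taken from the commands above
parser.add_argument("--alpha_cap", type=float, default=0.2)  # placeholder default
args = parser.parse_args([])             # empty list -> use defaults, just for illustration
strategy = Strategy(X=None, Y=None, args=args)
print(strategy.args.n_query)             # -> 1000
```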