
Error when using a model other than MLP

Open · linhlt-it-ee opened this issue 2 years ago • 1 comment

Dear authors,

```
python main.py \
    --data_name CIFAR100 --data_dir dataset --log_dir log_output \
    --n_init_lb 1000 --n_query 1000 --n_round 10 \
    --learning_rate 0.001 --n_epoch 50 --model vit_small \
    --strategy All --alpha_opt --alpha_closed_form_approx \
    --alpha_cap 0.2 --pretrained_model
```

Is it OK to keep alpha_cap at 0.2 for the vit_small model? I don't know why the program died in the middle, and the accuracy looks odd. It would help if the authors could provide a script to run all datasets with different settings. The per-round output was:

```
"0.2529","0.0"
"0.302","0.0010182857513427734"
"0.3343","0.0009670257568359375"
"0.3441","0.0009663105010986328"
"0.3327","0.0010838508605957031"
```
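By a script I mean something like the sketch below; the dataset names and loop structure are just my guesses, not anything shipped with the repo:

```bash
#!/usr/bin/env bash
# Hypothetical launcher sketch (my own guess, not part of the repository):
# repeats the invocation above for several datasets, writing each run's
# logs to its own directory. The dataset names are assumptions.
for DATA in CIFAR10 CIFAR100 SVHN; do
    python main.py \
        --data_name "$DATA" --data_dir dataset --log_dir "log_output/$DATA" \
        --n_init_lb 1000 --n_query 1000 --n_round 10 \
        --learning_rate 0.001 --n_epoch 50 --model vit_small \
        --strategy All --alpha_opt --alpha_closed_form_approx \
        --alpha_cap 0.2 --pretrained_model
done
```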

linhlt-it-ee · Jan 05 '23 09:01

Please set `--strategy AlphaMixSampling` to run our AL approach (i.e. ALFA-Mix). With "All", the code tries to run all the baselines (except ALFA-Mix) sequentially, starting from Random Sampling. In the experiments reported in the paper, we used 0.2 for alpha_cap.
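For reference, here is your command with only the strategy flag changed (everything else kept from your invocation):

```bash
# Same run as before, but with the strategy set to ALFA-Mix
python main.py \
    --data_name CIFAR100 --data_dir dataset --log_dir log_output \
    --n_init_lb 1000 --n_query 1000 --n_round 10 \
    --learning_rate 0.001 --n_epoch 50 --model vit_small \
    --strategy AlphaMixSampling --alpha_opt --alpha_closed_form_approx \
    --alpha_cap 0.2 --pretrained_model
```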

AminParvaneh · Apr 02 '23 11:04