mean-teacher
applying mean teacher to my own dataset
Hi, I have already achieved ~94% with 4000 labels on CIFAR-10. But on my own three-class classification task, with 160k labelled images plus unlabelled data, I cannot get the expected results (worse than training on the labelled data directly). Is the learning-rate strategy sensitive here? Here is my setting (fine-tuning MobileNet-v1 on 4 GPUs). Thanks in advance.
defaults = {
    # Technical details
    'workers': 20,
    'checkpoint_epochs': 20,
    'evaluation_epochs': 5,
    # Data
    'dataset': 'my dataset',
    'train_subdir': 'train',
    'eval_subdir': 'test',
    # Data sampling
    'base_batch_size': 100,
    'base_labeled_batch_size': 50,
    # Architecture
    'arch': 'mnet1',
    # Costs
    'consistency_type': 'mse',
    'consistency_rampup': 5,
    'consistency': 20.0,
    'logit_distance_cost': .01,
    'weight_decay': 2e-4,
    # Optimization
    'lr_rampup': 0,
    'base_lr': 0.001,
    'nesterov': True,
}
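For reference, here is a minimal sketch of the two mechanisms the `'consistency'` and `'consistency_rampup'` settings above control, assuming the sigmoid ramp-up schedule used in the reference Mean Teacher implementation; the function names and the EMA decay value are illustrative, not taken from this thread.

```python
import math

def sigmoid_rampup(current_epoch, rampup_length):
    """Exponential ramp-up from ~0 to 1 over `rampup_length` epochs."""
    if rampup_length == 0:
        return 1.0
    phase = 1.0 - max(0.0, min(1.0, current_epoch / rampup_length))
    return math.exp(-5.0 * phase * phase)

def consistency_weight(epoch, consistency=20.0, rampup=5):
    """Scale the consistency cost; with rampup=5 it reaches ~20 by epoch 5."""
    return consistency * sigmoid_rampup(epoch, rampup)

def ema_update(teacher_params, student_params, alpha=0.999):
    """Teacher weights as an exponential moving average of the student's."""
    return [alpha * t + (1.0 - alpha) * s
            for t, s in zip(teacher_params, student_params)]
```

With `'consistency_rampup': 5` the consistency cost is near zero at epoch 0 and reaches its full weight of 20 by epoch 5; a ramp-up that short relative to a 160k-image dataset is one plausible thing to tune.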
Same problem here.
Did you fix the problem?
@fahad7033 I haven't worked on this in a long time; you could try training for more iterations.
So I wonder how to create my own dataset.
I have the same problem: how do I apply Mean Teacher to my own dataset (especially audio) and model?