
# The number of parameters is larger than BERT-base even though I'm using smaller specs than BERT-base

MohamedLotfyElrefai opened this issue on Oct 14 '19 · 0 comments

My model flags: `-hs 60 -l 3 -a 3 -s 26 -b 5 -e 10 -w 4 --with_cuda True --log_freq 20 --on_memory False --lr 1e-3 --adam_weight_decay 0.0 --adam_beta1 0.9 --adam_beta2 0.999`

Number of parameters:

Total Parameters: 229,503,472

whereas BERT-base, which uses

-hs 768
-a 12
-l 12

has only about 110M parameters.
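
Since the total also depends on the vocabulary size (the token-embedding layer scales with it), not only on the flags above, it can help to print a per-module parameter breakdown. Below is a minimal sketch; it assumes the `BERT` class from this repo accepts `(vocab_size, hidden, n_layers, attn_heads)`, and the vocabulary size of 30,000 is purely illustrative since the issue does not state the actual value.

```python
import torch.nn as nn


def parameter_breakdown(model: nn.Module) -> None:
    """Print the parameter count of each top-level submodule and the overall total."""
    for name, module in model.named_children():
        count = sum(p.numel() for p in module.parameters())
        print(f"{name:20s} {count:>14,d}")
    total = sum(p.numel() for p in model.parameters())
    print(f"{'total':20s} {total:>14,d}")


if __name__ == "__main__":
    # Hypothetical usage: the constructor arguments mirror the CLI flags
    # (-hs hidden size, -l layers, -a attention heads). vocab_size=30000 is
    # an assumption for illustration only.
    from bert_pytorch.model import BERT  # assumes this repo's package layout

    model = BERT(vocab_size=30000, hidden=60, n_layers=3, attn_heads=3)
    parameter_breakdown(model)
```

Comparing the breakdown against the same printout for `hidden=768, n_layers=12, attn_heads=12` should show which component accounts for the unexpectedly large count.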
