covid-papers-browser
Add argparse parametrization for the finetuning script
Similar to what is currently available in download_model.py, add argparse parametrization to finetune_nli.py for the following parameters (a parser sketch follows the list):
- `model_name`, default `'models/scibert'`, type `str`
- `batch_size`, default `64`, type `int`
- `model_save_path`, default `'models/scibert_nli'`, type `str`
- `num_epochs`, default `2`, type `int`
- `warmup_steps`, default `None`, not required
- `do_mean_pooling`, with `action='store_true'`
- `do_cls_pooling`, with `action='store_true'`
- `do_max_pooling`, with `action='store_true'`
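A minimal sketch of what the parser could look like, assuming the flag names and defaults listed above (the description string is a placeholder):

```python
import argparse

parser = argparse.ArgumentParser(description="Fine-tune a model on NLI data")
parser.add_argument("--model_name", type=str, default="models/scibert")
parser.add_argument("--batch_size", type=int, default=64)
parser.add_argument("--model_save_path", type=str, default="models/scibert_nli")
parser.add_argument("--num_epochs", type=int, default=2)
parser.add_argument("--warmup_steps", type=int, default=None, required=False)
parser.add_argument("--do_mean_pooling", action="store_true")
parser.add_argument("--do_cls_pooling", action="store_true")
parser.add_argument("--do_max_pooling", action="store_true")
args = parser.parse_args()
```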
Then (both checks are sketched after the list):
- Add a check that only one of the pooling flags is set (raise an `AttributeError` if more than one is). If none is specified, use the mean pooling strategy.
- Check whether the `warmup_steps` parameter is already set before defaulting it to 10% of the training steps: if it is, keep the user-defined value.
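A sketch of the two checks, assuming `args` comes from the parser above and `train_dataloader` is a placeholder for whatever the script iterates over during training (the exact 10% computation should follow what finetune_nli.py currently does):

```python
import math

# Validate that at most one pooling flag was passed; fall back to mean pooling otherwise.
pooling_flags = [args.do_mean_pooling, args.do_cls_pooling, args.do_max_pooling]
if sum(pooling_flags) > 1:
    raise AttributeError("Choose only one of --do_mean_pooling, --do_cls_pooling, --do_max_pooling")
if sum(pooling_flags) == 0:
    args.do_mean_pooling = True

# Keep a user-defined warmup_steps; otherwise default to 10% of the training steps.
if args.warmup_steps is None:
    args.warmup_steps = math.ceil(len(train_dataloader) * args.num_epochs * 0.1)
```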