
Add argparse parametrization for the finetuning script

Open · gsarti opened this issue 4 years ago · 0 comments

Similar to what is currently available in download_model.py, add argparse parametrization to finetune_nli.py with the following parameters:

  • model_name, default 'models/scibert', type str

  • batch_size, default 64, type int

  • model_save_path, default 'models/scibert_nli', type str

  • num_epochs, default 2, type int

  • warmup_steps, default None, not required

  • do_mean_pooling with action='store_true'

  • do_cls_pooling with action='store_true'

  • do_max_pooling with action='store_true'

Then:

  • Add a check that at most one of the pooling options is set (raise an AttributeError if more than one is). If none is specified, default to the mean pooling strategy.

  • Check whether the warmup_steps parameter is set before defaulting it to 10% of the training steps: if it is set, keep the user-defined value.
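The requested parametrization could be sketched as below. This is only an illustration of the issue's spec, not the actual finetune_nli.py implementation; the helper names `parse_args` and `resolve_warmup_steps` are hypothetical.

```python
import argparse


def parse_args(argv=None):
    """Build the CLI described in the issue (parameter names and defaults from the issue text)."""
    parser = argparse.ArgumentParser(description="Fine-tune a model on NLI data.")
    parser.add_argument("--model_name", type=str, default="models/scibert")
    parser.add_argument("--batch_size", type=int, default=64)
    parser.add_argument("--model_save_path", type=str, default="models/scibert_nli")
    parser.add_argument("--num_epochs", type=int, default=2)
    parser.add_argument("--warmup_steps", type=int, default=None, required=False)
    parser.add_argument("--do_mean_pooling", action="store_true")
    parser.add_argument("--do_cls_pooling", action="store_true")
    parser.add_argument("--do_max_pooling", action="store_true")
    args = parser.parse_args(argv)

    # At most one pooling strategy may be selected; if none is, fall back to mean pooling.
    n_pooling = sum([args.do_mean_pooling, args.do_cls_pooling, args.do_max_pooling])
    if n_pooling > 1:
        raise AttributeError("Only one pooling strategy can be selected.")
    if n_pooling == 0:
        args.do_mean_pooling = True
    return args


def resolve_warmup_steps(args, num_train_steps):
    """Keep a user-defined warmup_steps value; otherwise default to 10% of training steps."""
    if args.warmup_steps is not None:
        return args.warmup_steps
    return int(num_train_steps * 0.1)
```

Summing the three boolean flags makes the mutual-exclusion check a single comparison; an alternative would be argparse's `add_mutually_exclusive_group`, though that exits with a usage error instead of raising AttributeError as the issue asks.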

gsarti avatar Mar 25 '20 09:03 gsarti