
gradient_accumulation_batch_size missing in trainer

carbonz0 opened this issue on Oct 24, 2019 · 3 comments

I found that gradient_accumulation_batch_size appears in several scibert configs, such as https://github.com/allenai/scibert/blob/8562a120e6788dcbadbe05ef7fd4463dee17ee59/allennlp_config/ner.json, but the allennlp trainer doesn't have this parameter: https://github.com/allenai/allennlp/blob/master/allennlp/training/trainer.py

carbonz0 · Oct 24, 2019

Yes, AllenNLP doesn't support gradient accumulation. We have it implemented in our fork of allennlp (see the requirements file: https://github.com/allenai/scibert/blob/master/requirements.txt)

ibeltagy · Oct 29, 2019
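For context: gradient accumulation simulates a large effective batch by summing gradients over several smaller micro-batches before each optimizer step. Below is a minimal PyTorch-style sketch of the idea, not the fork's actual implementation; `accumulation_steps`, `loader`, and `loss_fn` are illustrative names:

```python
def train_epoch(model, loader, optimizer, loss_fn, accumulation_steps=4):
    """Sketch: accumulate gradients over `accumulation_steps` micro-batches.

    Assumes PyTorch-style model, optimizer, and a loader yielding
    (inputs, targets) pairs.
    """
    model.train()
    optimizer.zero_grad()
    for step, (inputs, targets) in enumerate(loader):
        loss = loss_fn(model(inputs), targets)
        # Scale the loss so the summed gradient matches the mean over
        # the larger effective batch.
        (loss / accumulation_steps).backward()
        if (step + 1) % accumulation_steps == 0:
            optimizer.step()       # apply the accumulated gradient
            optimizer.zero_grad()  # reset for the next effective batch
```

With micro-batches of size 8 and accumulation_steps=4, this behaves roughly like a batch size of 32, which is presumably the effective batch size that gradient_accumulation_batch_size sets in the fork's configs.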

got it, thank you!

carbonz0 · Nov 1, 2019

Hi, the installation command in the README returned the "allennlp version not found" error below. What is the right way to install the tool and reproduce the results?

Thank you very much!

Obtaining allennlp from git+git://github.com/ibeltagy/allennlp@fp16_and_others#egg=allennlp (from -r requirements.txt (line 1))
WARNING: Discarding git+git://github.com/ibeltagy/allennlp@fp16_and_others#egg=allennlp. Command errored out with exit status 128: git rev-parse HEAD
Check the logs for full command output.
ERROR: Could not find a version that satisfies the requirement allennlp (unavailable)
ERROR: No matching distribution found for allennlp (unavailable)

xiaoruijiang · Jul 16, 2022
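A likely cause, given the July 2022 timestamp: GitHub permanently disabled the unauthenticated git:// protocol in March 2022, so pip's clone of any git+git:// URL now fails with exit status 128. Assuming the ibeltagy/allennlp fork and its fp16_and_others branch are still available, rewriting the first line of requirements.txt to use HTTPS should let pip fetch it:

```
git+https://github.com/ibeltagy/allennlp@fp16_and_others#egg=allennlp
```

Equivalently, `pip install git+https://github.com/ibeltagy/allennlp@fp16_and_others#egg=allennlp` installs the fork directly.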