Eric Harper
@ekmb could you look at this or assign it to someone who can?
https://github.com/NVIDIA/NeMo/pull/4410
This is the PTL default: https://github.com/PyTorchLightning/pytorch-lightning/blob/9e61de2063724ab6ff9cde75cba1a59d10ee5208/pytorch_lightning/plugins/training_type/ddp.py#L77. To use multiple nodes with the NLPDDPPlugin, you can do this: https://github.com/NVIDIA/NeMo/blob/1a57cec1ddd703c45e0f7820046eb14c2f4a5883/examples/nlp/machine_translation/enc_dec_nmt.py#L114
Hi, it looks like you are using an old version of NeMo. Could you try pretraining BERT with our latest release, 1.10.0, using our new NeMo Megatron BERT training...
I'm having trouble reproducing this. I can run the notebook on my workstation. Also, this model is not currently on HuggingFace. It's hosted on NGC:

```python
PretrainedModelInfo(
    pretrained_model_name=biomegatron345m_biovocab_30k_cased,
    description=Megatron...
```
@stevehuang52, sorry for the delay in reviewing the PR. There are currently a lot of open design questions around transformers that we are discussing. Have you tried using the NeMo Megatron transformer...
I'm having this issue as well. Even after checking out the pull request, the file is still in read-only mode and can't be edited.
jenkins
The tests are triggered; you only need to add the "Run CICD" label.