Said Bleik
I've encountered the same error on {2.0rc2, gpu, lstm, adam}. It is also intermittent in my case.
```
while len(mb) > 0:
    trainer.train_minibatch(mb)
    mb = reader.next_minibatch(minibatch_size * avg_seq_len, input_map=input_map)
```
You can do so by sampling your examples from a weighted distribution. One way is to use a torch.utils.data.WeightedRandomSampler instead of the existing RandomSampler here: https://github.com/microsoft/nlp-recipes/blob/staging/utils_nlp/models/transformers/sequence_classification.py#L257
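A minimal sketch of class-balanced sampling with `WeightedRandomSampler` (the `train_labels` list is a toy example, not from the repo; in the linked code you would pass `sampler=sampler` to the `DataLoader` in place of the `RandomSampler`):

```python
from collections import Counter

import torch
from torch.utils.data import WeightedRandomSampler

# Toy imbalanced labels: four examples of class 0, one of class 1.
train_labels = [0, 0, 0, 0, 1]
counts = Counter(train_labels)

# Weight each example inversely to its class frequency, so minority
# classes are sampled as often as majority classes in expectation.
weights = [1.0 / counts[y] for y in train_labels]

sampler = WeightedRandomSampler(
    weights, num_samples=len(train_labels), replacement=True
)

# Iterating the sampler yields dataset indices to draw each epoch.
indices = list(sampler)
```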
Good catch.
- merge both functions into `preprocess_sequence_tokens`
- function returns `(tokens, input_ids, input_mask, token_type_ids)`
- update notebook references
Which GitHub folder are you referring to?
We need to add these to the env. Ideally they should be optional:
- install NCCL2
- install Open MPI
- install g++
- `pip install horovod`
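The steps above could be sketched as follows (package names assume Ubuntu, and NCCL2 must come from NVIDIA's own repositories; the `HOROVOD_GPU_ALLREDUCE` flag tells Horovod's build to use NCCL):

```shell
# Install Open MPI and g++ (Ubuntu package names; adjust for your distro).
sudo apt-get install -y openmpi-bin libopenmpi-dev g++
# NCCL2 is distributed through NVIDIA's apt repositories and must be
# installed separately for your CUDA version.
# Build Horovod with NCCL-backed allreduce.
HOROVOD_GPU_ALLREDUCE=NCCL pip install horovod
```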
Yes, Horovod needs to be installed: https://github.com/horovod/horovod#install. We will update the notebook with that info.