BERT-NER
Model training does not work on CPU
I have cloned the code from the dev branch and am executing the following command to fine-tune the model on CPU:
python run_ner.py --cache_dir=path_to_cache --data_dir=path_to_data --bert_model=bert-base-uncased --task_name=ner --output_dir=path_to_output --no_cuda --do_train --do_eval --warmup_proportion=0.1
But I am facing the following error:
Traceback (most recent call last):
File "run_ner.py", line 611, in
What I don't understand is: when I pass the CPU flag, why is it expecting a tensor to be on the GPU?
--no_cuda has an error with the NER task, because the device can still be set to GPU here:
class Ner(BertForTokenClassification):
    def forward(self, input_ids, token_type_ids=None, attention_mask=None,
                labels=None, valid_ids=None, attention_mask_label=None):
        # ... skipping to line 47
        valid_output = torch.zeros(batch_size, max_len, feat_dim,
                                   dtype=torch.float32, device='cuda')
I changed the default device arg to cpu when I wasn't using CUDA, and everything worked as expected.
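Rather than switching the hardcoded device string by hand, a more robust fix is to allocate the buffer on whatever device the input tensors already live on, so the same code path works with and without --no_cuda. A minimal sketch of the idea (the helper function and variable names here are illustrative, not the repo's exact code):

```python
import torch

def make_valid_output(sequence_output: torch.Tensor) -> torch.Tensor:
    """Allocate a zero buffer on the same device as the model's input."""
    batch_size, max_len, feat_dim = sequence_output.shape
    # sequence_output.device is cpu when running with --no_cuda,
    # cuda when the model was moved to a GPU -- no branching needed.
    return torch.zeros(batch_size, max_len, feat_dim,
                       dtype=torch.float32,
                       device=sequence_output.device)

seq = torch.randn(2, 8, 4)          # a CPU tensor, as in CPU-only training
out = make_valid_output(seq)
print(out.device.type)              # cpu
```

Deriving the device from an existing tensor (or from `next(self.parameters()).device` inside the model) avoids ever hardcoding 'cuda' or 'cpu'.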