
Week 0 inference does not work

KansaiUser opened this issue 5 months ago • 2 comments

I ran the train.py script, but when I run inference.py I get:

python inference.py 
/home/me/anaconda3/envs/MLProj38/lib/python3.8/site-packages/transformers/tokenization_utils_base.py:1601: FutureWarning: `clean_up_tokenization_spaces` was not set. It will be set to `True` by default. This behavior will be depracted in transformers v4.45, and will be then set to `False` by default. For more details check this issue: https://github.com/huggingface/transformers/issues/31884
  warnings.warn(
Traceback (most recent call last):
  File "inference.py", line 34, in <module>
    print(predictor.predict(sentence))
  File "inference.py", line 19, in predict
    logits = self.model(
  File "/home/me/anaconda3/envs/MLProj38/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/me/anaconda3/envs/MLProj38/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/media/sensetime/cbe421fe-1303-4821-9392-a849bfdd00e21/MyStudy/MLOps/MLOps-Basics/week_0_project_setup/model.py", line 19, in forward
    outputs = self.bert(input_ids=input_ids, attention_mask=attention_mask)
  File "/home/me/anaconda3/envs/MLProj38/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/me/anaconda3/envs/MLProj38/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/me/anaconda3/envs/MLProj38/lib/python3.8/site-packages/transformers/models/bert/modeling_bert.py", line 1077, in forward
    embedding_output = self.embeddings(
  File "/home/me/anaconda3/envs/MLProj38/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/me/anaconda3/envs/MLProj38/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/me/anaconda3/envs/MLProj38/lib/python3.8/site-packages/transformers/models/bert/modeling_bert.py", line 210, in forward
    inputs_embeds = self.word_embeddings(input_ids)
  File "/home/me/anaconda3/envs/MLProj38/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/me/anaconda3/envs/MLProj38/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/me/anaconda3/envs/MLProj38/lib/python3.8/site-packages/torch/nn/modules/sparse.py", line 164, in forward
    return F.embedding(
  File "/home/me/anaconda3/envs/MLProj38/lib/python3.8/site-packages/torch/nn/functional.py", line 2267, in embedding
    return torch.embedding(weight, input, padding_idx, scale_grad_by_freq, sparse)
RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu! (when checking argument for argument index in method wrapper_CUDA__index_select)

Apparently the error comes from this call in inference.py:

logits = self.model(
            torch.tensor([processed["input_ids"]]),
            torch.tensor([processed["attention_mask"]]),
        )

Any idea how to make the script work?
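For what it's worth, this looks like the usual PyTorch device mismatch: the model's weights are on cuda:0 while the input tensors are created on the CPU by default. A common fix (a sketch, not code from the repo; the toy model below just stands in for the BERT classifier) is to create the inputs on the same device as the model's parameters before the forward pass:

```python
import torch
import torch.nn as nn

# Toy stand-in for the predictor's model; the real one is a BERT classifier
# loaded from a Lightning checkpoint.
model = nn.Sequential(nn.Embedding(100, 8), nn.Flatten(0), nn.Linear(32, 2))

# Mimic inference.py's situation: the model may end up on the GPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)

input_ids = [1, 2, 3, 4]  # stand-in for processed["input_ids"]

# The fix: ask the model where its weights live, and build the input
# tensor on that same device instead of the CPU default.
model_device = next(model.parameters()).device
inputs = torch.tensor(input_ids, device=model_device)

logits = model(inputs)  # no cross-device error now
print(logits.shape)
```

In the predictor itself the same idea would read `torch.tensor([processed["input_ids"]], device=next(self.model.parameters()).device)` for both arguments; alternatively, moving the model to CPU for inference with `self.model.to("cpu")` also resolves the mismatch.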

By the way, the link to the blog posts is dead, so there is no explanation of the code here.

KansaiUser · Sep 21 '24 13:09