transformers
TypeError: type_as() missing 1 required positional arguments: "other"
I set up my code as below and keep getting this error from the transformers model. Could anyone help fix it? If there is a specific input format I need to use, please let me know.
```python
config = TransfoXLConfig()
model = TransfoXLModel.from_pretrained('transfo-xl-wt103', config=config)

out = model(inputs_embeds=obs_encoding.unsqueeze(1), output_hidden_states=True, mems=memory)
```
```
----> 8 out = model(inputs_embeds = obs_encoding.unsqueeze(1), output_hidden_states = True, mems = memory)
      9
     10 # memory = out.mems

2 frames
/usr/local/lib/python3.10/dist-packages/transformers/models/deprecated/transfo_xl/modeling_transfo_xl.py in forward(self, input_ids, mems, head_mask, inputs_embeds, output_attentions, output_hidden_states, return_dict)
    943     attentions = [] if output_attentions else None
    944     if self.attn_type == 0:  # default
--> 945         pos_seq = torch.arange(klen - 1, -1, -1.0, device=word_emb.device, dtype=torch.int64).type_as(
    946             dtype=word_emb.dtype
    947         )

TypeError: type_as() missing 1 required positional arguments: "other"
```
Hi @dyan-lee, thanks for raising an issue!
Could you provide:
- The running environment: run `transformers-cli env` in the terminal and copy-paste the output
- More information so that the code example can be run and reproduced. Specifically, what are the types and shapes of `memory` and `obs_encoding`?
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
I'm also running into this issue and can reproduce it in an isolated way, e.g.:
```python
some_tensor = torch.tensor([1, 2, 3, 4])
print(f"type {type(some_tensor)}, dtype {some_tensor.dtype}")
pos_seq = torch.arange(6 - 1, -1, -1.0, device=some_tensor.device, dtype=torch.int64)
# pos_seq_with_err = pos_seq.type_as(dtype=some_tensor.dtype)
pos_without_err = pos_seq.type_as(some_tensor)
```

which outputs: `type <class 'torch.Tensor'>, dtype torch.int64`
But if you uncomment the `pos_seq_with_err` line, it shows the same error:
```
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[48], line 4
      2 print(f"type {type(some_tensor)}, dtype {some_tensor.dtype}")
      3 pos_seq = torch.arange(6 - 1, -1, -1.0, device=some_tensor.device, dtype=torch.int64)
----> 4 pos_seq_with_err = pos_seq.type_as(dtype=some_tensor.dtype)
      5 # pos_without_err = pos_seq.type_as(some_tensor)

TypeError: type_as() missing 1 required positional arguments: "other"
```
I think `type_as` is just being used incorrectly: per https://pytorch.org/docs/stable/generated/torch.Tensor.type_as.html it takes a tensor as its positional argument (`other`), not a `dtype` keyword.
I fixed it in my fork and it seems to work now: https://github.com/ellemcfarlane/transformers/commit/29739e690bb3448e68914173fea56c1974fba1cb
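For anyone who wants to see the shape of the fix without opening the commit: here is a minimal sketch in plain PyTorch, using a stand-in `word_emb` tensor (the real one comes from the model's embedding layer). The broken call passes a `dtype=` keyword to `type_as`, which only accepts a tensor; the fix is to pass the reference tensor itself, or equivalently to cast with `.to()`.

```python
import torch

# Stand-in for the word embedding tensor in modeling_transfo_xl.forward
# (the actual shape doesn't matter here, only its dtype and device)
word_emb = torch.zeros(3, dtype=torch.float32)
klen = 6

# Broken (raises TypeError: type_as() missing 1 required positional arguments: "other"):
# pos_seq = torch.arange(klen - 1, -1, -1.0, device=word_emb.device, dtype=torch.int64).type_as(dtype=word_emb.dtype)

# Working: type_as takes the reference tensor positionally
pos_seq = torch.arange(klen - 1, -1, -1.0, device=word_emb.device, dtype=torch.int64).type_as(word_emb)

# Equivalent alternative: cast explicitly with .to()
pos_seq_alt = torch.arange(klen - 1, -1, -1.0, device=word_emb.device, dtype=torch.int64).to(word_emb.dtype)

assert pos_seq.dtype == pos_seq_alt.dtype == torch.float32
```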
Hi @ellemcfarlane, awesome detective work - thanks for sharing!
As the model `transfo-xl` is deprecated, it isn't actively maintained and we won't be making any future changes to the modeling code. However, anyone who still wishes to use the model can apply your fix to their copy of the code 🤗
@amyeroberts
> it isn't actively maintained and we won't be making any future changes to the modeling code
That's what I figured, so no worries :)