BackendCompilerFailed
Running ModernBERT with the transformers Trainer class, with torch_compile=False, I get the following error:
raise AssertionError(
torch._dynamo.exc.BackendCompilerFailed: backend='inductor' raised:
AssertionError: Please convert all Tensors to FakeTensors first or instantiate FakeTensorMode with 'allow_non_fake_inputs'. Found in aten.embedding.default(tensor([...], device='cuda:0', size=(50368, 768), grad_fn=<BroadcastBackward>), FakeTensor(..., device='cuda:0', size=(1, s1), dtype=torch.int64), 50283)
torch version: '2.4.1+cu121' transformers version: '4.48.3'
I am not sure what the problem is here.
I'm running into the same issue on PyTorch 2.4.1+cu121 and transformers 4.49.0.
@farrokhsiar Any thoughts? Best, LB
I switched to another encoder because I couldn't get the problem solved.
Hello,
Could you please share your boilerplate?
I've run a lot of ST/PyLate training, which wraps the transformers Trainer, without hitting this issue, though I think I recall encountering it at times.
Are you compiling the model explicitly, e.g. calling model = torch.compile(model)?
Edit: it seems that upgrading PyTorch to 2.5.1 solves the issue.
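For context: even with the Trainer's torch_compile=False, ModernBERT compiles parts of itself by default, which would explain why the inductor backend shows up in the traceback. If upgrading PyTorch isn't an option, a minimal sketch of disabling that internal path, assuming the reference_compile flag exposed by the ModernBERT config in recent transformers versions:

```python
# Sketch: skip ModernBERT's internal torch.compile path so training runs
# fully eager. `reference_compile` is assumed to be the relevant
# ModernBertConfig flag; check it exists in your transformers version.
from transformers import AutoModelForMaskedLM

model = AutoModelForMaskedLM.from_pretrained(
    "answerdotai/ModernBERT-base",
    reference_compile=False,  # disable the compiled reference implementation
)
```

If this makes the AssertionError go away, it confirms the failure comes from ModernBERT's own compiled blocks rather than anything the Trainer does.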