flutter_pytorch_mobile
Error When Loading the Model
OS: Windows 11, Package: Anaconda, pytorch_mobile: ^0.2.1, PyTorch: 1.8.1
I'm getting this error when trying to load the model:
E/PyTorchMobile( 7949): assets/MobileNetV3_2.pt is not a proper model
E/PyTorchMobile( 7949): com.facebook.jni.CppException: [enforce fail at inline_container.cc:222] . file not found: archive/constants.pkl
Here are all the ways I saved the model:
import torch
from torch.utils.mobile_optimizer import optimize_for_mobile

# mobilenet is the trained MobileNetV3 model to export
quantized_model = torch.quantization.convert(mobilenet)
scripted_model = torch.jit.script(quantized_model)

# 1) weights only (state_dict); 2) and 3) full TorchScript archives
torch.save(scripted_model.state_dict(), "model_output/MobileNetV3_1.pt")
torch.jit.save(scripted_model, "model_output/MobileNetV3_2.pt")
scripted_model.save("model_output/MobileNetV3_3.pt")
# 4) lite interpreter format
opt_model = optimize_for_mobile(scripted_model)
opt_model._save_for_lite_interpreter("model_output/MobileNetV3_4.ptl")
I tried loading all of these files and got the same error. Is this a version issue? How can I save my model properly?
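A quick diagnostic that narrows this down (a sketch, reusing the output paths from the snippet above): a TorchScript .pt file is a zip archive, and the error above means the runtime could not find a constants.pkl entry inside the file it was given, so listing each file's contents shows which of them actually contain one.

import zipfile

candidates = [
    "model_output/MobileNetV3_1.pt",   # torch.save(state_dict): weights only
    "model_output/MobileNetV3_2.pt",   # torch.jit.save
    "model_output/MobileNetV3_3.pt",   # ScriptModule.save
    "model_output/MobileNetV3_4.ptl",  # lite interpreter
]

for path in candidates:
    if not zipfile.is_zipfile(path):
        print(f"{path}: not a zip archive")
        continue
    with zipfile.ZipFile(path) as zf:
        names = zf.namelist()
        has_constants = any(n.endswith("constants.pkl") for n in names)
        print(f"{path}: {len(names)} entries, constants.pkl present: {has_constants}")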
Did you get the issue resolved?
You could try with the official .pt file first.
If that works, it might be because the .pt file you supplied earlier was exported with a different (newer) PyTorch version.
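A minimal sketch of that check, assuming torchvision is installed and that a stock pretrained MobileNetV3 stands in for the official model (the output file name here is made up for the example): export it without the quantization step and see whether that .pt loads in the plugin.

import torch
import torchvision

print("exporting with torch", torch.__version__)

# Stock pretrained model; newer torchvision releases use weights=... instead of pretrained=True
model = torchvision.models.mobilenet_v3_small(pretrained=True)
model.eval()

# Trace with a dummy input and save a plain TorchScript archive
example = torch.randn(1, 3, 224, 224)
traced = torch.jit.trace(model, example)
torch.jit.save(traced, "model_output/mobilenet_v3_official.pt")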
I solved it with the following:
import torch
from torch.utils.mobile_optimizer import optimize_for_mobile

# model is already loaded onto the CPU
model.eval()
quantized_model = torch.quantization.convert(model)
scripted_model = torch.jit.script(quantized_model)
opt_model = optimize_for_mobile(scripted_model)
opt_model.save('model.pt')
Both loading the model and running prediction work well for me. Here is my platform info: OS: Windows 10, Device: Android 10, PyTorch: 1.13.0, pytorch_mobile: ^0.2.2.
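For completeness, a small round-trip check before copying model.pt into the Flutter assets folder (a sketch; the 1x3x224x224 input shape is an assumption for a MobileNet-style classifier):

import torch

# Reload the exported archive with the same PyTorch version used to save it
reloaded = torch.jit.load("model.pt", map_location="cpu")
reloaded.eval()

# Run a dummy forward pass to confirm the archive is complete and executable
with torch.no_grad():
    out = reloaded(torch.randn(1, 3, 224, 224))
print("output shape:", tuple(out.shape))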