Swin-Transformer
Problem: inference with the exported ONNX model
After training a Swin-T classification model on my own dataset (10 classes),
I converted it to an ONNX model with torch.onnx.export; code below:
```python
import torch

from models import build_model
from main import parse_option

if __name__ == '__main__':
    args, config = parse_option()
    model = build_model(config)
    checkpoint = torch.load(
        './output/swin_tiny_patch4_window7_224/default/ckpt_epoch_60.pth',
        map_location='cpu')
    # Bug in the original script: the checkpoint was loaded but never applied,
    # so the exported graph contained randomly initialized weights.
    # Swin's training script saves the state dict under the 'model' key.
    model.load_state_dict(checkpoint['model'])
    print('start eval')
    model.eval()
    dummy_input = torch.randn(1, 3, 224, 224, device='cpu')
    input_names = ['input']
    output_names = ['output']
    print('start export')
    torch.onnx.export(
        model, dummy_input,
        './output/swin_tiny_patch4_window7_224/default/swin_tiny_epoch_60.onnx',
        verbose=True, opset_version=11,
        input_names=input_names, output_names=output_names,
    )
```
But when I test this ONNX model on the training dataset, the results are quite different from those of the .pth model: almost every image is predicted as a single class.
@Handsome-cp Hi, I ran into the same problem. When I export the ONNX model the same way you did, its results also differ from the .pth model. Have you found a solution?