Torch-Pruning
After loading a pruned model, how do I test it?
# load a pruned model
model = torch.load('model.pth') # no load_state_dict
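For context on the `# no load_state_dict` comment: Torch-Pruning physically changes layer shapes, so loading a pruned `state_dict` into a freshly built copy of the original architecture fails with size-mismatch errors. Saving and loading the whole module object is one way around that. A minimal sketch with toy layers and an in-memory buffer standing in for `model.pth`:

```python
import io
import torch
import torch.nn as nn

# Toy "pruned" model (a stand-in for a Torch-Pruning result): the hidden
# width here (3) no longer matches whatever the original architecture had,
# so load_state_dict into the unpruned model would raise shape errors.
pruned = nn.Sequential(nn.Linear(8, 3), nn.ReLU(), nn.Linear(3, 2))

buffer = io.BytesIO()        # stands in for the 'model.pth' file
torch.save(pruned, buffer)   # save the whole module, not just the weights
buffer.seek(0)

# weights_only=False is required on PyTorch >= 2.6 to unpickle full modules
loaded = torch.load(buffer, weights_only=False)
assert loaded[0].out_features == 3  # pruned shape survives the round trip
```

The same pattern applies with a real file path; the point is that the serialized object carries the pruned architecture along with the weights.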
Hi @henbucuoshanghai, inference (for evaluation or testing) is pretty straightforward — you can check this link or follow the simple steps below.
Something like this should work for your model:
# load a pruned model
model = torch.load('model.pth') # no load_state_dict
model.eval()
device = torch.device('cuda:0') # or torch.device('cuda') for the default GPU
model.to(device)
# model.cuda() also works
input = torch.rand((1, 3, 224, 224)) # or your desired input
input = input.to(device) # .to() is not in-place for tensors, so reassign
# input = input.cuda() also works
with torch.no_grad():
    out = model(input)
Voila, that's it!
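If by "test" you also mean measuring accuracy on a test set, a minimal evaluation loop could look like the sketch below. The tiny `nn.Sequential` and random tensors are placeholders for your actual pruned model (loaded via `torch.load`) and your real `DataLoader`:

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Stand-in model; replace with: model = torch.load('model.pth', weights_only=False)
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 224 * 224, 10))

# Dummy data standing in for your real test set
images = torch.rand(8, 3, 224, 224)
labels = torch.randint(0, 10, (8,))
loader = DataLoader(TensorDataset(images, labels), batch_size=4)

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
model.to(device)
model.eval()  # disable dropout / use running BN statistics

correct = total = 0
with torch.no_grad():  # no gradients needed for evaluation
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        preds = model(x).argmax(dim=1)
        correct += (preds == y).sum().item()
        total += y.numel()

print(f'accuracy: {correct / total:.4f}')
```

The loop is the same whether the model is pruned or not — once loaded, a pruned model is just an ordinary `nn.Module`.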