
How to run inference on the GPU? `feature, _ = model(bgr_tensor_input)` doesn't use the GPU

Open · martinenkoEduard opened this issue 1 year ago · 1 comment


martinenkoEduard — Sep 17 '24

After training your model:

```python
import torch
import net  # the AdaFace repo's net.py

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

def load_pretrained_model():
    model = net.build_model("ir_101")
    statedict = torch.load(MODEL_WEIGHT, map_location=device)['state_dict']
    # strip the 'model.' prefix added by the training wrapper
    model_statedict = {key[6:]: val for key, val in statedict.items()
                       if key.startswith('model.')}
    model.load_state_dict(model_statedict)
    model.eval()
    return model.to(device)

model = load_pretrained_model()

feature, _ = model(batch)  # batch must already be on the GPU
```
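Concretely, the model and the input tensor must live on the same device; moving only one of them is what silently keeps inference on the CPU. Below is a minimal, self-contained sketch of that pattern — `DummyBackbone` is a placeholder standing in for the AdaFace `ir_101` backbone (the real weights aren't loaded here), but the `.to(device)` calls apply identically to the real model:

```python
import torch
import torch.nn as nn

class DummyBackbone(nn.Module):
    """Stand-in for AdaFace's ir_101; mimics its (feature, norm) output."""
    def __init__(self, dim=512):
        super().__init__()
        self.fc = nn.Linear(112 * 112 * 3, dim)

    def forward(self, x):
        feature = self.fc(x.flatten(1))
        norm = feature.norm(dim=1, keepdim=True)
        return feature, norm

# pick the GPU when one is available, otherwise fall back to CPU
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = DummyBackbone().to(device)  # move the weights to the device
model.eval()

# the input batch must be moved to the same device as the model
batch = torch.randn(4, 3, 112, 112).to(device)

with torch.no_grad():  # no gradients needed for inference
    feature, _ = model(batch)

print(feature.device)  # cuda:0 when a GPU is available, else cpu
```

If the two devices disagree, PyTorch raises a `RuntimeError` about tensors being on different devices, so a mismatch fails loudly rather than silently.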

macqueen09 — Feb 26 '25