BYOL
Bootstrap Your Own Latent: A New Approach to Self-Supervised Learning
I would like to ask about `target_encoder = self._get_target_encoder()` here: isn't the online encoder just copied directly (`target_encoder = copy.deepcopy(self.online_encoder)`)? It doesn't look like momentum is used. Maybe because I am not...
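For context on this question: in BYOL the target encoder typically *starts* as a deep copy of the online encoder, and the momentum enters afterwards, when its weights are updated as an exponential moving average of the online weights after each training step. A minimal sketch of that pattern (class and method names here are illustrative, not the repo's actual API):

```python
import copy
import torch
import torch.nn as nn

class EMATarget:
    """Maintains a momentum (EMA) copy of an online encoder."""

    def __init__(self, online_encoder: nn.Module, beta: float = 0.99):
        self.beta = beta
        # The target starts as an exact copy of the online encoder...
        self.target = copy.deepcopy(online_encoder)
        for p in self.target.parameters():
            p.requires_grad = False  # the target is never trained by backprop

    def update(self, online_encoder: nn.Module):
        # ...but after every optimizer step its weights are moved toward the
        # online weights by an exponential moving average -- the "momentum".
        for p_t, p_o in zip(self.target.parameters(),
                            online_encoder.parameters()):
            p_t.data = self.beta * p_t.data + (1 - self.beta) * p_o.data
```

So the deepcopy alone would indeed mean no momentum; the key is whether an EMA update like the one above is called during training.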
Hi there! When I train CIFAR-10 with a ResNet-18, does the accuracy improve if I increase the number of epochs? How much accuracy do you get if you run about 2000...
```
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
in ()
      1 # pre-trained model
      2 resnet = models.resnet50()
----> 3 resnet.load_state_dict(torch.load(args.model_path, map_location=device))
      4 resnet = resnet.to(device)

/usr/local/lib/python3.6/dist-packages/torch/nn/modules/module.py in load_state_dict(self, state_dict,...
```
Thanks for your excellent work! I am a beginner. The following error occurred when I combined the network structure with my own work: `RuntimeError: Only Tensors created explicitly by the user (graph...
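A likely cause of that error (this is an assumption about the truncated message, which in PyTorch usually continues "...graph leaves) support the deepcopy protocol"): `copy.deepcopy` was called on a tensor or module output that is still attached to the autograd graph, i.e. a non-leaf tensor. A minimal reproduction and the usual fix, as a sketch:

```python
import copy
import torch

x = torch.randn(3, requires_grad=True)  # leaf tensor: created by the user
y = x * 2                               # non-leaf: output of an autograd op

copy.deepcopy(x)  # fine: leaf tensors support deepcopy

try:
    copy.deepcopy(y)  # raises RuntimeError: non-leaf tensors do not
except RuntimeError as e:
    print(type(e).__name__)

# The usual fix: detach from the graph (and clone) before copying.
y_copy = copy.deepcopy(y.detach().clone())
```

If the error comes from deep-copying a whole module, the same idea applies: make sure nothing reachable from the module (e.g. a cached activation stored as an attribute) is a non-leaf tensor.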
I found that the online_encoder already has a projector (in the NetWrapper), so why is there another online_predictor in the BYOL class? It is a bit confusing.
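For readers with the same question: in BYOL the projector exists on *both* branches (it maps backbone features into the embedding space), while the predictor exists *only* on the online branch and tries to predict the target branch's projection. This asymmetry between the branches is part of what prevents representational collapse. A minimal sketch with made-up dimensions (the repo's actual sizes may differ):

```python
import torch
import torch.nn as nn

def mlp(dim_in, dim_hidden, dim_out):
    # Both projector and predictor are small MLPs of this shape in BYOL.
    return nn.Sequential(
        nn.Linear(dim_in, dim_hidden),
        nn.BatchNorm1d(dim_hidden),
        nn.ReLU(inplace=True),
        nn.Linear(dim_hidden, dim_out),
    )

backbone_dim, proj_dim, hidden = 512, 256, 1024  # illustrative sizes

online_projector = mlp(backbone_dim, hidden, proj_dim)
online_predictor = mlp(proj_dim, hidden, proj_dim)      # online branch ONLY
target_projector = mlp(backbone_dim, hidden, proj_dim)  # no predictor here

feats = torch.randn(4, backbone_dim)      # stand-in for backbone features
z_online = online_projector(feats)
q = online_predictor(z_online)            # prediction of the target projection
with torch.no_grad():                     # target branch gets no gradients
    z_target = target_projector(feats)
# The BYOL loss then compares q against z_target (symmetrized over the
# two augmented views).
```

So the projector inside NetWrapper and the predictor in the BYOL class play different roles; removing the predictor would make the two branches symmetric, which is known to hurt BYOL.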
Before pre-training, I save a checkpoint (which should be very weak in representation quality) and perform logistic regression on it as: `python3 logistic_regression.py --model_path=./model-no-train.pt`, and the result is Epoch [296/300]: Loss/train: 0.4051439380645752...