
Error when using SoftTreeSupLoss

Open Muzijiajian opened this issue 5 years ago • 1 comment

Hello, I am trying to use SoftTreeSupLoss with the following code:

```python
from nbdt.loss import SoftTreeSupLoss

train_loss_fn = nn.CrossEntropyLoss().cuda()
criterion = SoftTreeSupLoss(criterion=train_loss_fn,
                            dataset='Imagenet1000',
                            tree_supervision_weight=1.0,
                            hierarchy='induced-efficientnet_b7b')
...
for i, (input, targets) in enumerate(train_loader):
    targets = targets.cuda(async=True)
    input_var = torch.autograd.Variable(input).cuda()
    targets_var = torch.autograd.Variable(targets).cuda()
    scores = model(input_var)
    loss = criterion(scores, targets_var)
```
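As a side note, the snippet above uses two idioms that were deprecated years before PyTorch 1.4: `torch.autograd.Variable` (tensors carry autograd state directly since 0.4) and `targets.cuda(async=True)` (replaced by `non_blocking=True`). A minimal CPU-only sketch of the equivalent modern loop is below; the `nn.Linear` model, batch shapes, and plain `CrossEntropyLoss` are placeholder stand-ins for the user's EfficientNet model and NBDT criterion, not part of the original report:

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins: a tiny linear "model" and a plain cross-entropy
# criterion, in place of the user's EfficientNet + SoftTreeSupLoss setup.
model = nn.Linear(8, 1000)
criterion = nn.CrossEntropyLoss()

inputs = torch.randn(4, 8)               # batch of 4, feature dim 8
targets = torch.randint(0, 1000, (4,))   # class indices in [0, 1000)

# Modern style: no Variable wrapper; tensors are autograd-aware directly.
# On GPU you would move data with .cuda(non_blocking=True) instead of
# the removed async=True keyword.
scores = model(inputs)
loss = criterion(scores, targets)
loss.backward()
```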

Then I get the following error:

```
File "/gruntdata/semantic-hierarchy-master/neural-backed-decision-trees/nbdt/model.py", line 240, in forward
    wnid_to_outputs = self.forward_nodes(outputs)
File "/gruntdata/semantic-hierarchy-master/neural-backed-decision-trees/nbdt/model.py", line 101, in forward_nodes
    return self.get_all_node_outputs(outputs, self.nodes)
File "/gruntdata/semantic-hierarchy-master/neural-backed-decision-trees/nbdt/model.py", line 90, in get_all_node_outputs
    node_logits = cls.get_node_logits(outputs, node)
File "/gruntdata/semantic-hierarchy-master/neural-backed-decision-trees/nbdt/model.py", line 79, in get_node_logits
    for new_label in range(node.num_classes)
File "/gruntdata/semantic-hierarchy-master/neural-backed-decision-trees/nbdt/model.py", line 79, in <listcomp>
    for new_label in range(node.num_classes)
AttributeError: 'Tensor' object has no attribute 'T'
```

Muzijiajian avatar Nov 14 '20 11:11 Muzijiajian

@Muzijiajian Hm, are you on PyTorch 1.4? https://github.com/alvinwan/neural-backed-decision-trees/blob/master/requirements.txt#L2. Your code looks right, and tensors should definitely support .T for transpose.
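To expand on why the PyTorch version matters: the `Tensor.T` property (transpose by reversing all dimensions) was only added in a relatively recent PyTorch release, so older installs raise exactly this `AttributeError`. If upgrading is not an option, a version-agnostic helper can reproduce the same behavior with `permute`, which has existed much longer. This helper is an illustration, not part of the NBDT codebase:

```python
import torch

def transpose_compat(t):
    # Reverse all dimensions, matching the semantics of the
    # Tensor.T property, but using permute() so it also works on
    # older PyTorch versions that lack the .T attribute.
    return t.permute(*reversed(range(t.dim())))

x = torch.randn(3, 5)
y = transpose_compat(x)  # shape (5, 3), same as x.T on recent PyTorch
```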

alvinwan avatar Nov 17 '20 21:11 alvinwan