
Why does the ignored label appear in the prediction map?

darrenzhang1007 opened this issue 6 years ago · 14 comments

Sorry, I'm bothering you again.

I've run into a problem again:

1. I trained some models and the results are bad.
2. In the final confusion matrix, a lot of predictions appear in the Undefined column.

Confusion matrix:

    [[  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0]
     [  0  19   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0]
     [  0   0 568   0   0   0   0   0   0   0   0   1   2   0   0   0   0]
     [ 66   0   1 265   0   0   0   0   0   0   0   0   0   0   0   0   0]
     [  5   0   0   0  90   0   0   0   0   0   0   0   0   0   0   0   0]
     [ 27   0   0   0   0 166   0   0   0   0   0   0   0   0   0   0   0]
     [  0   0   0   0   0   0 292   0   0   0   0   0   0   0   0   0   0]
     [  0   0   0   0   0   0   0  11   0   0   0   0   0   0   0   0   0]
     [ 11   0   0   0   0   0   0   0 180   0   0   0   0   0   0   0   0]
     [  0   0   0   0   0   0   0   0   0   8   0   0   0   0   0   0   0]
     [ 31   0   0   0   0   0   0   0   0   0 358   0   0   0   0   0   0]
     [ 36   0   0   0   0   0   0   0   0   0   0 946   0   0   0   0   0]
     [ 14   0   0   0   0   0   0   0   0   0   0   0 223   0   0   0   0]
     [  0   0   0   0   0   0   0   0   0   0   0   0   0  82   0   0   0]
     [  0   0   0   0   0   0   0   0   0   0   0   0   0   0 506   0   0]
     [ 50   0   0   0   0   0   0   0   0   0   0   0   0   0   0 105   0]
     [  0   0   0   0   0   0   0   0   0   0   0   0   2   0   0   0  35]]
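For reference, a quick way to see how much leaks into the undefined column, assuming the matrix is read the usual way (rows = ground truth, columns = predictions) and stored as a NumPy array; the small matrix below is a made-up toy, not the one above:

```python
import numpy as np

# Toy 3-class illustration: rows are ground truth, columns are predictions,
# and index 0 is the "Undefined" / ignored label.
cm = np.array([
    [ 0,   0,  0],
    [66, 265,  1],
    [ 5,   0, 90],
])

# Labelled test pixels that the model predicted as "Undefined":
leaked = cm[1:, 0].sum()
print(leaked)  # 71 pixels leaked into the ignored class
```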

This does not seem reasonable to me, and I think it is what drives the poor final results.

Is there any way to solve this problem?

@nshaud

darrenzhang1007 avatar Oct 28 '19 13:10 darrenzhang1007

@DarrenmondZhang This is indeed troublesome. You should not have any prediction in the undefined column. On what dataset is this?

nshaud avatar Nov 05 '19 16:11 nshaud

> @DarrenmondZhang This is indeed troublesome. You should not have any prediction in the undefined column. On what dataset is this?

My datasets are IP and PU. This problem exists on any dataset.

This problem is common; many people have run into it.

https://github.com/eecn/Hyperspectral-Classification/issues/5 https://gitlab.inria.fr/naudeber/DeepHyperX/issues/1

Everyone in the issues linked above is waiting for a solution.

I have no idea how to solve this problem. o(╥﹏╥)o

@nshaud I'm on my knees begging for a solution (๑*◡*๑)

darrenzhang1007 avatar Nov 11 '19 08:11 darrenzhang1007

I'm looking into it.

nshaud avatar Nov 18 '19 10:11 nshaud

In my opinion: at main.py line 170, N_CLASSES = len(LABEL_VALUES), which includes the undefined label. At model.py line 33, n_classes = kwargs['n_classes'], so our model tries to learn the actual classes + 1 categories. Take the PU data as an example: the targets are in [1, 9], but the predictions are in [0, 9]. That is why the confusion matrix looks like that. However, in the get_model function we set weights[torch.LongTensor(kwargs['ignored_labels'])] = 0, which is passed to the criterion, so the ignored label has no influence on the loss.
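A minimal sketch of the mechanism described above, with illustrative values rather than the exact DeepHyperX code: the ignored label keeps an output neuron, but its weight in the cross-entropy loss is zeroed, so it is never reinforced during training, yet nothing stops it from winning the argmax at inference:

```python
import torch
import torch.nn as nn

n_classes = 10        # 9 real PU classes + 1 "undefined" label at index 0
ignored_labels = [0]

# As in get_model: zero the loss weight of the ignored label
weights = torch.ones(n_classes)
weights[torch.LongTensor(ignored_labels)] = 0.0
criterion = nn.CrossEntropyLoss(weight=weights)

logits = torch.randn(4, n_classes)    # dummy network outputs for 4 pixels
targets = torch.tensor([1, 5, 9, 3])  # ground truth is always in [1, 9]

loss = criterion(logits, targets)     # class 0 contributes nothing here...
preds = logits.argmax(dim=1)          # ...but argmax can still return 0
```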

eecn avatar Nov 19 '19 14:11 eecn

@eecn You are right, the ignored labels (0) are taken into account in the final class count (9 classes + 1 undefined = 10 classes) but the weight in the loss is set to 0, so it should not be learnt.

I am unsure how the undefined labels can leak into inference.

nshaud avatar Nov 25 '19 14:11 nshaud

I suspect this is somehow linked to the problem.

nshaud avatar Nov 25 '19 15:11 nshaud

[screenshots: prediction maps with black (ignored) pixels along the image edges]

@eecn Hello both. I am running the code with my own data, and I think those predictions fall into the ignored label because of the 'patching' in the models. In the first picture, on the left, the region should be filled with the pink label, but the pixels along the left/right edges are predicted as black (the ignored label 0), because edge pixels do not have enough neighbours around them to form a proper patch, so the models classify them as ignored. Models without a patch_size parameter, such as SVM or SGD, do not have this kind of problem. I assume that padding the image borders, or removing the edges from the test ground truth, would solve the problem.
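A rough sketch of the padding idea (variable names and the 'reflect' mode are my assumptions, not the repository's code): pad the cube by half a patch on each side before inference so that edge pixels also get full patches, then crop the prediction map back:

```python
import numpy as np

patch_size = 7                 # whatever --patch_size the model was trained with
margin = patch_size // 2

# img: hyperspectral cube of shape (height, width, bands); dummy data here
img = np.random.rand(145, 145, 200).astype(np.float32)

# Pad only the spatial dimensions; 'reflect' mirrors real spectra into the border
img_padded = np.pad(img, ((margin, margin), (margin, margin), (0, 0)),
                    mode='reflect')

# Run inference on img_padded, then crop the prediction map back to the
# original size: prediction = prediction_padded[margin:-margin, margin:-margin]
```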

YiwanChen avatar Feb 26 '20 02:02 YiwanChen

@YiwanChen I see, thanks. You might be right. I do not have a lot of time to debug this but I'll try to fix this sometime.

nshaud avatar Feb 28 '20 14:02 nshaud

> @YiwanChen I see, thanks. You might be right. I do not have a lot of time to debug this but I'll try to fix this sometime.

Thanks. I tried padding the four edges of my data, and it does solve the problem of predictions in the ignored class.

YiwanChen avatar Feb 28 '20 22:02 YiwanChen

> @YiwanChen I see, thanks. You might be right. I do not have a lot of time to debug this but I'll try to fix this sometime.
>
> Thanks. I tried padding the four edges of my data, and it does solve the problem of predictions in the ignored class.

Hello, can you show in detail how to change the padding? I have been bothered by the same problem for days.

yinlotus avatar Mar 27 '20 17:03 yinlotus

Hello, the problem lies in the variable 'probs' in the 'test' function of model.py. This variable is initialized to zero and then filled with predicted values during inference, but the marginal parts stay at zero because the 'sliding_window' function actually skips those marginal pixels. I am interested in fixing this bug, and I am planning to share new scripts in the future. :-)
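A simplified illustration of that behaviour (made-up shapes, not the actual test function): if the sliding window only visits interior pixels, the border entries of probs stay all-zero, and argmax over an all-zero score vector returns index 0, i.e. the ignored label:

```python
import numpy as np

h, w, n_classes = 10, 10, 4
patch_size = 5
margin = patch_size // 2

probs = np.zeros((h, w, n_classes))

# Stand-in for sliding_window: only interior pixels receive a prediction
for x in range(margin, h - margin):
    for y in range(margin, w - margin):
        probs[x, y] = np.random.rand(n_classes)  # fake class scores

prediction = probs.argmax(axis=-1)
print(prediction[0])  # the whole first row is 0, i.e. the ignored label
```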

mengxue-rs avatar Nov 12 '20 13:11 mengxue-rs

Here's a solution that may fix this problem.

First, add

    img = np.pad(img, ((patch_size // 2, patch_size // 2 + 1), (patch_size // 2, patch_size // 2 + 1), (0, 0)), 'constant')

after probs = np.zeros(img.shape[:2] + (n_classes,)) in the 'test' func of model.py.

Then, change probs[x + w // 2, y + h // 2] += out to probs[x, y] += out in the 'test' func of model.py.

Last, change the "Compute global accuracy" part like this:

    # Compute global accuracy
    total = np.sum(cm[1:, :])
    accuracy = sum([cm[x][x] for x in range(1, len(cm))])
    accuracy *= 100 / float(total)

But because the padding value is 0, the classification accuracy at the image edges is not very high. Maybe you could try a different way of padding the edges.
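As a possible variation on the snippet above (my suggestion, not part of the patch), a mirrored pad keeps real spectra at the border instead of zeros:

```python
import numpy as np

patch_size = 7
m = patch_size // 2

img = np.random.rand(64, 64, 103).astype(np.float32)  # dummy PU-sized cube

# 'constant' pads with zeros; 'reflect' (or 'symmetric') mirrors real pixels
# into the border, which usually degrades edge accuracy less.
img_reflect = np.pad(img, ((m, m + 1), (m, m + 1), (0, 0)), mode='reflect')
```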

HopePersist avatar Nov 19 '20 07:11 HopePersist

I have solved this issue. For more details, see https://github.com/nshaud/DeepHyperX/commit/7bec2ed07e56ef50768d80950597a3eb64e3bc6f

mengxue-rs avatar Nov 20 '20 07:11 mengxue-rs

Yes, thanks to @snowzm's patch I think this will be fixed soon. We just have to perform padding before processing the image.

nshaud avatar Nov 20 '20 11:11 nshaud