cmc-csci181

char level highlighting seems to be slightly different but word highlighting is correct

Open AlexKer opened this issue 5 years ago • 5 comments

Hi! I'm encountering an issue where explaining at the character level does not give me the correct highlighting for the example sentence. At the word level, however, the shades seem to be highlighted correctly. Could it be an issue with my for loop? I'm starting at index 1 and ending at the second-to-last index to avoid the BOL and EOL characters, and storing the l2norm in scores[i-1] because that vector is 0-indexed.

def explain(line,filename,explain_type):

    # one score per character in the input line
    scores = torch.zeros([len(line)])
    line_tensor = str_to_tensor([line],args.input_length)
    # line_tensor.size() == torch.Size([# of chars in sentence, 1, 76])
    output_class,_ = model(line_tensor)
    probs = softmax(output_class)

    if explain_type=='char':
        for i in range(1,len(line)-1): # could it be here?
            # zero out one character's one-hot row, then re-run the model
            line_tensor_copy = line_tensor.clone().detach()
            line_tensor_copy[i,0,:] = torch.zeros([len(vocabulary)])
            char_output,_ = model(line_tensor_copy)
            char_probs = softmax(char_output)
            # score = L2 distance from the unperturbed prediction
            scores[i-1] = torch.norm(probs-char_probs)

    elif explain_type=='word':
        word_lengths = [len(word) for word in line.split()]
        cnt = 1 # skip the BOL row

        for word_length in word_lengths:
            # zero out the whole word's rows, then re-run the model
            line_tensor_copy = line_tensor.clone().detach()
            line_tensor_copy[cnt:cnt+word_length,0,:] = torch.zeros([word_length,len(vocabulary)])
            word_output,_ = model(line_tensor_copy)
            word_probs = softmax(word_output)
            word_l2_norm = torch.norm(probs-word_probs)

            # every character in the word gets the word's score
            for i in range(cnt,cnt+word_length):
                scores[i-1]=word_l2_norm
            cnt = cnt+word_length+1 #space

    line2img(line,scores,filename)
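For reference, I'm calling it like this (the sentence and filename here are just placeholders, not my actual data):

    explain("the quick brown fox", "line0002.png", "char")
    explain("the quick brown fox", "line0002.png", "word")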

[Attached images: line0002 char and line0002 word, showing the char-level and word-level highlighting output]

AlexKer avatar Apr 24 '20 21:04 AlexKer

Hi Alex, I had a similar issue and fixed it by changing range(1,len(line)-1) to range(1,len(line)+1) :)
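If I understand str_to_tensor correctly (I'm assuming it prepends a BOL row and appends an EOL row, so line_tensor has len(line)+2 rows), the original range stops two characters short:

    line = "cat"
    # tensor rows: [BOL, 'c', 'a', 't', EOL]  -> indices 0 through 4
    # range(1, len(line)-1) == range(1, 2)    -> perturbs only 'c'
    # range(1, len(line)+1) == range(1, 4)    -> perturbs 'c', 'a', and 't'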

nikpap123 avatar Apr 24 '20 22:04 nikpap123

I changed that line and got the same results. I think the tiny variations may be fine in this case.

AlexKer avatar Apr 25 '20 01:04 AlexKer

I am using the squared L2 norm to measure distance, and it appears you are using the standard L2 norm. Squaring the values results in a larger separation between them, so more emphasis is placed on the dark green regions.

I suspect changing your line

            word_l2_norm = torch.norm(probs-word_probs)

to

            word_l2_norm = torch.norm(probs-word_probs)**2

will give values that look more like mine.
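As a toy illustration with made-up numbers:

    import torch

    # made-up distance scores; squaring stretches the gap between small and large values
    diffs = torch.tensor([0.1, 0.3, 0.6])
    print(diffs)       # tensor([0.1000, 0.3000, 0.6000])  -> max/min ratio of 6
    print(diffs ** 2)  # tensor([0.0100, 0.0900, 0.3600])  -> max/min ratio of 36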

mikeizbicki avatar Apr 26 '20 03:04 mikeizbicki

@mikeizbicki thank you, the change indeed made the highlighting contrast stronger. In practice, is the squared L2 norm used more often, or does the standard L2 norm have the same effect for explainability?

AlexKer avatar Apr 26 '20 19:04 AlexKer

Both of those are widely used, and there are literally hundreds more measures that people use. Any function that satisfies the properties of either a divergence or a metric can be used.
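For example, here is a quick sketch of two common alternatives you could swap in for torch.norm (the tensors p and q below are made-up stand-ins for probs and word_probs):

    import torch

    # stand-ins for the unperturbed and perturbed probability vectors
    p = torch.tensor([0.7, 0.2, 0.1])
    q = torch.tensor([0.5, 0.3, 0.2])

    kl = torch.sum(p * torch.log(p / q))    # KL divergence (a divergence, not a metric)
    tv = 0.5 * torch.sum(torch.abs(p - q))  # total variation distance (a metric)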

mikeizbicki avatar Apr 26 '20 21:04 mikeizbicki