
What is head_indexes_2d used for?

Hanlard opened this issue on Jan 08 '20 · 2 comments

x is the BERT representation of shape [batch_size, SEQ_LEN, 768], and there is this line of code: for i in range(batch_size): x[i] = torch.index_select(x[i], 0, head_indexes_2d[i]). What is it doing?

Hanlard, Jan 08 '20

The words are tokenized with BERT's wordpiece tokenizer, so some words are split into more than one token, e.g. lamb -> 'la', '##mb'. head_indexes_2d captures the 'head' tokens: in this example it stores the index of 'la' only, so that downstream token-level tasks such as TokenClassification (e.g. POS tagging, NER) operate on one vector per word.
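For illustration, here is a minimal sketch of how such head indices can be built. The tokenize function below is a hypothetical stand-in for BERT's wordpiece tokenizer (mirroring the 'lamb' example above), not the repo's actual code:

```python
def tokenize(word):
    # Hypothetical stand-in for BERT's wordpiece tokenizer; the real
    # splits come from the BERT vocabulary.
    table = {"lamb": ["la", "##mb"]}
    return table.get(word, [word])

words = ["the", "lamb", "sleeps"]
tokens, head_indexes = [], []
for word in words:
    head_indexes.append(len(tokens))  # position of this word's first piece
    tokens.extend(tokenize(word))

print(tokens)        # ['the', 'la', '##mb', 'sleeps']
print(head_indexes)  # [0, 1, 3]
```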

x[i] = torch.index_select(x[i], 0, head_indexes_2d[i]) then selects only the rows belonging to the head tokens, discarding the subsequent pieces of any word that was split into more than one token.
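A self-contained sketch of that selection step, with made-up shapes and indices, assuming head_indexes_2d is zero-padded out to SEQ_LEN as the batched loop suggests:

```python
import torch

batch_size, seq_len, hidden = 2, 8, 768
x = torch.randn(batch_size, seq_len, hidden)  # BERT output, one row per token

# Made-up head indices; positions past the real words are zero padding.
head_indexes_2d = torch.zeros(batch_size, seq_len, dtype=torch.long)
head_indexes_2d[0, :3] = torch.tensor([0, 1, 3])     # 3 words in sentence 0
head_indexes_2d[1, :4] = torch.tensor([0, 2, 4, 5])  # 4 words in sentence 1

for i in range(batch_size):
    # Gather the rows of the head tokens, so x[i] now holds one row per
    # word; the padded positions repeat row 0 and are presumably handled
    # by masking downstream.
    x[i] = torch.index_select(x[i], 0, head_indexes_2d[i])
```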

This is how I interpret the code. Please correct me if I've got this wrong, anyone? :D

meisin, Jul 12 '20

It maps each word to the index of its first token, because only the first token of each word is kept for training; the rest are simply discarded.

scarydemon2, Nov 22 '21