
embeddingMatrix is never passed when building model

Open airkid opened this issue 6 years ago • 3 comments

When building the model, it seems that the loaded GloVe embedding is never used.
I think that's one of the reasons I can't reproduce the experimental results.
https://github.com/lx865712528/JMEE/blob/494451d5852ba724d273ee6f97602c60a5517446/enet/models/ee.py#L20
https://github.com/lx865712528/JMEE/blob/494451d5852ba724d273ee6f97602c60a5517446/enet/run/ee/runner.py#L55

airkid avatar Feb 23 '19 14:02 airkid

Hi @airkid, I noticed the same problem, so I wrote the code below to pass the pretrained word embedding through:

    def load_model(self, fine_tune, embeddingMatrix=None):
        if fine_tune is None:
            # forward the loaded GloVe matrix to the model constructor
            return EDModel(self.a.hps, self.get_device(), embeddingMatrix=embeddingMatrix)
        # (the fine-tune branch is unchanged)
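For anyone trying this fix: the `embeddingMatrix` itself has to be built from the GloVe text file first, aligned with the dataset vocabulary. A minimal sketch of that step is below; the function name `build_embedding_matrix` and the toy data are illustrative, not part of this repo:

```python
import numpy as np

def build_embedding_matrix(glove_lines, vocab, dim, seed=0):
    """Align pretrained GloVe vectors with a vocabulary.

    glove_lines: iterable of "word v1 v2 ..." strings (GloVe text format)
    vocab: dict mapping word -> row index in the returned matrix
    Rows for words missing from GloVe keep a small random init.
    """
    rng = np.random.default_rng(seed)
    matrix = rng.uniform(-0.1, 0.1, (len(vocab), dim)).astype(np.float32)
    for line in glove_lines:
        parts = line.rstrip().split(" ")
        word, values = parts[0], parts[1:]
        if word in vocab and len(values) == dim:
            matrix[vocab[word]] = np.asarray(values, dtype=np.float32)
    return matrix

# Toy stand-in for a real glove.6B.300d.txt file:
glove = ["the 0.1 0.2 0.3", "cat 0.4 0.5 0.6"]
vocab = {"<pad>": 0, "the": 1, "cat": 2}
emb = build_embedding_matrix(glove, vocab, dim=3)
```

The resulting array can then be handed to the model constructor as `embeddingMatrix` (e.g. wrapped with `torch.nn.Embedding.from_pretrained` inside the model).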

mikelkl avatar Mar 07 '19 07:03 mikelkl

Hi @ycc1028, the paper mentions that pre-trained GloVe word embeddings are used

mikelkl avatar Mar 07 '19 10:03 mikelkl

Hi @ycc1028, even after this modification it still can't reach the reported performance, because there is also an evaluation problem (#6)

airkid avatar Mar 07 '19 10:03 airkid