
About CopyNet

Open · gmftbyGMFTBY opened this issue 5 years ago · 1 comment

Thanks for your open-source code. I have a question about your stable version of CopyNet. I see that your implementation accounts for the generation probability of the source tokens, but the original CopyNet paper also adds other mechanisms, such as the selective read, which I could not find in your code. Is this simplified CopyNet good enough for Sequicity? Could you explain the reasoning? Thank you very much!
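For reference, the selective read in the original CopyNet paper feeds the decoder a weighted sum of encoder states, where the weights are the previous step's copy probabilities restricted to source positions matching the previously emitted token. A minimal sketch of that mechanism follows; the function name and tensor layout are my own illustration, not taken from the Sequicity code:

```python
import torch

def selective_read(enc_states, copy_probs, src_ids, prev_token):
    # enc_states: [B, S, H] encoder hidden states
    # copy_probs: [B, S] copy distribution over source positions from the previous step
    # src_ids:    [B, S] source token ids
    # prev_token: [B] token emitted at the previous decoding step
    mask = (src_ids == prev_token.unsqueeze(1)).float()        # positions equal to y_{t-1}
    rho = copy_probs * mask                                    # keep copy mass only there
    rho = rho / rho.sum(dim=1, keepdim=True).clamp_min(1e-12)  # renormalize
    # Weighted sum of encoder states: [B, 1, S] x [B, S, H] -> [B, H]
    return torch.bmm(rho.unsqueeze(1), enc_states).squeeze(1)
```

The resulting vector is concatenated with the word embedding of the previous token to form the decoder input at each step, which is the part I believe is missing here.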

gmftbyGMFTBY · Mar 11 '19 04:03

Also, in tsd_net.py:

    def supervised_loss(self, pz_proba, pm_dec_proba, z_input, m_input):
        # Truncate the augmented distributions [T, B, V_aug] to the fixed
        # vocabulary before computing the NLL losses.
        pz_proba = pz_proba[:, :, :cfg.vocab_size].contiguous()
        pm_dec_proba = pm_dec_proba[:, :, :cfg.vocab_size].contiguous()
        pr_loss = self.pr_loss(pz_proba.view(-1, pz_proba.size(2)), z_input.view(-1))
        m_loss = self.dec_loss(pm_dec_proba.view(-1, pm_dec_proba.size(2)), m_input.view(-1))

        loss = pr_loss + m_loss
        return loss, pr_loss, m_loss

In this function, pz_proba is a tensor of shape [T, B, V_aug] generated by CopyNet (on the CamRest676 dataset, V_aug is 8xx [800 + seq_len]). But when you compute the NLLLoss, you only keep the first vocab_size (800) entries, so in my view the information about the source sequence is lost. In other words, I don't think this is really CopyNet.
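To illustrate the concern with a toy example (the sizes here are made up, not the real CamRest676 dimensions): slicing the augmented distribution down to the fixed vocabulary silently drops whatever probability mass was assigned to the copy slots, assuming those slots occupy the trailing seq_len positions.

```python
import torch

vocab_size, seq_len = 5, 3       # toy sizes; Sequicity uses vocab_size = 800
v_aug = vocab_size + seq_len     # augmented vocab: fixed words + copy slots

# A uniform toy distribution over the augmented vocabulary, shape [T, B, V_aug].
proba = torch.full((1, 1, v_aug), 1.0 / v_aug)

truncated = proba[:, :, :vocab_size]
# The truncated "distribution" no longer sums to 1: the probability mass
# assigned to copying source tokens (the last seq_len slots) is discarded.
print(truncated.sum().item())  # 0.625 instead of 1.0
```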

I am confused about this; could you explain it?

Thank you very much!

gmftbyGMFTBY · Mar 11 '19 10:03