Kai Shen(沈锴)
Reply to all: We will have a meeting this weekend. I expect the new release to come in two weeks. Really sorry for the delay.
Yes. Full support for the transformer is on schedule. The transformer encoder will be released in the upcoming version, and the transformer decoder will follow soon (currently in testing). @code-rex1
We are still discussing it in #578. We still need 2-3 weeks to make it ready, since it covers many features. @smith-co
I apologize for the late reply; we have been busy during the past two months. Do you still have the problem?
Our implementation of copy doesn't support multi-token copy. I think you will need to customize it. A straightforward way is to learn a hierarchical copy: I mean, first copy the node,...
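The hierarchical copy idea above could be sketched roughly as a two-level pointer: first score the nodes, then score the tokens inside each node, and multiply the two distributions. This is only an illustrative sketch, not the library's implementation; the function names and the toy scores are assumptions.

```python
import math

def softmax(scores):
    # Numerically stable softmax over a list of raw scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def hierarchical_copy(node_scores, token_scores_per_node):
    """Two-level copy distribution (hypothetical sketch):
    first pick a node, then a token within that node.
    Returns a probability for each (node_index, token_index) pair."""
    p_node = softmax(node_scores)
    dist = {}
    for i, tok_scores in enumerate(token_scores_per_node):
        p_tok = softmax(tok_scores)
        for j, p in enumerate(p_tok):
            # joint probability of copying token j from node i
            dist[(i, j)] = p_node[i] * p
    return dist

# Toy example: node 0 covers the multi-token span ["New", "York"],
# node 1 covers the single token ["city"]; scores are made up.
dist = hierarchical_copy([2.0, 0.5], [[1.0, 1.0], [0.0]])
```

Here the token-level probabilities within node 0 are equal, so the node-level score decides which span is favored.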
I apologize for the late reply; we have been busy during the past two months. This is due to an inconsistency in the docs. The variable [data_set](https://github.com/graph4ai/graph4nlp/blob/d65401bca16d15024edb68ad5e57a774c60cf881/graph4nlp/pytorch/modules/utils/vocab_utils.py#L72) should be...
Thanks for your interest in the library. We agree that a transformer is needed, and it is on schedule. I have replied to this in https://github.com/graph4ai/graph4nlp/issues/496#issuecomment-1207130759.
Thanks for your interest. The following are some personal insights. For Q1: yes, edges contain additional information that can usually help downstream tasks. For Q2: I think the...
I think it depends on your specific design. Adding useful edges will certainly help message passing, but if you add noisy or negative edges, I would guess it harms performance.
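To make the point concrete, here is a minimal, self-contained sketch of why added edges matter: in one round of mean-aggregation message passing, a node only receives a neighbor's signal if an edge connects them. The graph, features, and function are toy assumptions, not code from the library.

```python
def message_pass(features, edges):
    """One round of mean-aggregation message passing (toy sketch).
    features: {node: feature vector as a list of floats}
    edges: list of directed (src, dst) pairs
    Each node's new feature is the mean over itself and its in-neighbors."""
    nbrs = {n: [n] for n in features}  # include a self-loop for every node
    for src, dst in edges:
        nbrs[dst].append(src)
    out = {}
    for n, ns in nbrs.items():
        dim = len(features[n])
        out[n] = [sum(features[m][k] for m in ns) / len(ns) for k in range(dim)]
    return out

# Toy graph: only node 0 carries a signal.
feats = {0: [1.0], 1: [0.0], 2: [0.0]}
without = message_pass(feats, [(0, 1)])            # node 2 is isolated
with_edge = message_pass(feats, [(0, 1), (0, 2)])  # extra edge 0 -> 2
```

With the extra edge, node 2 receives node 0's signal in a single round; without it, the information never arrives. A spurious edge would propagate an unhelpful signal in exactly the same way, which is why noisy edges can hurt.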
Sorry for the late reply. Do you still have problems customizing the dataset?