
Code for our ACL 2021 paper - ConSERT: A Contrastive Framework for Self-Supervised Sentence Representation Transfer

26 ConSERT issues

Hello, two questions: 1. In the code, only `unsup-consert-base.sh` uses the `no_dropout` flag; the other scripts do not set BERT's built-in dropout to 0. Why is that? 2. When BERT's dropout is disabled, is dropout disabled for both the original sentence and the augmented sentence, or only for the augmented one?

I've been running into this issue when I run `bash scripts/unsup-consert-base.sh`:

```
Traceback (most recent call last):
  File "main.py", line 327, in <module>
    main(args)
  File "main.py", line 185, in main
    word_embedding_model...
```

The exact error is:

```
File "/data2/work2/chenzhihao/NLP/nlp/sentence_transformers/SentenceTransformer.py", line 594, in fit
    loss_value = loss_model(features, labels)
  File "/root/anaconda3/envs/NLP_py39/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1110, in _call_impl
    return forward_call(*input, **kwargs)
  File "/data2/work2/chenzhihao/NLP/nlp/sentence_transformers/losses/AdvCLSoftmaxLoss.py", line 775, in forward
    rep_a_view1 = self._data_aug(sentence_feature_a, self.data_augmentation_strategy_final_1,...
```

Hello, and thanks for sharing! Have you tried combining SimCSE with the various data-augmentation strategies from your paper? Or do you think that combination would be worthwhile for improving results? Thanks!

Cannot load the model. Code:

```
from sentence_transformers import SentenceTransformer
model = SentenceTransformer("../../models/consbert/unsup-consert-base-atec_ccks")  # the model path
```

Error message:

```
Traceback (most recent call last):
  File "/home/qhd/PythonProjects/GraduationProject/code/preprocess_unlabeled_second/sentence-bert.py", line 16, in <module>
    model =...
```

Thank you for your excellent work. One small question: after reading the code, it looks like the loss implementation does not quite match the NT-Xent loss formula?
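For reference when comparing against the code, here is a minimal NumPy sketch of the NT-Xent loss as usually defined (temperature-scaled cross-entropy over in-batch negatives, with each view's augmented counterpart as its positive). This is an illustrative implementation, not the repo's actual `AdvCLSoftmaxLoss` code:

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.1):
    """NT-Xent over a batch of N sentence pairs.

    z1, z2: (N, d) arrays, two augmented views of the same N sentences.
    For each of the 2N views, the positive is its counterpart view and
    the remaining 2N-2 in-batch views act as negatives.
    """
    z = np.concatenate([z1, z2], axis=0)               # (2N, d)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # unit-normalize -> cosine sim
    sim = z @ z.T / temperature                        # (2N, 2N) scaled similarities
    np.fill_diagonal(sim, -np.inf)                     # exclude self-similarity
    n = z1.shape[0]
    # row i's positive is row i+n (and vice versa)
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    # cross-entropy with the positive as the target class
    logsumexp = np.log(np.exp(sim).sum(axis=1))
    loss = -(sim[np.arange(2 * n), pos] - logsumexp)
    return loss.mean()
```

Identical views (maximal positive similarity) should yield a lower loss than unrelated random views, which is a quick sanity check against any implementation.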

Hello, for the Chinese STS tasks, is the evaluation metric the same Spearman correlation coefficient used for the English tasks?
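For context, Spearman's rank correlation (the standard STS metric) is the Pearson correlation of the rank vectors of predicted similarities and gold scores. A minimal pure-NumPy sketch (simple version without tie handling; the data below is illustrative):

```python
import numpy as np

def spearman(pred, gold):
    """Spearman's rho: Pearson correlation of the rank vectors.
    Assumes no tied values (sufficient for illustration)."""
    rank = lambda x: np.argsort(np.argsort(x)).astype(float)
    rp, rg = rank(np.asarray(pred)), rank(np.asarray(gold))
    rp -= rp.mean()
    rg -= rg.mean()
    return float(rp @ rg / np.sqrt((rp @ rp) * (rg @ rg)))
```

Any monotone mapping of the predictions leaves the score unchanged, which is why Spearman (rather than Pearson) is preferred for STS: only the ranking of similarities matters.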

Hi, how do I run supervised training on a Chinese dataset? What is the exact command to use?

Hello. This code is written for single-GPU training. I tried to make it multi-GPU by wrapping the model with `torch.nn.DataParallel`, but running it then raises "dataparallel' object has no attribute". After changing every use of the model during training to `model.module`, the error goes away, but training then only uses the GPU given by `gpu_id`. How did you implement multi-GPU training in your experiments? Looking forward to your reply, thanks.
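As an aside on why the AttributeError appears: `torch.nn.DataParallel` stores the original model as `.module`, and custom attributes defined on the inner model are not forwarded through the wrapper. A minimal pure-Python sketch of this wrapping pattern (an illustrative stand-in, not PyTorch itself):

```python
class Wrapper:
    """Mimics how a DataParallel-style wrapper holds the inner model:
    custom attributes of the wrapped model are NOT forwarded."""
    def __init__(self, module):
        self.module = module

class Model:
    def __init__(self):
        self.tokenizer = "my-tokenizer"  # custom attribute on the model

wrapped = Wrapper(Model())
# wrapped.tokenizer            -> AttributeError, the same failure mode
print(wrapped.module.tokenizer)  # the model.module workaround
```

This is why replacing `model.xxx` with `model.module.xxx` silences the error; whether all GPUs are then actually used is a separate question about how the training loop dispatches batches.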