Godlike_Yue
Hello, I'd like to build on the pytorch_version branch but swap in a small BERT model such as RoBERTa-tiny or ALBERT-tiny for my experiments. How large would the required changes be?
Windows environment, jieba_fast == 0.53. Locally I used tracemalloc to print the code responsible for the memory leak: ![image](https://user-images.githubusercontent.com/23072890/171126150-22fb9c5d-c3f0-4ace-b6a1-4870a1c47828.png)
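For reference, a minimal sketch of how tracemalloc can surface the largest allocations in a running process; the allocation pattern below is illustrative only and not taken from the report above:

```python
import tracemalloc

# Begin tracking Python memory allocations.
tracemalloc.start()

# Illustrative workload: allocate a batch of byte buffers.
buffers = [bytes(1024) for _ in range(1000)]

# Snapshot current allocations and group them by source line,
# so the lines holding the most memory appear first.
snapshot = tracemalloc.take_snapshot()
top_stats = snapshot.statistics("lineno")
for stat in top_stats[:3]:
    print(stat)

tracemalloc.stop()
```

Comparing two snapshots taken before and after the suspected code path (`snapshot2.compare_to(snapshot1, "lineno")`) is often more useful for isolating a leak than a single snapshot.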
After `pip install ltp==4.1.5.post2` on Win10, importing ltp raises an error: ![image](https://user-images.githubusercontent.com/23072890/129569219-7766dd0f-b990-4b8c-b7e5-caac255f5489.png)
When I load roberta_chinese_pair_tiny via transformers, I get the following warning: 1. You are using a model of type roberta to instantiate a model of type bert. This is not supported for all configurations of models and can yield errors....
The experimental results can be reproduced on the SNLI dataset; ACC reaches 89.2: {'acc': 0.8926099348534202, 'loss': 0.3331541208659901, 'score': 0.8926099348534202, 'updates': 136400}
Could you please provide the version of transformers you used?