Jing
I found that the loss differs from the one described in the BYOL paper, which should be an L2 loss, and I didn't find an explanation... The loss in this repo is...
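For context (this is not from the repo itself, just a general identity): the L2 loss between L2-normalized vectors from the BYOL paper and the negative-cosine form that many implementations use are equal, since ||q̂ − ẑ||² = 2 − 2·⟨q̂, ẑ⟩ for unit vectors. A minimal numerical check:

```python
import numpy as np

def byol_l2_loss(q, z):
    # Paper's form: squared L2 distance between L2-normalized vectors.
    qn = q / np.linalg.norm(q)
    zn = z / np.linalg.norm(z)
    return np.sum((qn - zn) ** 2)

def neg_cosine_loss(q, z):
    # Common implementation form: 2 - 2 * cosine similarity.
    return 2 - 2 * np.dot(q, z) / (np.linalg.norm(q) * np.linalg.norm(z))

rng = np.random.default_rng(0)
q, z = rng.normal(size=8), rng.normal(size=8)
assert np.isclose(byol_l2_loss(q, z), neg_cosine_loss(q, z))
```

So a repo using `2 - 2 * cosine_similarity` (or an affine variant of it) optimizes the same objective as the paper's L2 loss; whether that is what this repo does should be confirmed against its actual loss code.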
For ltp 4.2.0: an error occurs in a special case after calling `add_words`. Intuitively, after adding a word (e.g. 'abc'), an input that contains it as a suffix of a longer token, such as "xabc", triggers the problem. Input:
```
ltp.add_words(['800000股'])
ltp.pipeline(['3800000股'], tasks=["cws", "pos"])
```
This raises a KeyError:
```
Traceback (most recent call last)
Cell In[57], line 1
----> 1 ltp.pipeline(['3800000股'], tasks=["cws", "pos"])
File D:\software\anaconda\envs\EDEE\lib\site-packages\ltp\nerual.py:24, in no_grad..wrapper(*args, **kwargs)
22...
```
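A hypothetical sketch of the suspected failure mode (this is not LTP's actual code; `segment_with_user_dict` and `tag_by_start` are made-up names for illustration): if a user-added word matches starting in the middle of a model token, the match's start index may be absent from a map keyed only by model-predicted boundaries, producing exactly this kind of KeyError.

```python
def segment_with_user_dict(text, user_words, tag_by_start):
    # Hypothetical illustration, not LTP internals: find user-word matches
    # anywhere in the text, then look up a tag for each match's start index.
    boundaries = set()
    for w in user_words:
        i = text.find(w)
        if i != -1:
            boundaries.add(i)
    # A match starting mid-token (index 1 below) has no entry in the map.
    return [tag_by_start[b] for b in boundaries]

text = "3800000股"
tag_by_start = {0: "m"}  # model only produced a tag for the start it predicted
try:
    segment_with_user_dict(text, ["800000股"], tag_by_start)
except KeyError as e:
    print("KeyError:", e)  # '800000股' matches at index 1, not a known boundary
```

Under this reading, '800000股' matches inside '3800000股' at offset 1, so the lookup fails, which would be consistent with the "xabc" intuition in the report.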