
SpaCy Chinese Models | Models for SpaCy that support Chinese

Results: 10 Chinese_models_for_SpaCy issues, sorted by recently updated

Where can I find this file: ./spacy-dev-resources/training/plain_word_vector.py

I have installed spacy 2.1.9. Running `spacy link zh_core_web_sm zh` reports "'spacy' is not recognized as an internal or external command"; running `python -m spacy link zh_core_web_sm zh` reports that there are insufficient privileges to perform this operation.

When loading the model with spaCy 2.3 and 2.2.4: ValueError: Unexpected character in found when decoding object value

When I try `from spacy import displacy; from tabulate import tabulate; import zh_core_web_sm; nlp = zh_core_web_sm.load()`, it gives me the following error: `ValueError: Can't read file: D:\Anaconda\lib\site-packages\zh_core_web_sm\zh_core_web_sm-0.1.0\tokenizer\cfg` And I...

When I run the code `import spacy; nlp = spacy.load('zh_core_web_sm')`, there are some errors: ValueError: could not broadcast input array from shape (128) into shape (96)

ValueError: could not broadcast input array from shape (128) into shape (96). The error is shown in the image below: ![image](https://user-images.githubusercontent.com/36957508/81768889-ba8b6600-950e-11ea-9223-126c7d01ee6f.png)

/lib/python3.7/site-packages/thinc/neural/util.py", line 145, in copy_array dst[:] = src ValueError: could not broadcast input array from shape (128) into shape (96)
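The three reports above appear to be the same failure: the packaged model stores word vectors of width 128, while the installed spaCy/thinc build allocated a table of width 96, so thinc's element-wise copy cannot broadcast one into the other. A minimal sketch reproducing the NumPy error (the widths 128 and 96 come from the tracebacks above; the variable names are illustrative):

```python
import numpy as np

# Vector row as stored in the packaged model (width 128)
src = np.zeros(128, dtype="float32")
# Destination row as allocated by the installed spaCy/thinc (width 96)
dst = np.zeros(96, dtype="float32")

try:
    dst[:] = src  # the same element-wise copy thinc's copy_array performs
except ValueError as err:
    print(err)  # could not broadcast input array ...
```

In practice this usually means the model package was built against a different spaCy version; reinstalling a model that matches the installed spaCy release avoids the mismatch.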

Hi! Thank you for your utmost efforts to build the zh_core_web_sm zh model. The "sm" means the vectors are excluded. Is that correct?

Not a bug report per se. I'm wondering how the spaCy Chinese models compare with the Stanza project? Stanza already provides Chinese support with many features (https://stanfordnlp.github.io/stanza/models.html) and has a Chinese (simplified)...