DeepKE
How to solve the problem that the model file in re can't be decoded to UTF-8?
According to the figure, cfg.lm_file is the path to a HuggingFace-style model, which is not suitable here. Did you use your own model? If so, there may be something wrong with your model file.
My model was downloaded from the website below; it is based on BERT-wwm, Chinese. Website: https://gitcode.com/zjunlp/DeepKE/blob/main/README_CNSCHEMA_CN.md
Hello, we have verified this and the issue does indeed exist. We will fix it as soon as possible, thanks!
Thank you. Please let me know as soon as the issue is solved.
May I ask if you have modified lm_file in lm.yaml? It is fp in predict.yaml that needs to be modified, not lm.yaml; please check.
Hi, we are currently on holiday for Chinese New Year; sorry for the late reply.
It seems you might have used the wrong configuration file. Has your issue been resolved?
Hi, have you solved the issue? May I close it?
I still have a problem. Where exactly do I need to make the change? The paths in my lm.yaml and predict.yaml are the same.
The first screenshot is lm.yaml; the second is predict.yaml.
Hello, the lm.yaml file does not need to be modified; you only need to restore it to the original setting lm_file: 'bert-base-chinese'.
Then what is causing this problem?
We suspect it is caused by a network issue. You can try setting up a network proxy, configuring a Hugging Face mirror site, or downloading bert-base-chinese directly from Hugging Face and pointing lm_file to the local directory.
If a network proxy doesn't solve it, is downloading from Hugging Face the only option?
Are you in mainland China? If so, you can download the model by adding a mirror site as follows:
pip install -U huggingface_hub
export HF_ENDPOINT=https://hf-mirror.com
huggingface-cli download --resume-download --local-dir-use-symlinks False google-bert/bert-base-chinese --local-dir bert-base-chinese
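For reference, a quick way to sanity-check the download afterwards (the file names below are the standard contents of a Hugging Face BERT checkpoint, listed here for illustration):

ls bert-base-chinese
# expect at least config.json, vocab.txt and pytorch_model.bin (plus tokenizer files);
# an empty directory means the download did not complete, e.g. if HF_ENDPOINT was not
# exported in the same shell session before running huggingface-cli download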
Yes, I am in China.
Has the issue been resolved?
Which one is the file path for bert-base-chinese?
You only need to run the few lines above to download the model; with this method there is no need to modify lm_file.
You can download the model with the commands above. Has the issue been resolved?
The downloaded folder is empty.
Hello, the bert-base-chinese model can be downloaded for local use from many places online, for example https://github.com/ymcui/Chinese-BERT-wwm
It may be a problem with your network.
Have you successfully downloaded the model?
There is still a problem.
Hello, this happens because the wandb API key for automatic hyperparameter tuning was not entered. If you don't need automatic tuning, comment out that line of code and it will run. wandb is quite convenient to use; you can refer to https://wandb.ai/site
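If you do want to keep the automatic tuning, a rough sketch of the usual wandb setup (run once per machine; the API key comes from your own wandb account):

pip install -U wandb
wandb login    # paste the API key from https://wandb.ai/authorize when prompted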
Why does it keep loading for a long time without producing any results?
Are you using a GPU? We recommend running on a GPU in a Linux environment; on CPU it is very slow.
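A quick way to check whether a GPU is actually visible to your environment (this assumes PyTorch is already installed, as DeepKE requires):

nvidia-smi    # lists the GPU and driver if one is available
python -c "import torch; print(torch.cuda.is_available())"    # should print True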
So is this actually running?
What is going on here?
We recommend installing a virtual Anaconda environment before running; the README has instructions. From the error, the Python library versions may be wrong; if you strictly follow the versions in requirements.txt it will run normally.
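A minimal sketch of that setup (the environment name and Python version here are placeholders; use the ones given in the README):

conda create -n deepke python=3.8 -y
conda activate deepke
pip install -r requirements.txt    # installs the pinned library versions the code was tested with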
Some of the library versions are not the latest, so do I need to install them manually?