
Repo for Chinese Medical ChatGLM: instruction fine-tuning of ChatGLM based on Chinese medical knowledge

28 Med-ChatGLM issues

Hello author, could you tell me why I get the following error when running the code?

```
Traceback (most recent call last):
  File "C:\Users\Lenovo\Desktop\Med-ChatGLM-main\Med-ChatGLM-main\infer.py", line 4, in
    tokenizer = AutoTokenizer.from_pretrained(
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\python\Lib\site-packages\transformers\models\auto\tokenization_auto.py", line 794, in from_pretrained
    config = AutoConfig.from_pretrained(
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\python\Lib\site-packages\transformers\models\auto\configuration_auto.py", line 1138, in...
```

Hello, a few questions: 1. Has the code for constructing the knowledge-base dataset been released? 2. Could the specific code for building the knowledge base be released? 3. For medical-literature extraction, which works better, UIE or cMeKG? Looking forward to your reply!

Could you explain how to evaluate and compare the model's performance? Is there an evaluation standard?
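The listing does not mention an official metric, but as a hedged sketch, a simple token-level F1 score (a common choice in QA evaluation) could serve as one starting point for comparing model answers against reference answers. The function below is illustrative only and not part of the repo; for Chinese text it splits on characters, which is a deliberately simple tokenization choice:

```python
from collections import Counter

def token_f1(prediction: str, reference: str) -> float:
    """Token-level F1 between a model answer and a reference answer.

    Splits Chinese text into individual characters, a simple but common
    baseline tokenization for evaluation.
    """
    pred_tokens = list(prediction)
    ref_tokens = list(reference)
    # Multiset intersection counts tokens shared by prediction and reference.
    common = Counter(pred_tokens) & Counter(ref_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(ref_tokens)
    return 2 * precision * recall / (precision + recall)

# Example: a partially overlapping prediction scores between 0 and 1.
score = token_f1("阿司匹林可用于解热镇痛", "阿司匹林用于解热镇痛和抗血小板")
```

A full evaluation would also need human or expert review for medical correctness, which surface-overlap metrics cannot capture.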

![image](https://github.com/SCIR-HI/Med-ChatGLM/assets/11847871/3f465785-73cc-4aff-9c2b-71681252e7b0) ![image](https://github.com/SCIR-HI/Med-ChatGLM/assets/11847871/e2861953-8636-4934-96c4-dc9daaf793f7)

```
home/remotesense/anaconda3/envs/bencao/lib/python3.9/site-packages/transformers/generation/utils.py:1201: UserWarning: You have modified the pretrained model configuration to control generation. This is a deprecated strategy to control generation and will be removed soon, in a future version. Please...
```

I can only use the GPU, and ran the code below:

```python
import torch
from transformers import AutoTokenizer, AutoModel
from modeling_chatglm import ChatGLMForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("./model", trust_remote_code=True)
model = ChatGLMForConditionalGeneration.from_pretrained("./model").half()
while...
```

How can I feed data to the model in batches, like BenTsao (本草) does, and have the model generate results?
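One generic pattern (a sketch only; `generate_answer` is a hypothetical stand-in for the repo's actual inference call, not its API) is to split the instructions into fixed-size batches and collect the outputs:

```python
def batched(items, batch_size):
    """Yield successive fixed-size batches from a list of prompts."""
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]

def run_batch_inference(instructions, generate_answer, batch_size=4):
    """Feed prompts to the model batch by batch and collect all results."""
    results = []
    for batch in batched(instructions, batch_size):
        results.extend(generate_answer(prompt) for prompt in batch)
    return results

# Stand-in for the real model call, purely for demonstration.
demo_outputs = run_batch_inference(
    ["q1", "q2", "q3", "q4", "q5"],
    generate_answer=lambda p: f"answer to {p}",
    batch_size=2,
)
```

In practice the prompts would be read from a JSON/JSONL file of instructions and `generate_answer` replaced by the model's actual generation call.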

```
[INFO|tokenization_utils_base.py:1800] 2023-08-02 20:41:21,905 >> loading file tokenizer.model
[INFO|tokenization_utils_base.py:1800] 2023-08-02 20:41:21,905 >> loading file added_tokens.json
[INFO|tokenization_utils_base.py:1800] 2023-08-02 20:41:21,905 >> loading file special_tokens_map.json
[INFO|tokenization_utils_base.py:1800] 2023-08-02 20:41:21,906 >> loading file tokenizer_config.json
[WARNING|modeling_utils.py:2092] 2023-08-02...
```

Dear Med-ChatGLM developers, I am 尖米, a developer and volunteer in the InternLM community. Your open-source work has been very inspiring to me, and I would like to discuss the feasibility and implementation path of building Med-ChatGLM with InternLM. My WeChat ID is mzm312; I hope we can get in touch for a deeper exchange.