MOSS
An open-source tool-augmented conversational language model from Fudan University
vocab.json contains 106,029 tokens, but the logit vector the model produces has dimension 107,008. Why don't these match? Doesn't this mean some positions in the logit vector correspond to tokens that cannot be decoded?
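A likely explanation (an assumption here, not confirmed by the MOSS repo) is that the output embedding matrix is padded beyond the true vocabulary size for hardware efficiency; the extra ids are untrained and the tokenizer never produces them, so nothing real becomes undecodable. A minimal sketch of guarding against the padded ids ever being sampled:

```python
# Padded logit indices (VOCAB_SIZE..LOGIT_DIM-1) have no entry in vocab.json.
# Masking them to -inf before argmax/sampling guarantees they are never chosen.
VOCAB_SIZE = 106029  # entries in vocab.json
LOGIT_DIM = 107008   # model output dimension (padded)

def mask_padded_logits(logits, vocab_size=VOCAB_SIZE):
    """Set logits at padding token ids to -inf so they can never be sampled."""
    return [x if i < vocab_size else float("-inf") for i, x in enumerate(logits)]

# Toy check: even a spuriously large padded logit can never win the argmax.
logits = [0.0] * LOGIT_DIM
logits[LOGIT_DIM - 1] = 99.0
masked = mask_padded_logits(logits)
best = max(range(LOGIT_DIM), key=masked.__getitem__)
assert best < VOCAB_SIZE
```

In practice `tokenizer.decode` simply never sees those ids during greedy decoding or sampling over the real vocabulary.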
README.md says "Currently triton only supports Linux and WSL; Windows and macOS are not yet supported, please wait for future updates." However, on Windows and macOS it can be installed manually from [https://github.com/openai/triton](https://github.com/openai/triton):

## Install from source

```
git clone https://github.com/openai/triton.git
cd triton/python
pip install cmake  # build-time dependency
pip install -e .
```
First, thank you for open-sourcing the data. Entries such as id=3 in zh_helpfulness or id=6 in zh_honesty contain text like "我的创造者是复旦大学自然语言处理实验室和上海人工智能实验室" ("My creators are the Fudan NLP Lab and the Shanghai AI Laboratory"). This is not good for training our own model...
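One way to handle this before fine-tuning on your own model is to filter out such self-identification samples. A minimal sketch (the field layout is hypothetical; adjust it to the dataset's real JSON schema):

```python
import json

# Strings identifying the original creators (from the samples quoted above).
CREATOR_MARKERS = [
    "复旦大学自然语言处理实验室",  # Fudan NLP Lab
    "上海人工智能实验室",          # Shanghai AI Laboratory
]

def drop_creator_samples(samples, markers=CREATOR_MARKERS):
    """Return only the samples whose serialized text contains no marker."""
    keep = []
    for sample in samples:
        text = json.dumps(sample, ensure_ascii=False)
        if not any(marker in text for marker in markers):
            keep.append(sample)
    return keep
```

Alternatively, such strings could be rewritten to name your own project instead of being dropped, which preserves the sample count.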
"Welcome to the MOSS AI assistant! Type your message to chat. Type clear to reset the conversation history, or stop to end the session." After entering a query ("你"), the following error appears:

```
/usr/bin/ld: cannot find -lcuda: No such file or directory
collect2: error: ld returned 1 exit status
╭─────────────────────────────── Traceback (most...
```
```
--- Logging error ---
Traceback (most recent call last):
  File "/home/hhh/.conda/envs/python38/lib/python3.8/site-packages/torch/distributed/elastic/agent/server/api.py", line 723, in run
    result = self._invoke_run(role)
  File "/home/hhh/.conda/envs/python38/lib/python3.8/site-packages/torch/distributed/elastic/agent/server/api.py", line 864, in _invoke_run
    time.sleep(monitor_interval)
  File "/home/hhh/.conda/envs/python38/lib/python3.8/site-packages/torch/distributed/elastic/multiprocessing/api.py", line 62, in...
```
Following the steps, the whole session looks like this:

```
>>> from transformers import AutoTokenizer, AutoModelForCausalLM
>>> tokenizer = AutoTokenizer.from_pretrained("fnlp/moss-moon-003-sft-int4", trust_remote_code=True)
>>> model = AutoModelForCausalLM.from_pretrained("fnlp/moss-moon-003-sft-int4", trust_remote_code=True).half().cuda()
>>> # smaller GPU-memory footprint: model = AutoModelForCausalLM.from_pretrained("fnlp/moss-moon-003-sft-int4", trust_remote_code=True).half().quantize(4,-1).cuda()
>>> model = model.eval()
>>> meta_instruction =...
```
Is anyone here in the official WeChat group? Could you add me? Thanks. My WeChat ID is xyeagle.