阿丹(adan)
File "D:\anoconda\envs\opennre\lib\multiprocessing\spawn.py", line 116, in spawn_main exitcode = _main(fd, parent_sentinel) File "D:\anoconda\envs\opennre\lib\multiprocessing\spawn.py", line 125, in _main prepare(preparation_data) File "D:\anoconda\envs\opennre\lib\multiprocessing\spawn.py", line 236, in prepare _fixup_main_from_path(data['init_main_from_path']) File "D:\anoconda\envs\opennre\lib\multiprocessing\spawn.py", line 287, in _fixup_main_from_path...
May I ask which model is used for the bag-level task? Is it provided in this package? Also, what accuracy and F1 score does the BERT-based bag-level relation extraction task reach? I could not find these figures in the paper. Thank you very much for sharing; I have strongly recommended this work in my paper.
MiniCPM delivers good results with a small parameter count and combines well with chat-RTX, so please support MiniCPM.
### What happened?
When I use the Android NDK to compile the bin file on Linux, then transfer the bin file to Android Termux and execute it, an error will...
Hello, I am an open-source staff member at openbmb. The openbmb/minicpm series has a very large audience in China. Many of our users hope that ollama can officially support...
openbmb/minicpm is one of the best small models in China. I want to open a pull request to support MiniCPM; may I do this?
I am a staff member of China's openbmb lab. This pull request adds support for openbmb/minicpm to auto_gptq; if it meets your requirements, please merge it. I...
Added support for minicpmv2.6. If you only run inference with the AWQ-quantized minicpmv model, you can use it directly. However, if you need to quantize minicpmv2.6, you need to replace...
I want to use AWQ to quantize a model and then use llama.cpp to convert it to GGUF, but when I followed the tutorial I got an error:
Traceback (most recent call last):
  File "/root/ld/ld_project/llama.cpp/convert_minicpm.py", line...
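For context, a minimal sketch of the AWQ quantization step being described, assuming the AutoAWQ library is installed; the model and output paths below are placeholders, not the ones from the original report, and the subsequent GGUF conversion would be done with llama.cpp's own converter script, which is not shown here.

```python
# Minimal AWQ quantization sketch with AutoAWQ (assumed setup; paths are placeholders).
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer

model_path = "openbmb/MiniCPM-2B-sft-bf16"   # hypothetical source model
quant_path = "minicpm-awq"                   # hypothetical output directory

# Common 4-bit AWQ settings for AutoAWQ.
quant_config = {"zero_point": True, "q_group_size": 128, "w_bit": 4, "version": "GEMM"}

# Load the full-precision model and its tokenizer.
model = AutoAWQForCausalLM.from_pretrained(model_path)
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

# Quantize and save; the saved directory would then be passed to a llama.cpp converter.
model.quantize(tokenizer, quant_config=quant_config)
model.save_quantized(quant_path)
tokenizer.save_pretrained(quant_path)
```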
Dear author, I am a staff member of the openbmb community. I would like to open a pull request to support the openbmb/minicpm model, but I find that I do...