BothSavage

8 comments by BothSavage

https://bothsavage.github.io/article/240119-notion

> Have you managed to deploy this on Pages yet?

https://bothsavage.github.io/article/240119-notion

> Does anyone have the error below? I have already unrar'd the archive and run mkdir weights.
> ValueError: The passed save_path is not a valid checkpoint: models/weights/cpm_hand
> ![screenshot](https://user-images.githubusercontent.com/38107458/77247410-efe0a980-6c6b-11ea-8d5e-4aad5afdc9bc.png)

the right...

Solved on Mac now. Write-up: https://bothsavage.github.io/article/240810-minicpm2.6 PR submitted: https://github.com/OpenBMB/MiniCPM-V/pull/461 Modify the web_demo_2.6.py file:

```
# fix the imports
def fixed_get_imports(filename: Union[str, os.PathLike]) -> list[str]:
    imports = get_imports(filename)
    if not torch.cuda.is_available() and "flash_attn" in imports:
        imports.remove("flash_attn")
    return imports

......

with patch("transformers.dynamic_module_utils.get_imports",...
```
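For completeness, here is a minimal, self-contained sketch of how that patch is typically wrapped around the model load; the model id, dtype, and trust_remote_code arguments below are assumptions for illustration, not copied from the PR:

```python
import os
from typing import Union
from unittest.mock import patch

import torch
from transformers import AutoModel, AutoTokenizer
from transformers.dynamic_module_utils import get_imports


# Drop flash_attn from the dynamically resolved imports when no CUDA device
# is available, so the remote modeling code can load on CPU/MPS machines.
def fixed_get_imports(filename: Union[str, os.PathLike]) -> list[str]:
    imports = get_imports(filename)
    if not torch.cuda.is_available() and "flash_attn" in imports:
        imports.remove("flash_attn")
    return imports


# Override get_imports only while the model is being constructed.
with patch("transformers.dynamic_module_utils.get_imports", fixed_get_imports):
    model = AutoModel.from_pretrained(
        "openbmb/MiniCPM-V-2_6",  # assumed model id
        trust_remote_code=True,
        torch_dtype=torch.float16,
    )
    tokenizer = AutoTokenizer.from_pretrained(
        "openbmb/MiniCPM-V-2_6", trust_remote_code=True
    )
```

Because patch is used as a context manager, get_imports is only overridden while from_pretrained resolves the remote code; everything else keeps the stock behavior.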

Do a global search for that file and copy its folder path, then add the following to llm/llama.cpp/CMakeLists.txt:

```
# Include directories
include_directories(/dir_path)
```

With that change, go generate ./... runs without problems for me, but the next step, go build ., fails.

macOS version: 15.0 Beta (24A5279h) || 15.1 Beta (24B5009l); git version: 58a14c37

1. The go generate ./... problem is solved by including the corresponding directory in CMakeLists.txt.
2. The go build . error is fixed by export CGO_LDFLAGS="-framework Accelerate".
3. Finally, when running the ollama build I get: llama_get_logits_ith: invalid logits id 10, reason: no logits ![image](https://github.com/user-attachments/assets/801607eb-5855-489f-9fd4-0d284a477a66) #432

> Still not working. After adding the include for the corresponding directory to CMakeLists.txt, running go generate ./... removes the include line I just added from CMakeLists.txt, and the same error comes back.

That's odd. Try the web demo instead; ollama ends up reporting invalid logits id 10, reason: no logits either way.