MiniCPM-V
[ollama] - Go build fails with undefined reference to `GOMP_parallel'
Start Date
No response
Implementation PR
No response
Reference Issues
No response
Summary
When following the instructions for running MiniCPM-V 2.6 with Ollama, the `go build .` step exits with the following error:

```
# github.com/ollama/ollama
/usr/lib/go-1.22/pkg/tool/linux_amd64/link: running gcc failed: exit status 1
/usr/bin/ld: /home/user/ollama/llm/build/linux/x86_64_static/libllama.a(ggml.c.o): in function `ggml_compute_forward_conv_transpose_1d_f16_f32':
ggml.c:(.text+0x2b31): undefined reference to `GOMP_barrier'
/usr/bin/ld: /home/user/ollama/llm/build/linux/x86_64_static/libllama.a(ggml.c.o): in function `ggml_compute_forward_conv_transpose_1d_f32':
ggml.c:(.text+0x3189): undefined reference to `GOMP_barrier'
/usr/bin/ld: /home/user/ollama/llm/build/linux/x86_64_static/libllama.a(ggml.c.o): in function `ggml_compute_forward_conv_transpose_2d':
ggml.c:(.text+0x36e9): undefined reference to `GOMP_barrier'
/usr/bin/ld: /home/user/ollama/llm/build/linux/x86_64_static/libllama.a(ggml.c.o): in function `ggml_compute_forward_out_prod_q_f32':
ggml.c:(.text+0x400b): undefined reference to `GOMP_barrier'
/usr/bin/ld: /home/user/ollama/llm/build/linux/x86_64_static/libllama.a(ggml.c.o): in function `ggml_compute_forward_flash_attn_back_f32':
ggml.c:(.text+0x5781): undefined reference to `GOMP_barrier'
/usr/bin/ld: /home/user/ollama/llm/build/linux/x86_64_static/libllama.a(ggml.c.o):ggml.c:(.text+0x771e): more undefined references to `GOMP_barrier' follow
/usr/bin/ld: /home/user/ollama/llm/build/linux/x86_64_static/libllama.a(ggml.c.o): in function `ggml_graph_compute._omp_fn.0':
ggml.c:(.text+0x3b87d): undefined reference to `GOMP_single_start'
/usr/bin/ld: ggml.c:(.text+0x3b886): undefined reference to `omp_get_num_threads'
/usr/bin/ld: ggml.c:(.text+0x3b894): undefined reference to `GOMP_barrier'
/usr/bin/ld: ggml.c:(.text+0x3b8a1): undefined reference to `omp_get_thread_num'
/usr/bin/ld: /home/user/ollama/llm/build/linux/x86_64_static/libllama.a(ggml.c.o): in function `ggml_graph_compute':
ggml.c:(.text+0x3ed3d): undefined reference to `GOMP_parallel'
collect2: error: ld returned 1 exit status
```
I'm running Ubuntu with the following versions:
- cmake 3.28.3
- go 1.22.2
- gcc 13.2.0
Basic Example
Drawbacks
Unresolved questions
No response
Okay, I'll check it tomorrow
I tried to build on a Mac, following the instructions, and got a similar error:

```
# github.com/ollama/ollama
/opt/homebrew/Cellar/go/1.22.6/libexec/pkg/tool/darwin_arm64/link: running cc failed: exit status 1
Undefined symbols for architecture arm64:
  "_cblas_sgemm", referenced from:
      ggml_backend_blas_graph_compute(ggml_backend*, ggml_cgraph*) in libllama.a9
      ggml_backend_blas_graph_compute(ggml_backend*, ggml_cgraph*) in libllama.a9
  "_openblas_set_num_threads", referenced from:
      ggml_backend_blas_graph_compute(ggml_backend*, ggml_cgraph*) in libllama.a9
ld: symbol(s) not found for architecture arm64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
```
I have this problem too.
Docker environment: Ubuntu 22.04.
It happened while building https://github.com/OpenBMB/ollama/tree/minicpm-v2.5/examples/minicpm-v2.5. I'm not sure whether some lib is missing; the build works fine outside Docker.
How did you all solve this in the end?
"ggml.c:(.text+0x3f737): undefined reference to `GOMP_parallel'" — same problem here, +1. Waiting for a fix!
I found out why it cannot be built on a Mac. Since Apple decided to ship its own BLAS, the third-party header files are no longer supported. If you want to build the binary, you need to upgrade to the macOS 15 preview.
I added `// #cgo LDFLAGS: -fopenmp` to the cgo comment section of llm/llm.go, and the error above disappears. I hope it can serve as a reference.
This seems to fix the build for me, thanks for sharing the workaround @guoQiNing!
However, when I try to use the model now I run into the same issue as described in #393 😅
It seems that this is caused by an environment problem. This is indeed much harder to debug than a code problem.
Thanks to @guoQiNing for the help.
This seems to be caused by a missing environment dependency on Linux. The dependency is supposed to be covered by the ollama build itself; I am not sure whether it can simply be ignored.
- The library may not be present on the Linux system.
- Or it may be installed, but the linker cannot find it when the build command runs.
Easy to resolve: shut down the ollama running on the Windows system.