
[ollama] - Go build fails with undefined reference to `GOMP_parallel'

[Open] goto-loop opened this issue 1 year ago • 9 comments

起始日期 | Start Date

No response

实现PR | Implementation PR

No response

相关Issues | Reference Issues

No response

摘要 | Summary

When following the instructions for running MiniCPM-V 2.6 with Ollama, the go build . step exits with the following error:

# github.com/ollama/ollama
/usr/lib/go-1.22/pkg/tool/linux_amd64/link: running gcc failed: exit status 1
/usr/bin/ld: /home/user/ollama/llm/build/linux/x86_64_static/libllama.a(ggml.c.o): in function `ggml_compute_forward_conv_transpose_1d_f16_f32':
ggml.c:(.text+0x2b31): undefined reference to `GOMP_barrier'
/usr/bin/ld: /home/user/ollama/llm/build/linux/x86_64_static/libllama.a(ggml.c.o): in function `ggml_compute_forward_conv_transpose_1d_f32':
ggml.c:(.text+0x3189): undefined reference to `GOMP_barrier'
/usr/bin/ld: /home/user/ollama/llm/build/linux/x86_64_static/libllama.a(ggml.c.o): in function `ggml_compute_forward_conv_transpose_2d':
ggml.c:(.text+0x36e9): undefined reference to `GOMP_barrier'
/usr/bin/ld: /home/user/ollama/llm/build/linux/x86_64_static/libllama.a(ggml.c.o): in function `ggml_compute_forward_out_prod_q_f32':
ggml.c:(.text+0x400b): undefined reference to `GOMP_barrier'
/usr/bin/ld: /home/user/ollama/llm/build/linux/x86_64_static/libllama.a(ggml.c.o): in function `ggml_compute_forward_flash_attn_back_f32':
ggml.c:(.text+0x5781): undefined reference to `GOMP_barrier'
/usr/bin/ld: /home/user/ollama/llm/build/linux/x86_64_static/libllama.a(ggml.c.o):ggml.c:(.text+0x771e): more undefined references to `GOMP_barrier' follow
/usr/bin/ld: /home/user/ollama/llm/build/linux/x86_64_static/libllama.a(ggml.c.o): in function `ggml_graph_compute._omp_fn.0':
ggml.c:(.text+0x3b87d): undefined reference to `GOMP_single_start'
/usr/bin/ld: ggml.c:(.text+0x3b886): undefined reference to `omp_get_num_threads'
/usr/bin/ld: ggml.c:(.text+0x3b894): undefined reference to `GOMP_barrier'
/usr/bin/ld: ggml.c:(.text+0x3b8a1): undefined reference to `omp_get_thread_num'
/usr/bin/ld: /home/user/ollama/llm/build/linux/x86_64_static/libllama.a(ggml.c.o): in function `ggml_graph_compute':
ggml.c:(.text+0x3ed3d): undefined reference to `GOMP_parallel'
collect2: error: ld returned 1 exit status

I'm running Ubuntu with the following versions:

  • cmake 3.28.3
  • go 1.22.2
  • gcc 13.2.0
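
For context, the GOMP_* and omp_* names are symbols from the OpenMP runtime (libgomp): ggml.c inside libllama.a was compiled with -fopenmp, but the final Go link step is apparently not given that flag. Below is a minimal cgo sketch (a hypothetical standalone repro.go, not part of ollama, assuming gcc with OpenMP support) that hits the same class of error when its LDFLAGS line is removed:

// repro.go (hypothetical standalone file, not part of ollama)
package main

/*
#cgo CFLAGS: -fopenmp
#cgo LDFLAGS: -fopenmp
// Removing the LDFLAGS line above reproduces link errors of the same form as
// the ollama failure: undefined references to GOMP_parallel, GOMP_barrier,
// GOMP_single_start and omp_get_num_threads.
#include <omp.h>

static int omp_threads(void) {
	int n = 0;
	#pragma omp parallel
	{
		#pragma omp single
		n = omp_get_num_threads();
	}
	return n;
}
*/
import "C"

import "fmt"

func main() {
	fmt.Println("OpenMP threads:", int(C.omp_threads()))
}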

基本示例 | Basic Example

缺陷 | Drawbacks

未解决问题 | Unresolved questions

No response

goto-loop avatar Aug 07 '24 11:08 goto-loop

Okay, I'll check it tomorrow

tc-mb avatar Aug 07 '24 13:08 tc-mb

I tried to build on a Mac following the instructions and got a similar error:

# github.com/ollama/ollama
/opt/homebrew/Cellar/go/1.22.6/libexec/pkg/tool/darwin_arm64/link: running cc failed: exit status 1
Undefined symbols for architecture arm64:
  "_cblas_sgemm", referenced from:
      ggml_backend_blas_graph_compute(ggml_backend*, ggml_cgraph*) in libllama.a9
      ggml_backend_blas_graph_compute(ggml_backend*, ggml_cgraph*) in libllama.a9
  "_openblas_set_num_threads", referenced from:
      ggml_backend_blas_graph_compute(ggml_backend*, ggml_cgraph*) in libllama.a9
ld: symbol(s) not found for architecture arm64
clang: error: linker command failed with exit code 1 (use -v to see invocation)

WangchenhaoHZ avatar Aug 08 '24 02:08 WangchenhaoHZ

I have this problem too. (screenshot attached)

catsled avatar Aug 08 '24 07:08 catsled

Docker environment: Ubuntu 22.04

This happened when building https://github.com/OpenBMB/ollama/tree/minicpm-v2.5/examples/minicpm-v2.5. I'm not sure whether some lib is missing. In a non-Docker environment it builds fine.

shishaoqi avatar Aug 08 '24 08:08 shishaoqi

How did you all end up solving this?

dingtine avatar Aug 08 '24 09:08 dingtine

"ggml.c:(.text+0x3f737): undefined reference to `GOMP_parallel'" 同样的问题+1, 在线等 !!

ppkliu avatar Aug 08 '24 09:08 ppkliu

I found out why it can't be built on Mac. Since Apple decided to provide its own BLAS, the third-party header files are no longer supported. If you want to build the binary, you need to upgrade to the macOS 15 preview. (screenshot attached)

WangchenhaoHZ avatar Aug 08 '24 10:08 WangchenhaoHZ

I added this to the cgo comment section of llm/llm.go: // #cgo LDFLAGS: -fopenmp, and the above error disappears. I hope it can serve as a reference.
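
For reference, a minimal sketch of where such a directive sits relative to the import "C" line; this is not the actual contents of ollama's llm/llm.go (the real file carries additional #cgo directives for linking libllama.a, omitted here):

package llm

/*
#cgo LDFLAGS: -fopenmp
// Passing -fopenmp to the final cgo link pulls in libgomp, which provides
// GOMP_parallel, GOMP_barrier, omp_get_num_threads and the other missing symbols.
*/
import "C"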

cpp-qn avatar Aug 09 '24 05:08 cpp-qn

This seems to fix the build for me, thanks for sharing the workaround @guoQiNing!

However, when I try to use the model now I run into the same issue as described in #393 😅

goto-loop avatar Aug 09 '24 07:08 goto-loop

It seems that this is caused by an environment problem. This is indeed much harder to debug than a code problem.

Thanks to @guoQiNing for the help.

This seems to be caused by a missing dependency in the Linux environment. The dependency is normally handled by the ollama code, and I am not sure whether it can simply be ignored.

  1. The dependency may not be installed in the Linux environment.
  2. It may also be installed, but not found in the environment when the build command is executed.

tc-mb avatar Aug 14 '24 10:08 tc-mb

Easy to resolve. (screenshot attached)

orangetooth avatar Aug 15 '24 09:08 orangetooth

Shut down the Ollama running on the Windows system.

Gutilence14 avatar Aug 22 '24 08:08 Gutilence14