Bug: Issue building hipBLAS error: call to undeclared function '_mm256_dpbusd_epi32'
What happened?
Hi,
I'm trying to compile llama.cpp with the hipBLAS backend.
CPU: 12th Gen Intel(R) Core(TM) i9-12900K
GPU: AMD Radeon PRO W7800 (gfx1100)
OS: Windows 11 23H2
With AMD HIP SDK 6.1.2 for Windows installed: https://www.amd.com/en/developer/resources/rocm-hub/eula/licenses.html?filename=AMD-Software-PRO-Edition-24.Q3-Win10-Win11-For-HIP.exe
llama.cpp version: https://github.com/ggerganov/llama.cpp/releases/tag/b3828
When I run these commands:
set PATH=%HIP_PATH%\bin;%PATH%
cmake -S . -B build -G Ninja -DAMDGPU_TARGETS=gfx1100 -DGGML_HIPBLAS=ON -DCMAKE_C_COMPILER=clang -DCMAKE_CXX_COMPILER=clang++ -DCMAKE_BUILD_TYPE=Release
cmake --build build
I get the following error and the build cannot continue:
llama.cpp-b3828/ggml/src/ggml-quants.c:107:34: error: call to undeclared function '_mm256_dpbusd_epi32'; ISO C99 and later do not support implicit function declarations [-Wimplicit-function-declaration]
const __m256i summed_pairs = _mm256_dpbusd_epi32(zero, ax, sy);
^
llama.cpp-b3828/ggml/src/ggml-quants.c:107:19: error: initializing 'const __m256i' (vector of 4 'long long' values) with an expression of incompatible type 'int'
Can someone tell me what I'm doing wrong?
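In case it is useful, here is a small snippet that, as far as I can tell, isolates the failing call from ggml-quants.c:107; the file name repro.c and the function name are just placeholders I picked. Compiling it with the same clang and -march=native should show whether that toolchain's immintrin.h declares the intrinsic at all:

// repro.c - minimal extraction of the call that fails in ggml-quants.c:107
// build with the same compiler and flags as the failing object, e.g.:
//   clang -march=native -O2 -c repro.c
#include <immintrin.h>

__m256i repro_dpbusd(__m256i ax, __m256i sy) {
    const __m256i zero = _mm256_setzero_si256();
    // same intrinsic call as ggml-quants.c:107
    const __m256i summed_pairs = _mm256_dpbusd_epi32(zero, ax, sy);
    return summed_pairs;
}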
Name and Version
llama.cpp version: https://github.com/ggerganov/llama.cpp/releases/tag/b3828
What operating system are you seeing the problem on?
Windows
Relevant log output
[1/245] Building C object ggml/src/CMakeFiles/ggml.dir/ggml-quants.c.obj
FAILED: ggml/src/CMakeFiles/ggml.dir/ggml-quants.c.obj
ccache C:\PROGRA~1\AMD\ROCm\5.5\bin\clang.exe -DGGML_BUILD -DGGML_CUDA_DMMV_X=32 -DGGML_CUDA_MMV_Y=1 -DGGML_SCHED_MAX_COPIES=4 -DGGML_SHARED -DGGML_USE_CUDA -DGGML_USE_HIPBLAS -DGGML_USE_LLAMAFILE -DGGML_USE_OPENMP -DK_QUANTS_PER_ITERATION=2 -D_CRT_SECURE_NO_WARNINGS -D_XOPEN_SOURCE=600 -D__HIP_PLATFORM_AMD__=1 -D__HIP_PLATFORM_HCC__=1 -Dggml_EXPORTS -IC:/Users/owen/Desktop/Owen/llama.cpp-b3828/ggml/src/../include -IC:/Users/owen/Desktop/Owen/llama.cpp-b3828/ggml/src/. -isystem "C:/Program Files/AMD/ROCm/5.5/include" -O3 -DNDEBUG -D_DLL -D_MT -Xclang --dependent-lib=msvcrt -std=gnu11 -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wunreachable-code-break -Wunreachable-code-return -Wdouble-promotion -march=native -fopenmp=libomp -MD -MT ggml/src/CMakeFiles/ggml.dir/ggml-quants.c.obj -MF ggml\src\CMakeFiles\ggml.dir\ggml-quants.c.obj.d -o ggml/src/CMakeFiles/ggml.dir/ggml-quants.c.obj -c C:/Users/owen/Desktop/Owen/llama.cpp-b3828/ggml/src/ggml-quants.c
In file included from C:/Users/owen/Desktop/Owen/llama.cpp-b3828/ggml/src/ggml-quants.c:4:
In file included from C:/Users/owen/Desktop/Owen/llama.cpp-b3828/ggml/src/./ggml-quants.h:4:
C:/Users/owen/Desktop/Owen/llama.cpp-b3828/ggml/src/./ggml-common.h:62:9: warning: keyword is hidden by macro definition [-Wkeyword-macro]
#define static_assert(cond, msg) _Static_assert(cond, msg)
^
C:/Users/owen/Desktop/Owen/llama.cpp-b3828/ggml/src/ggml-quants.c:107:34: error: call to undeclared function '_mm256_dpbusd_epi32'; ISO C99 and later do not support implicit function declarations [-Wimplicit-function-declaration]
const __m256i summed_pairs = _mm256_dpbusd_epi32(zero, ax, sy);
^
C:/Users/owen/Desktop/Owen/llama.cpp-b3828/ggml/src/ggml-quants.c:107:19: error: initializing 'const __m256i' (vector of 4 'long long' values) with an expression of incompatible type 'int'
const __m256i summed_pairs = _mm256_dpbusd_epi32(zero, ax, sy);
^ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
1 warning and 2 errors generated.