
Bug: Release build on Windows stuck

Open thewh1teagle opened this issue 5 months ago • 1 comment

What happened?

When building llama.cpp in Release mode, the build gets stuck. With the Debug config it compiles quickly.

Name and Version

1d1ccce67613674c75c9c7e3fa4c1e24e428ba48

What operating system are you seeing the problem on?

Windows

Relevant log output

Build

cmake -B build . -DCMAKE_BUILD_TYPE=Release
-- Building for: Visual Studio 17 2022
-- Selecting Windows SDK version 10.0.22000.0 to target Windows 10.0.22631.
-- The C compiler identification is MSVC 19.40.33812.0
-- The CXX compiler identification is MSVC 19.40.33812.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: C:/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.40.33807/bin/Hostx64/x64/cl.exe - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: C:/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.40.33807/bin/Hostx64/x64/cl.exe - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found Git: C:/Program Files/Git/cmd/git.exe (found version "2.45.2.windows.1")
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
-- Looking for pthread_create in pthreads
-- Looking for pthread_create in pthreads - not found
-- Looking for pthread_create in pthread
-- Looking for pthread_create in pthread - not found
-- Found Threads: TRUE
-- Found OpenMP_C: -openmp (found version "2.0")
-- Found OpenMP_CXX: -openmp (found version "2.0")
-- Found OpenMP: TRUE (found version "2.0")
-- OpenMP found
-- Using llamafile
-- Warning: ccache not found - consider installing it for faster compilation or disable this warning with GGML_CCACHE=OFF
-- CMAKE_SYSTEM_PROCESSOR: AMD64
-- CMAKE_GENERATOR_PLATFORM:
-- x86 detected
-- Performing Test HAS_AVX_1
-- Performing Test HAS_AVX_1 - Success
-- Performing Test HAS_AVX2_1
-- Performing Test HAS_AVX2_1 - Success
-- Performing Test HAS_FMA_1
-- Performing Test HAS_FMA_1 - Success
-- Performing Test HAS_AVX512_1
-- Performing Test HAS_AVX512_1 - Failed
-- Performing Test HAS_AVX512_2
-- Performing Test HAS_AVX512_2 - Failed
-- Configuring done (11.7s)
-- Generating done (0.9s)
-- Build files have been written to: D:/llama/llama.cpp/build
cmake --build build --config Release
MSBuild version 17.10.4+10fbfbf2e for .NET Framework

  build_info.vcxproj -> D:\llama\llama.cpp\build\common\build_info.dir\Release\build_info.lib
  Auto build dll exports
  ggml.vcxproj -> D:\llama\llama.cpp\build\bin\Release\ggml.dll
  llama.cpp
  llama-vocab.cpp
D:\llama\llama.cpp\src\llama-vocab.cpp(138,26): warning C4244: 'return': conversion from 'long' to 'uint8_t', possible loss of data [D:\llama\llama.cpp\build\src\llama.vcxproj]
D:\llama\llama.cpp\src\llama-vocab.cpp(211,35): warning C4267: 'argument': conversion from 'size_t' to 'int', possible loss of data [D:\llama\llama.cpp\build\src\llama.vcxproj]
D:\llama\llama.cpp\src\llama-vocab.cpp(211,30): warning C4267: 'argument': conversion from 'size_t' to 'int', possible loss of data [D:\llama\llama.cpp\build\src\llama.vcxproj]
D:\llama\llama.cpp\src\llama-vocab.cpp(533,39): warning C4267: 'argument': conversion from 'size_t' to 'int', possible loss of data [D:\llama\llama.cpp\build\src\llama.vcxproj]
D:\llama\llama.cpp\src\llama-vocab.cpp(533,34): warning C4267: 'argument': conversion from 'size_t' to 'int', possible loss of data [D:\llama\llama.cpp\build\src\llama.vcxproj]
D:\llama\llama.cpp\src\llama-vocab.cpp(572,82): warning C4267: '=': conversion from 'size_t' to 'llm_symbol::index', possible loss of data [D:\llama\llama.cpp\build\src\llama.vcxproj]
D:\llama\llama.cpp\src\llama-vocab.cpp(575,61): warning C4267: '=': conversion from 'size_t' to 'int', possible loss of data [D:\llama\llama.cpp\build\src\llama.vcxproj]
D:\llama\llama.cpp\src\llama-vocab.cpp(669,37): warning C4267: 'initializing': conversion from 'size_t' to 'int', possible loss of data [D:\llama\llama.cpp\build\src\llama.vcxproj]
D:\llama\llama.cpp\src\llama-vocab.cpp(669,25): warning C4267: 'initializing': conversion from 'size_t' to 'const int', possible loss of data [D:\llama\llama.cpp\build\src\llama.vcxproj]
D:\llama\llama.cpp\src\llama-vocab.cpp(1532,20): warning C4267: 'return': conversion from 'size_t' to 'int32_t', possible loss of data [D:\llama\llama.cpp\build\src\llama.vcxproj]
  llama-grammar.cpp
  llama-sampling.cpp
D:\llama\llama.cpp\src\llama-sampling.cpp(26,20): warning C4244: '=': conversion from 'time_t' to 'uint32_t', possible loss of data [D:\llama\llama.cpp\build\src\llama.vcxproj]
D:\llama\llama.cpp\src\llama-sampling.cpp(70,23): warning C4267: '=': conversion from 'size_t' to 'int32_t', possible loss of data [D:\llama\llama.cpp\build\src\llama.vcxproj]
D:\llama\llama.cpp\src\llama-sampling.cpp(405,33): warning C4244: '=': conversion from 'double' to 'float', possible loss of data [D:\llama\llama.cpp\build\src\llama.vcxproj]
D:\llama\llama.cpp\src\llama-sampling.cpp(409,34): warning C4244: '/=': conversion from 'double' to 'float', possible loss of data [D:\llama\llama.cpp\build\src\llama.vcxproj]
D:\llama\llama.cpp\src\llama-sampling.cpp(510,34): warning C4244: 'initializing': conversion from 'float' to 'int32_t', possible loss of data [D:\llama\llama.cpp\build\src\llama.vcxproj]
D:\llama\llama.cpp\src\llama-sampling.cpp(510,27): warning C4244: 'initializing': conversion from 'float' to 'const int32_t', possible loss of data [D:\llama\llama.cpp\build\src\llama.vcxproj]
D:\llama\llama.cpp\src\llama-sampling.cpp(530,61): warning C4244: 'argument': conversion from 'const int32_t' to 'float', possible loss of data [D:\llama\llama.cpp\build\src\llama.vcxproj]
  unicode.cpp
  unicode-data.cpp
  Generating Code...
  (hangs here forever)

Update

Using the following commands:

cmake -B build . -DCMAKE_BUILD_TYPE=Release -DCMAKE_CXX_FLAGS="/Od"
cmake --build build --config Release --target llama-cli

fixed the issue; the build completed in 116 seconds.
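Note that `/Od` disables MSVC optimization entirely, so the resulting "Release" binary will run noticeably slower than a normal optimized build. If the hang is caused by the optimizer on these translation units, a less drastic variant (untested suggestion, not from the original report) might be to lower the optimization level rather than disable it:

```shell
# Untested suggestion: /O1 (optimize for size) instead of /Od often
# avoids pathological optimizer behavior on huge translation units
# while keeping most of the Release performance.
cmake -B build . -DCMAKE_BUILD_TYPE=Release -DCMAKE_CXX_FLAGS="/O1"
cmake --build build --config Release --target llama-cli
```

Also worth knowing: with the Visual Studio generator (a multi-config generator), `-DCMAKE_BUILD_TYPE=Release` is ignored at configure time; the `--config Release` flag at build time is what actually selects the configuration.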

thewh1teagle, Aug 29 '24 16:08