How can I install on Windows 11?
I tried to install BitNet on Windows 11 with a 13th-gen Intel i5 and got this log:
build_info.vcxproj -> C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\common\build_info.dir\Release\build_info.lib
In file included from C:\Users\User\Desktop\BitNet\src\ggml-bitnet-lut.cpp:10:
C:\Users\User\Desktop\BitNet\3rdparty\llama.cpp\ggml\src\..\..\..\..\include\bitnet-lut-kernels.h(1127,54): warning : 'backend' is deprecated: use the buffer type to find the storage location of the tensor [-Wdeprecated-declarations] [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\ggml\src\ggml.vcxproj]
C:\Users\User\Desktop\BitNet\3rdparty\llama.cpp\ggml\src\..\include\ggml.h(585,9): message : 'backend' has been explicitly marked deprecated here [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\ggml\src\ggml.vcxproj]
C:\Users\User\Desktop\BitNet\3rdparty\llama.cpp\ggml\src\..\include\ggml.h(194,52): message : expanded from macro 'GGML_DEPRECATED' [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\ggml\src\ggml.vcxproj]
C:\Users\User\Desktop\BitNet\src\ggml-bitnet-lut.cpp(137,15): warning : 'backend' is deprecated: use the buffer type to find the storage location of the tensor [-Wdeprecated-declarations] [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\ggml\src\ggml.vcxproj]
C:\Users\User\Desktop\BitNet\3rdparty\llama.cpp\ggml\src\..\include\ggml.h(585,9): message : 'backend' has been explicitly marked deprecated here [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\ggml\src\ggml.vcxproj]
C:\Users\User\Desktop\BitNet\3rdparty\llama.cpp\ggml\src\..\include\ggml.h(194,52): message : expanded from macro 'GGML_DEPRECATED' [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\ggml\src\ggml.vcxproj]
In file included from C:\Users\User\Desktop\BitNet\src\ggml-bitnet-lut.cpp:10:
C:\Users\User\Desktop\BitNet\3rdparty\llama.cpp\ggml\src\..\..\..\..\include\bitnet-lut-kernels.h(279,19): warning : unused variable 'vec_sign_mask' [-Wunused-variable] [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\ggml\src\ggml.vcxproj]
C:\Users\User\Desktop\BitNet\3rdparty\llama.cpp\ggml\src\..\..\..\..\include\bitnet-lut-kernels.h(1046,17): message : in instantiation of function template specialization 'three_qgemm_lut_3200_8640<1>' requested here [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\ggml\src\ggml.vcxproj]
C:\Users\User\Desktop\BitNet\3rdparty\llama.cpp\ggml\src\..\..\..\..\include\bitnet-lut-kernels.h(280,19): warning : unused variable 'vec_zero' [-Wunused-variable] [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\ggml\src\ggml.vcxproj]
C:\Users\User\Desktop\BitNet\3rdparty\llama.cpp\ggml\src\..\..\..\..\include\bitnet-lut-kernels.h(281,19): warning : unused variable 'vec_one' [-Wunused-variable] [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\ggml\src\ggml.vcxproj]
C:\Users\User\Desktop\BitNet\3rdparty\llama.cpp\ggml\src\..\..\..\..\include\bitnet-lut-kernels.h(523,19): warning : unused variable 'vec_sign_mask' [-Wunused-variable] [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\ggml\src\ggml.vcxproj]
C:\Users\User\Desktop\BitNet\3rdparty\llama.cpp\ggml\src\..\..\..\..\include\bitnet-lut-kernels.h(1078,17): message : in instantiation of function template specialization 'three_qgemm_lut_3200_3200<1>' requested here [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\ggml\src\ggml.vcxproj]
C:\Users\User\Desktop\BitNet\3rdparty\llama.cpp\ggml\src\..\..\..\..\include\bitnet-lut-kernels.h(524,19): warning : unused variable 'vec_zero' [-Wunused-variable] [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\ggml\src\ggml.vcxproj]
C:\Users\User\Desktop\BitNet\3rdparty\llama.cpp\ggml\src\..\..\..\..\include\bitnet-lut-kernels.h(525,19): warning : unused variable 'vec_one' [-Wunused-variable] [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\ggml\src\ggml.vcxproj]
C:\Users\User\Desktop\BitNet\3rdparty\llama.cpp\ggml\src\..\..\..\..\include\bitnet-lut-kernels.h(767,19): warning : unused variable 'vec_sign_mask' [-Wunused-variable] [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\ggml\src\ggml.vcxproj]
C:\Users\User\Desktop\BitNet\3rdparty\llama.cpp\ggml\src\..\..\..\..\include\bitnet-lut-kernels.h(1110,17): message : in instantiation of function template specialization 'three_qgemm_lut_8640_3200<1>' requested here [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\ggml\src\ggml.vcxproj]
C:\Users\User\Desktop\BitNet\3rdparty\llama.cpp\ggml\src\..\..\..\..\include\bitnet-lut-kernels.h(768,19): warning : unused variable 'vec_zero' [-Wunused-variable] [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\ggml\src\ggml.vcxproj]
C:\Users\User\Desktop\BitNet\3rdparty\llama.cpp\ggml\src\..\..\..\..\include\bitnet-lut-kernels.h(769,19): warning : unused variable 'vec_one' [-Wunused-variable] [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\ggml\src\ggml.vcxproj]
C:\Users\User\Desktop\BitNet\3rdparty\llama.cpp\ggml\src\..\..\..\..\include\bitnet-lut-kernels.h(19,13): warning : unused function 'aligned_free' [-Wunused-function] [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\ggml\src\ggml.vcxproj]
C:\Users\User\Desktop\BitNet\3rdparty\llama.cpp\ggml\src\..\..\..\..\include\bitnet-lut-kernels.h(276,13): warning : loop not unrolled: the optimizer was unable to perform the requested transformation; the transformation might be disabled or specified as part of an unsupported transformation ordering [-Wpass-failed=transform-warning] [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\ggml\src\ggml.vcxproj]
C:\Users\User\Desktop\BitNet\3rdparty\llama.cpp\ggml\src\..\..\..\..\include\bitnet-lut-kernels.h(657,16): warning : loop not unrolled: the optimizer was unable to perform the requested transformation; the transformation might be disabled or specified as part of an unsupported transformation ordering [-Wpass-failed=transform-warning] [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\ggml\src\ggml.vcxproj]
C:\Users\User\Desktop\BitNet\3rdparty\llama.cpp\ggml\src\..\..\..\..\include\bitnet-lut-kernels.h(520,13): warning : loop not unrolled: the optimizer was unable to perform the requested transformation; the transformation might be disabled or specified as part of an unsupported transformation ordering [-Wpass-failed=transform-warning] [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\ggml\src\ggml.vcxproj]
C:\Users\User\Desktop\BitNet\3rdparty\llama.cpp\ggml\src\..\..\..\..\include\bitnet-lut-kernels.h(901,16): warning : loop not unrolled: the optimizer was unable to perform the requested transformation; the transformation might be disabled or specified as part of an unsupported transformation ordering [-Wpass-failed=transform-warning] [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\ggml\src\ggml.vcxproj]
C:\Users\User\Desktop\BitNet\3rdparty\llama.cpp\ggml\src\..\..\..\..\include\bitnet-lut-kernels.h(764,13): warning : loop not unrolled: the optimizer was unable to perform the requested transformation; the transformation might be disabled or specified as part of an unsupported transformation ordering [-Wpass-failed=transform-warning] [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\ggml\src\ggml.vcxproj]
Auto build dll exports
ggml.vcxproj -> C:\Users\User\Desktop\BitNet\build\bin\Release\ggml.dll
Auto build dll exports
llama.vcxproj -> C:\Users\User\Desktop\BitNet\build\bin\Release\llama.dll
llava.vcxproj -> C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\examples\llava\llava.dir\Release\llava.lib
sha1.vcxproj -> C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\examples\gguf-hash\sha1.dir\Release\sha1.lib
sha256.vcxproj -> C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\examples\gguf-hash\sha256.dir\Release\sha256.lib
xxhash.vcxproj -> C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\examples\gguf-hash\xxhash.dir\Release\xxhash.lib
C:\Users\User\Desktop\BitNet\3rdparty\llama.cpp\common\common.cpp(445,32): error : no type named 'system_clock' in namespace 'std::chrono' [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\common\common.vcxproj]
C:\Users\User\Desktop\BitNet\3rdparty\llama.cpp\common\common.cpp(447,11): error : 'clock' is not a class, namespace, or enumeration [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\common\common.vcxproj]
C:\Program Files (x86)\Windows Kits\10\Include\10.0.22621.0\ucrt\time.h(144,26): message : 'clock' declared here [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\common\common.vcxproj]
C:\Users\User\Desktop\BitNet\3rdparty\llama.cpp\common\common.cpp(447,44): error : 'clock' is not a class, namespace, or enumeration [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\common\common.vcxproj]
C:\Program Files (x86)\Windows Kits\10\Include\10.0.22621.0\ucrt\time.h(144,26): message : 'clock' declared here [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\common\common.vcxproj]
C:\Users\User\Desktop\BitNet\3rdparty\llama.cpp\common\common.cpp(448,30): error : 'clock' is not a class, namespace, or enumeration [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\common\common.vcxproj]
C:\Program Files (x86)\Windows Kits\10\Include\10.0.22621.0\ucrt\time.h(144,26): message : 'clock' declared here [C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\common\common.vcxproj]
llama-gguf.vcxproj -> C:\Users\User\Desktop\BitNet\build\bin\Release\llama-gguf.exe
llama-gguf-hash.vcxproj -> C:\Users\User\Desktop\BitNet\build\bin\Release\llama-gguf-hash.exe
llama-quantize-stats.vcxproj -> C:\Users\User\Desktop\BitNet\build\bin\Release\llama-quantize-stats.exe
llama-simple.vcxproj -> C:\Users\User\Desktop\BitNet\build\bin\Release\llama-simple.exe
llava_shared.vcxproj -> C:\Users\User\Desktop\BitNet\build\bin\Release\llava_shared.dll
llava_static.vcxproj -> C:\Users\User\Desktop\BitNet\build\3rdparty\llama.cpp\examples\llava\Release\llava_static.lib
and
ERROR:root:Error occurred while running command: Command '['cmake', '--build', 'build', '--config', 'Release']' returned non-zero exit status 1., check details in logs\compile.log
I tried #180, but it failed. What else can I do?
I also got this message:
Traceback (most recent call last):
  File "C:\Users\User\desktop\bitnet\setup_env.py", line 232, in <module>
    main()
  File "C:\Users\User\desktop\bitnet\setup_env.py", line 209, in main
    compile()
  File "C:\Users\User\desktop\bitnet\setup_env.py", line 193, in compile
    cmake_exists = subprocess.run(["cmake", "--version"], capture_output=True)
  File "C:\Users\User\anaconda3\Lib\subprocess.py", line 548, in run
    with Popen(*popenargs, **kwargs) as process:
  File "C:\Users\User\anaconda3\Lib\subprocess.py", line 1026, in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,
  File "C:\Users\User\anaconda3\Lib\subprocess.py", line 1538, in _execute_child
    hp, ht, pid, tid = _winapi.CreateProcess(executable, args,
FileNotFoundError: [WinError 2] 지정된 파일을 찾을 수 없습니다 (The system cannot find the file specified)
Can you show me your compile.log?
Between lines 38 and 44, it looks like you're missing an include or there's a naming conflict. The code is trying to use std::chrono::system_clock, but it can't find it. Later errors suggest a conflict with the clock symbol from <time.h>.
You need to add #include <chrono> to your code. You may also need to update the following files: common.cpp, log.cpp, imatrix.cpp, and perplexity.cpp.
You also need to change the code in common.cpp:
using clock = std::chrono::system_clock;
const clock::time_point current_time = clock::now();
const time_t as_time_t = clock::to_time_t(current_time);
To this:
const std::chrono::system_clock::time_point current_time = std::chrono::system_clock::now();
const time_t as_time_t = std::chrono::system_clock::to_time_t(current_time);
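If it helps, here is a minimal, self-contained sketch of the same pattern (just an illustration, not the actual common.cpp code) with the #include <chrono> in place:

// Minimal illustration of the fix: include <chrono> and use fully qualified
// std::chrono::system_clock instead of a local `using clock = ...;` alias,
// which can collide with ::clock declared in <time.h>.
#include <chrono>  // std::chrono::system_clock
#include <cstdio>  // std::printf
#include <ctime>   // std::time_t (this header also declares ::clock)

int main() {
    const std::chrono::system_clock::time_point current_time = std::chrono::system_clock::now();
    const std::time_t as_time_t = std::chrono::system_clock::to_time_t(current_time);
    std::printf("seconds since epoch: %lld\n", static_cast<long long>(as_time_t));
    return 0;
}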
Also, remember to always use the Developer Command Prompt or PowerShell for Visual Studio 2022 when running the build commands.
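For example, after applying the changes above, you can re-run just the failed build step from the log (paths taken from your output; adjust them for your setup):

cd C:\Users\User\Desktop\BitNet
cmake --build build --config Release

If cmake itself is not found (the FileNotFoundError in your traceback), the VS 2022 Developer Command Prompt usually puts Visual Studio's bundled CMake on PATH when the C++ CMake tools component is installed; cmake --version is a quick way to check.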
Oh, thanks a lot!
Use WSL and clang to compile
FAQ (Frequently Asked Questions) 📌
Q1: The build dies with errors building llama.cpp due to issues with std::chrono in log.cpp?
A: This is an issue introduced in a recent version of llama.cpp. Please refer to the commit linked in https://github.com/abetlen/llama-cpp-python/issues/1942 to fix this issue.
In my environment, a new line set(CMAKE_CXX_STANDARD 20) was needed at line 20 of CMakeLists.txt, even though I followed the FAQ and added the three extra #include <chrono> lines.
VS 2022, Win 11
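For reference, the addition near the top of CMakeLists.txt would look something like this (a sketch; the exact line number and neighboring settings may differ in your checkout):

# Require a C++ standard recent enough for the std::chrono usage in common.cpp / log.cpp
set(CMAKE_CXX_STANDARD 20)
set(CMAKE_CXX_STANDARD_REQUIRED ON)  # optional companion setting, not mentioned above; added here as common practice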