Rickey Bowers Jr.

Results: 53 comments by Rickey Bowers Jr.

I was able to build with clang (from VS2022 prompt), without any changes:
```bash
clang -march=native -O3 -fuse-ld=lld-link -flto main.cpp ggml.c utils.cpp
clang -march=native -O3 -fuse-ld=lld-link -flto quantize.cpp ggml.c utils.cpp
...
```

Assuming you are at a VS2022 command prompt and you've installed `git`/`cmake` support through the VS Installer:
```cmd
set PATH=%DevEnvDir%CommonExtensions\Microsoft\TeamFoundation\Team Explorer\Git\cmd;%PATH%
git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp
mkdir build
cd build
...
```
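For reference, the configure-and-build step that typically follows looks roughly like this (a minimal sketch assuming the default Visual Studio generator and a Release build, not necessarily the exact commands from the truncated comment above):
```cmd
cmake ..
cmake --build . --config Release
```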

@eous Nice work, not a single crash since the patch.

On Windows I just use CMD files: I can drop a prompt file on the CMD script, or use the terminal. Here is an example:
```cmd
@REM explicitly state...
```
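As a rough illustration of the drag-and-drop idea, here is a minimal sketch of such a wrapper; the model path, output length, and use of the `-f` prompt-file flag are assumptions for illustration, not the author's actual script:
```cmd
@ECHO OFF
@REM %~1 is the path of the prompt file dropped onto this script
SET MODEL=models\7B\ggml-model-q4_0.bin
main.exe -m %MODEL% -f "%~1" -n 256
PAUSE
```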

I made a CMD script, but Python is more sensible considering it's already a requirement.
```cmd
@(
SETLOCAL EnableDelayedExpansion
ECHO ---------------------------------------------------------------
ECHO convert and quantize facebook LLaMA models for use...
```
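For context, the convert-and-quantize steps such a script wraps look roughly like the following (a sketch assuming the early `convert-pth-to-ggml.py` workflow and the 7B model directory; exact paths and arguments may differ):
```cmd
@REM convert the PyTorch checkpoint to ggml FP16, then quantize to 4-bit q4_0
python convert-pth-to-ggml.py models\7B\ 1
quantize.exe models\7B\ggml-model-f16.bin models\7B\ggml-model-q4_0.bin 2
```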

#174 also asked this, or do you have something else in mind?

I can't find it now, but @ggerganov said save/restore of the k&v tensors would preserve the state, iirc. https://github.com/ggerganov/llama.cpp/blob/721311070e31464ac12bef9a4444093eb3eaebf7/main.cpp#L79-L82
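A minimal C++ sketch of what such a save/restore could look like, assuming the model exposes `memory_k`/`memory_v` ggml tensors as in `main.cpp` at the linked commit; the struct and helper names here are illustrative, not an existing API:
```cpp
#include <cstdint>
#include <cstring>
#include <vector>
#include "ggml.h"

// Hypothetical snapshot of the attention cache (k & v tensors plus the
// current position); assumes memory_k/memory_v fields as in main.cpp.
struct kv_snapshot {
    std::vector<uint8_t> k, v;
    int n_past = 0;
};

static kv_snapshot save_state(const ggml_tensor * memory_k,
                              const ggml_tensor * memory_v, int n_past) {
    kv_snapshot s;
    s.k.resize(ggml_nbytes(memory_k));
    s.v.resize(ggml_nbytes(memory_v));
    std::memcpy(s.k.data(), memory_k->data, s.k.size());
    std::memcpy(s.v.data(), memory_v->data, s.v.size());
    s.n_past = n_past;
    return s;
}

static void restore_state(ggml_tensor * memory_k, ggml_tensor * memory_v,
                          int & n_past, const kv_snapshot & s) {
    std::memcpy(memory_k->data, s.k.data(), s.k.size());
    std::memcpy(memory_v->data, s.v.data(), s.v.size());
    n_past = s.n_past;
}
```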

Presently, this is the only way to get `sentencepiece` working on Windows with Python 3.11 without building from source - which involves many more hurdles.

Some games hide the problem by making other displays black (No Man's Sky, for example) - it is indeed an industry-wide problem to universally support varied displays in an...

Here is a beautiful cheatsheet ... https://www.officedaytime.com/simd512e/ (Search "pop" and click through links for detailed info!)
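For example, the scalar POPCNT entry that search turns up can be used directly through the `_mm_popcnt_u64` intrinsic; a small C++ sketch, assuming an x86-64 CPU with POPCNT (compile with `-mpopcnt`, `-msse4.2`, or `-march=native`):
```cpp
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <immintrin.h>  // _mm_popcnt_u64

// Count the set bits in a buffer with the hardware POPCNT instruction.
// AVX-512 adds vector variants such as VPOPCNTDQ for wider data.
static uint64_t popcount_buffer(const uint64_t * data, size_t n) {
    uint64_t total = 0;
    for (size_t i = 0; i < n; ++i) {
        total += (uint64_t) _mm_popcnt_u64(data[i]);
    }
    return total;
}

int main() {
    uint64_t buf[4] = { 0xF0F0F0F0F0F0F0F0ull, 1, 2, 3 };
    printf("set bits: %llu\n", (unsigned long long) popcount_buffer(buf, 4));
    return 0;
}
```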