Make running fastLLaMa on Windows simple!
Discussed in https://github.com/PotatoSpudowski/fastLLaMa/discussions/26
Originally posted by McRush, March 27, 2023:

Hello! I find this project really cool. fastLLaMa has separate functions for prompt ingestion and text generation, unlike other solutions, and the ability to save and load model states is also very nice. I would really like to try it. Unfortunately, I'm currently tied to a Windows workflow. I would consider using WSL or installing Linux, but a native Windows build would be preferable for now.
I'm not familiar with C/C++, so I just tried to generate Visual Studio files with CMake and then compile them (roughly the steps sketched below). No miracle happened and compilation failed. I suspect some preparation is necessary here, or maybe I shouldn't be using VS at all. If somebody is willing to provide any instructions, I would very much appreciate it.
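For context, this is roughly what I ran; the generator name and options are my own guesses, not anything from the repo's docs:

```powershell
# Rough sketch of my attempt (assumed commands; the generator version is a guess)
cd C:\Users\user\Desktop\Llama\fastLLaMa
cmake -S . -B build -G "Visual Studio 17 2022"   # generate Visual Studio solution files
cmake --build build --config Release             # this is where compilation fails for me
```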
I'm personally stuck at

```
ERR_PNPM_NO_PKG_MANIFEST No package.json found in C:\Users\user\Desktop\Llama\fastLLaMa
```

after running `pnpm install`.
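In case it helps to reproduce, this is the sequence that produces the error. As far as I understand, pnpm raises `ERR_PNPM_NO_PKG_MANIFEST` when the current directory has no `package.json`, so the command may need to be run from whichever subdirectory actually contains one, but I don't know where (or whether) one exists in this repo:

```powershell
# What I ran, from the repository root (the path is the one in the error message above)
cd C:\Users\user\Desktop\Llama\fastLLaMa
pnpm install   # fails with ERR_PNPM_NO_PKG_MANIFEST because no package.json is present here
```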