
'./quantize' is not recognized as the name of a cmdlet, function, script file, or operable program

Open taaalha opened this issue 1 year ago • 25 comments

[System.Console]::OutputEncoding=[System.Console]::InputEncoding=[System.Text.Encoding]::UTF8; ./quantize C:\Users\dalai\llama\models\7B\ggml-model-f16.bin C:\Users\dalai\llama\models\7B\ggml-model-q4_0.bin 2
./quantize : The term './quantize' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.

At line:1 char:96
+ ... sole]::InputEncoding=[System.Text.Encoding]::UTF8; ./quantize C:\User ...
+                                                        ~~~~~~~~~~
    + CategoryInfo          : ObjectNotFound: (./quantize:String) [], CommandNotFoundException
    + FullyQualifiedErrorId : CommandNotFoundException

What could be the solution to this?

I was trying to install the 7B model with npx dalai llama install 7B on Windows 10.

taaalha avatar Mar 23 '23 08:03 taaalha

same error here.

dennis-gonzales avatar Mar 23 '23 09:03 dennis-gonzales

In my case, there was an earlier error while running CMake in llama. For some reason it expected Visual Studio 15 2017 (and couldn't find it). So I cleared CMakeCache and CMakeFiles and manually ran cmake -G "Visual Studio 16 2019" .

Then I reran the npx command, but I still got stuck on ./quantize, this time with the error: Cannot create process, error code: 267

MarPan avatar Mar 23 '23 13:03 MarPan

Same problem here (Windows 10). I had no issues with installing alpaca though.

lszoszk avatar Mar 23 '23 13:03 lszoszk

Same problem here

changquan avatar Mar 23 '23 13:03 changquan

I just fixed this on Windows Server 2019 (it also works on Windows 11): I had to quantize the model manually.

In a command line opened as administrator, cd to C:\Users\YOURUSER\dalai\llama\build\bin\Release

then run ./quantize C:\Users\YOURUSER\dalai\llama\models\7B\ggml-model-f16.bin C:\Users\YOURUSER\dalai\llama\models\7B\ggml-model-q4_0.bin 2

This will quantize the llama model. I now have the model showing in the dropdown for both llama and alpaca on Windows.

Moralizing avatar Mar 23 '23 15:03 Moralizing

@Moralizing Thanks. This worked like a charm.

However, there's now another issue, #245: basically nothing happens when you give it a prompt.

taaalha avatar Mar 23 '23 19:03 taaalha

I had the same issue on both Windows 10 and 11 (different machines, both VS 2022). I think it's an environment variable issue, or it should have changed directories to where the binary is. But basically, it appears to have had trouble finding quantize.exe (I also believe "./quantize" is invalid syntax for Windows). I just searched for "quantize" in the "%userprofile%\dalai" directory, replaced ./quantize with the full path to the EXE, and everything worked.

aliasfoxkde avatar Mar 24 '23 04:03 aliasfoxkde

@taaalha this sounds like a bug in the install scripts, so I think the issue should remain open until it's resolved; just because there are workarounds doesn't mean there isn't a problem.

bradharms avatar Mar 24 '23 06:03 bradharms

> I had the same issue on both Windows 10 and 11 (different machines, both VS 2022). I think it's a environment variable issue or it should have changed directories to where the binary is. But basically, it appears to have had an issue finding quantize.exe (also I believe "./quantize" is invalid syntax for Windows). I just searched for "quantize" in the "%userprofile%\dalai" directory and replaced ./quantize with the full path to the EXE and everything worked.

I can find it only under "%userprofile%\dalai\llama\.devops\tools.sh". After removing the "./" before the quantize command, it is still invoked the same way, as "./quantize". (Also, why is this issue closed?)

m4r71n85 avatar Mar 24 '23 12:03 m4r71n85

same error here.

PS C:\dalai\llama\build\Release> [System.Console]::OutputEncoding=[System.Console]::InputEncoding=[System.Text.Encoding]::UTF8; ./quantize c:\dalai\llama\models\7B\ggml-model-f16.bin c:\dalai\llama\models\7B\ggml-model-q4_0.bin 2

./quantize : The term './quantize' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.

At line:1 char:96

  + ... sole]::InputEncoding=[System.Text.Encoding]::UTF8; ./quantize c:\dala ...
  +                                                        ~~~~~~~~~~

    + CategoryInfo          : ObjectNotFound: (./quantize:String) [], CommandNotFoundException
    + FullyQualifiedErrorId : CommandNotFoundException

mruthes avatar Mar 24 '23 16:03 mruthes

> I had the same issue on both Windows 10 and 11 (different machines, both VS 2022). I think it's a environment variable issue or it should have changed directories to where the binary is. But basically, it appears to have had an issue finding quantize.exe (also I believe "./quantize" is invalid syntax for Windows). I just searched for "quantize" in the "%userprofile%\dalai" directory and replaced ./quantize with the full path to the EXE and everything worked.
>
> I find it only under "%userprofile%\dalai\llama.devops\tools.sh". After removing the "./" before quantize command - it is still called the same way "./quantized" (also why is this issue closed?)

I tried just 'quantize' and it couldn't find the binary, but providing the full path worked. Now I'm having a new (related) issue with the WebUI. With "Debug" enabled, it appears that the backslashes in the Windows paths are being removed when PowerShell is called. I looked at the code but can't track down where this is happening; the regex and whatnot seem fine. I've had this issue on two machines, and I'm going to check whether it happens on a friend's PC.

I might just move everything to Linux, which is fine and probably for the best.

aliasfoxkde avatar Mar 24 '23 17:03 aliasfoxkde

> I had the same issue on both Windows 10 and 11 (different machines, both VS 2022). I think it's a environment variable issue or it should have changed directories to where the binary is. But basically, it appears to have had an issue finding quantize.exe (also I believe "./quantize" is invalid syntax for Windows). I just searched for "quantize" in the "%userprofile%\dalai" directory and replaced ./quantize with the full path to the EXE and everything worked.

"./command" is actually how PowerShell prefers commands; it will not run them otherwise.

RzNmKX avatar Mar 24 '23 19:03 RzNmKX

Navigate to the 'bin\Release' folder and then either open the terminal there or copy the necessary files to the 'Release' folder.

RaposaRale avatar Mar 24 '23 19:03 RaposaRale

> Navigate to the 'bin/releases' folder and then either open the terminal there or copy the necessary files to the 'release' folder

This will fix this issue

RIAZAHAMMED avatar Mar 25 '23 06:03 RIAZAHAMMED

Had this problem with llama; alpaca worked fine for me and I was able to play with it. I used the docker route. I tried the non-docker way first, but it uses the C drive, which had no free space left (I learned about the --home path later). The docker way for llama fails while converting to ggml. I searched here and found the --home /path option for the non-docker way. With that, it successfully converted to ggml but failed when calling quantize: no quantize command and no /bin/releases directory. So I went into the docker container, found quantize there, ran it from inside the container, and then copied the resulting file to /models/llama/models/7B.

chrismark avatar Mar 27 '23 11:03 chrismark

It's a bug in the install script.

exec: ./quantize C:\Users\XXX\dalai\llama\models\7B\ggml-model-f16.bin C:\Users\XXX\dalai\llama\models\7B\ggml-model-q4_0.bin 2 in C:\Users\XXX\dalai\llama\build\Release

The install script is running the command from C:\Users\XXX\dalai\llama\build\Release, but quantize.exe was built in C:\Users\XXX\dalai\llama\build\bin\Release (Microsoft Visual Studio 2022).

I opened a powershell in C:\Users\XXX\dalai and ran the command

.\llama\build\bin\Release\quantize.exe C:\Users\XXX\dalai\llama\models\7B\ggml-model-f16.bin C:\Users\XXX\dalai\llama\models\7B\ggml-model-q4_0.bin 2

It worked

dgasparri avatar Mar 27 '23 11:03 dgasparri

For me, what solved the problem was running it via cmd. The install script uses the expression ./quantize to launch the quantize.exe application, but inside cmd you should just type the name of the executable to run it. So removing ./ and typing only the app name worked successfully in cmd: quantize C:\dalai\llama\models\7B\ggml-model-f16.bin C:\dalai\llama\models\7B\ggml-model-q4_0.bin 2

FoxPopBR avatar Apr 05 '23 18:04 FoxPopBR

right after the build process, and while the model is being downloaded, do these two steps:

  1. copy all files from .\llama\build\bin\Release\ to .\llama\build\Release\
  2. copy .\llama\build\Release\llama.exe to .\llama\build\Release\main.exe

The reason is that some versions of VC++ place the build output in .\llama\build\bin\Release\ instead of .\llama\build\Release\

Once the model is downloaded, quantize.exe and the subsequent steps will run without issues.

Finally: npx dalai serve --home . (runs on localhost:3000)

akjoshi avatar Apr 09 '23 16:04 akjoshi

Can't quantize the ggml-model-q4_0.bin file for the 13B version:

llama_model_quantize: loading model from 'ggml-model-q4_0.bin'
llama_model_quantize: failed to open 'ggml-model-q4_0.bin' for reading
main: failed to quantize model from 'ggml-model-q4_0.bin'

But the others, ggml-model-f16.bin and ggml-model-f16.bin.1, work when quantized.

I'm trying to quantize the models in this folder: C:\Users\MYUSER\dalai\llama\build\Release

Mathuzala avatar Apr 26 '23 17:04 Mathuzala

In my case I have Visual Studio 2022 Professional installed.

There is an issue to correct in the vcxproj generation. The output folder for binaries (quantize.exe, llama.exe, ...) has a 'bin' directory added, which is not expected by the dalai installation script:

[screenshot: build output folder showing the extra 'bin' directory]

Anthony-Breneliere avatar May 04 '23 15:05 Anthony-Breneliere

I also have VS 2022, but I don't have quantize.

anom35 avatar May 04 '23 20:05 anom35

To fix the issue, in CMakeLists.txt:

set(CMAKE_RUNTIME_OUTPUT_DIRECTORY ${CMAKE_BINARY_DIR}/bin) => set(CMAKE_RUNTIME_OUTPUT_DIRECTORY ${CMAKE_BINARY_DIR})

Anthony-Breneliere avatar May 05 '23 01:05 Anthony-Breneliere

I found a very simple way using just the file manager: once you have launched npx ...... and it is downloading 7B, copy the files that are in C:\Users\Utilisateur\dalai\llama\build\bin\Release and paste them into C:\Users\Utilisateur\dalai\llama\build\Release, and the rest of the installation will continue without any problem ;)

anom35 avatar May 05 '23 03:05 anom35

After the quantization runs, the model still is not quantized. This is the output:

llama_model_quantize: loading model from 'F:\Dalai\llama\models\7B\ggml-model-f16.bin'
llama_model_quantize: n_vocab = 32000
llama_model_quantize: n_ctx = 512
llama_model_quantize: n_embd = 4096
llama_model_quantize: n_mult = 256
llama_model_quantize: n_head = 32
llama_model_quantize: n_layer = 32
llama_model_quantize: f16 = 1

Then it finishes without quantizing; only an empty ggml-model-q4_0.bin of 0 KB exists.

syedmaaz9905 avatar May 20 '23 00:05 syedmaaz9905

> It's a bug in the install script.
>
> exec: ./quantize C:\Users\XXX\dalai\llama\models\7B\ggml-model-f16.bin C:\Users\XXX\dalai\llama\models\7B\ggml-model-q4_0.bin 2 in C:\Users\XXX\dalai\llama\build\Release
>
> The install script is running the command from C:\Users\XXX\dalai\llama\build\Release, but the quantize.exe was built in C:\Users\XXX\dalai\llama\build\bin\Release (Microsoft Visual Studio 2022)
>
> I opened a powershell in C:\Users\XXX\dalai and ran the command
>
> .\llama\build\bin\Release\quantize.exe C:\Users\XXX\dalai\llama\models\7B\ggml-model-f16.bin C:\Users\XXX\dalai\llama\models\7B\ggml-model-q4_0.bin 2
>
> It worked

Found the bug and fixed it on my end on a Windows machine: line 120 of node_modules\dalai\llama.js.

Change

const bin_path = platform === "win32" ? path.resolve(this.home, "build", "Release") : this.home

to

const bin_path = platform === "win32" ? path.resolve(this.home, "build", "bin", "Release") : this.home

LizsDing avatar Aug 28 '23 21:08 LizsDing