Ravindra Marella

63 comments by Ravindra Marella

You can update `requests` as well: ```sh pip install requests==2.31.0 ```

I [fixed](https://github.com/marella/chatdocs/commit/68250c8434f9334cde090a860813658240242882) this error in the repo but haven't released it yet. Running the following should fix it: ```sh pip install pydantic==1.10.12 ```

I created the following npm packages, which are automatically updated and published using GitHub Actions, so they will always have the latest icons. # [material-icons](https://github.com/marella/material-icons#readme) [![npm](https://img.shields.io/npm/v/material-icons)](https://www.npmjs.com/package/material-icons) [![install size](https://packagephobia.com/badge?p=material-icons)](https://packagephobia.com/result?p=material-icons) [![Downloads](https://img.shields.io/npm/dm/material-icons)](https://www.npmjs.com/package/material-icons) -...

I'm guessing this is related to https://github.com/PanQiWei/AutoGPTQ/issues/115#issuecomment-1581121864 where the `autogptq_cuda` directory isn't being uploaded to PyPI.

I'm getting the same error on the Google Colab GPU runtime. When installing with the verbose option (`pip install auto-gptq -v`), it reports that the `autogptq_cuda/autogptq_cuda.cpp` file is missing: ``` x86_64-linux-gnu-gcc: error: autogptq_cuda/autogptq_cuda.cpp: No such...
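One way to verify that a source file really is absent from the published sdist is to download it without installing (e.g. `pip download auto-gptq --no-deps --no-binary :all:`) and inspect the tarball. The helper below is a minimal sketch of that check, not from the original comments; the function name and paths are illustrative:

```python
import tarfile


def missing_from_sdist(sdist_path: str, needle: str) -> bool:
    """Return True if no member of the sdist tarball contains `needle` in its path.

    `sdist_path` is a downloaded .tar.gz source distribution, e.g. one fetched
    with `pip download <pkg> --no-deps --no-binary :all:`.
    """
    with tarfile.open(sdist_path, "r:gz") as tar:
        return not any(needle in name for name in tar.getnames())
```

If this returns `True` for `"autogptq_cuda"`, the directory was never packaged, which would explain the compiler error above regardless of the local environment.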

You can install with the CUDA extension using: ```sh pip install git+https://github.com/PanQiWei/[email protected] ``` It is similar to what TheBloke wrote, but everything is handled by pip. Here is how I'm specifying...

Hi, installation also fails if the CUDA toolkit version doesn't match the CUDA version PyTorch is compiled with. In such cases, is it possible to disable building the extension and log...
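The mismatch described above can be detected before attempting the build by comparing the toolkit version (e.g. parsed from `nvcc --version` output) with `torch.version.cuda`. A minimal sketch of such a check; the function name is hypothetical and only `major.minor` is compared, on the assumption that patch-level differences are harmless:

```python
def cuda_versions_match(toolkit_version: str, torch_cuda_version: str) -> bool:
    """Compare major.minor of the installed CUDA toolkit with the CUDA
    version PyTorch was built against (torch.version.cuda).

    Example inputs: toolkit_version="11.8.0" (from `nvcc --version`),
    torch_cuda_version="11.8" (from `torch.version.cuda`).
    """
    def major_minor(version: str) -> tuple:
        return tuple(version.split(".")[:2])

    return major_minor(toolkit_version) == major_minor(torch_cuda_version)
```

A build script could call this up front and fall back to a CPU-only install (with a logged warning) instead of failing partway through compilation.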

> However I am not sure if that is definitely always the problem? Because on that system above, I can install auto-gptq no problem? On my system I get the...

I was also thinking of suggesting something similar. For example, the `llama_sample_*` functions can be reused by other models. The author might already have similar plans: https://github.com/ggerganov/ggml/pull/145#issuecomment-1544733902

Hey, saw this just now. I just created #246 for handling special tokens 😅