Javed
There is now also the X version, see: https://vallex-demo.github.io/
I can't find it either; instead, _convert-lora-to-ggml.py_, _convert-pth-to-ggml.py_, and _convert.py_ are present
I ran some tests: the Python bindings of [https://github.com/abetlen/llama-cpp-python](https://github.com/abetlen/llama-cpp-python) are not the same as the Python bindings of GPT4All ([https://github.com/nomic-ai/pygpt4all](https://github.com/nomic-ai/pygpt4all)). That is, it is not possible to load a ggml...
You may want to have a look at [https://github.com/nomic-ai/gpt4all/issues/468](https://github.com/nomic-ai/gpt4all/issues/468)
Unfortunately, I am not able to get the subfigure displayed. I installed the extension with `pip install sphinx-subfigure` and modified the _config.yml as shown below: `sphinx: extra_extensions: - sphinx_subfigure`. However, I don't...
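For readability, the flattened snippet in that comment corresponds to the following _config.yml fragment (a sketch of a Jupyter Book configuration, reconstructed from the option names given in the comment):

```yaml
# _config.yml (Jupyter Book)
sphinx:
  extra_extensions:
    - sphinx_subfigure
```

Note that the extension is installed as `sphinx-subfigure` (hyphen) but listed in the configuration as `sphinx_subfigure` (underscore).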
The question has become irrelevant for me in the meantime. However, thank you
Just one comment from my side: I was using regular figures and subfigures in one file and ran into an issue. To resolve it, I had to modify the...
I was looking forward to using NLLB, but then saw: > All models are licensed under CC-BY-NC 4.0 If you have a look at [https://www.gnu.org/licenses/gpl-faq.en.html#GPLOutput](https://www.gnu.org/licenses/gpl-faq.en.html#GPLOutput), you will see: > In general...
Just to have it mentioned once: even though I am aware of [https://www.gnu.org/licenses/gpl-faq.en.html#GPLOutput](https://www.gnu.org/licenses/gpl-faq.en.html#GPLOutput), I still do not feel comfortable using the models. Obviously, the intention of Meta or the developer...
This is something I would also like to know.