Oleh Kuznetsov


The issue persists with the new ggmlv3 quantized models. Tested with manticore-13B (https://huggingface.co/openaccess-ai-collective/manticore-13b). However, the evaluation time is now slightly reduced, since this new standard is faster and more...

Looks like the issue was a misunderstanding of the `n_batch` parameter of the LlamaCpp wrapper. The default value is 8, which is quite a small number if you want...
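For reference, a minimal sketch of raising `n_batch` when constructing the wrapper (assuming llama-cpp-python is installed; the model path below is only a placeholder):

```python
# Minimal sketch: raising n_batch on the LangChain LlamaCpp wrapper.
# Assumes llama-cpp-python is installed; the model path is a placeholder.
from langchain.llms import LlamaCpp

llm = LlamaCpp(
    model_path="./models/manticore-13b.ggmlv3.q4_0.bin",  # placeholder path
    n_ctx=2048,    # context window size
    n_batch=512,   # default is 8, which makes prompt evaluation very slow
)

print(llm("Summarize what n_batch controls in llama.cpp."))
```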

To ensure that the same issue won't appear in the future, I updated the demonstration notebook for the llama.cpp model integration (with #5344). The issue can now be closed.

Support for ChatQA would be really nice; the model seems to be quite useful.

I appreciate your work, sir. Looking forward to trying this out. Would be amazing to run local ggml models with guidance.

Is it possible to choose an OpenAI model for RAGAS? It seems like the code itself doesn't support this. Am I wrong?
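For context, this is the kind of interface I'd hope for: a hypothetical sketch, assuming a newer ragas version where `evaluate()` accepts an `llm` argument, langchain-openai is installed, and `OPENAI_API_KEY` is set.

```python
# Hypothetical sketch: explicitly passing a chosen OpenAI chat model into ragas.
# Assumes a ragas version whose evaluate() takes an `llm` argument; the tiny
# example dataset below is made up for illustration.
from datasets import Dataset
from langchain_openai import ChatOpenAI
from ragas import evaluate
from ragas.llms import LangchainLLMWrapper
from ragas.metrics import faithfulness

data = Dataset.from_dict({
    "question": ["What is the capital of France?"],
    "answer": ["Paris is the capital of France."],
    "contexts": [["Paris is the capital and largest city of France."]],
})

llm = LangchainLLMWrapper(ChatOpenAI(model="gpt-4o-mini"))  # the chosen OpenAI model
result = evaluate(data, metrics=[faithfulness], llm=llm)
print(result)
```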

Same issue here. WSL2. Python 3.12.5. Unable to install the package.

@I8dNLo Maybe this one will help. The error seems to also be present in Python 3.12: error: Failed to prepare distributions Caused by: Failed to fetch wheel: pystemmer==2.2.0.1 Caused by: Build backend...

Hello! Can confirm, this started happening after recent updates (updated from 177 to 179 today). LSP auto-completions for Python are essentially unusable with pylsp; they work as expected with pyright.