Ati Sharma
### System Info

langchain 0.0.173
faiss-cpu 1.7.4
python 3.10.11
Void linux

### Who can help?

@hwchase17

### Information

- [ ] The official example notebooks/scripts
- [X] My own modified...
At the moment, an error in the executed program dumps the whole traceback to stdout using `print`. https://github.com/microsoft/guidance/blob/d6b855aa625677f806fc51ec7238d2a38df594ea/guidance/_program_executor.py#L104 If you're using a terminal-based program, or handle errors yourself, this is...
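A minimal sketch of the alternative being asked for: instead of `print`-ing the traceback unconditionally, let the caller supply an error handler or let the exception propagate. The class and parameter names here are hypothetical, not guidance's actual API.

```python
class ProgramExecutor:
    """Hypothetical sketch: surface errors to the caller instead of
    unconditionally print()-ing the traceback to stdout."""

    def __init__(self, on_error=None):
        # on_error: optional callback receiving the exception.
        # If None, the exception simply propagates to the caller.
        self.on_error = on_error

    def run(self, fn, *args, **kwargs):
        try:
            return fn(*args, **kwargs)
        except Exception as exc:
            if self.on_error is not None:
                self.on_error(exc)  # e.g. log it, show it in a TUI, ...
                return None
            raise  # terminal-based programs can handle it themselves
```

A terminal app would construct `ProgramExecutor()` with no handler and catch exceptions at its own top level; a notebook front-end could pass `on_error=display_error`.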
Hi. Nice project. I have made a similar tool (not knowing yours existed), [searchtobibtex](https://github.com/atisharma/searchtobibtex).

1. Yours is more nicely written (mine is in bash)
2. You may wish to copy/take...
Support the [min_p sampler](https://old.reddit.com/r/LocalLLaMA/comments/17vonjo/your_settings_are_probably_hurting_your_model_why/), which is implemented in ExLlamav2.
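For reference, the min-p idea itself is simple: keep only tokens whose probability is at least `min_p` times the top token's probability, then renormalise. A minimal pure-Python sketch (not ExLlamav2's implementation):

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def min_p_filter(probs, min_p=0.1):
    # Keep tokens with probability >= min_p * p_max, then renormalise.
    # The threshold scales with the model's confidence: a sharp
    # distribution prunes aggressively, a flat one keeps more options.
    threshold = min_p * max(probs)
    kept = [(i, p) for i, p in enumerate(probs) if p >= threshold]
    total = sum(p for _, p in kept)
    return [(i, p / total) for i, p in kept]
```

Sampling then draws from the filtered, renormalised distribution as usual.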
It's not clear from the documentation how to split VRAM over multiple GPUs with exllama.
I came across a strange error causing problems viewing the docstring of a module. Below is my setup and a minimal working example.

```bash
> python -V
Python 3.11.7
>...
```
Thanks for sharing this software. It would be helpful if you also posted a link to your MSc thesis, to help readers understand what you've done, find references, etc.
Support for the openjdk Docker images is [being dropped](https://hub.docker.com/_/openjdk) and Debian bullseye is now 'oldstable'. Do you have plans to move to Debian bookworm and a supported OpenJDK image provider?
There's no reason LLMs can't act as agents / players themselves. It would be fun to enable bots.