MsJamie
I can confirm this as well

```
$ ./nanominer -d
Detected 2 devices:
GPU 0  PCI 09:00.0  8111 MB  GeForce GTX 1070 Ti
GPU 1  PCI 0a:00.0  5934 MB  GeForce...
```
Side note: I just had some additional time to play around with this and found that it only works in the configuration file if you set the `devices=` line within the block...
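For anyone else hitting this, here's a minimal sketch of the layout that worked for me, shown via `cat`; the `[Ethash]` section name and the wallet value are just placeholders, not from my actual config. The point is only that `devices=` sits inside the algorithm block rather than at the top of the file:

```
$ cat config.ini
[Ethash]
wallet = YOUR_WALLET_ADDRESS
devices = 0,1
```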
This works as described for me, with a few epub files failing with `ERROR: Pandoc died with exitcode "64" during conversion: xmlns not in namespaces` and the Unexpected EOF on...
There seems to be a bug with llama-cpp-python 0.1.60, because I had the same issue on Ubuntu 22.04 as well. Try `llama-cpp-python==0.1.59`; that's the latest version I was able to use...
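In case it saves someone a search, pinning it back is just a normal pip install with the version spec (a sketch assuming you installed it with plain pip and no custom CMake flags):

```
$ pip uninstall llama-cpp-python
$ pip install llama-cpp-python==0.1.59
```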