Results: 2 issues by Mehmet

Currently, LLM-VM does not support multi-GPU setups. Using runpod, I rented a setup with 2 RTX 3090 GPUs. While running the local Bloom model example from the [docs](https://anarchy.ai/get_started/quickstart/completions), I...
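
For reference, the quickstart completion flow the issue refers to looks roughly like the sketch below. This is a minimal sketch assuming the `llm_vm.client.Client` API shown in the LLM-VM README; the prompt string is illustrative, not taken from this issue.

```python
# Minimal sketch (assumption: llm_vm.client.Client API as shown in the LLM-VM README).
from llm_vm.client import Client

# Select the local Bloom model as the completion model.
client = Client(big_model='bloom')

# Run a completion. Per the report above, this only drives a single GPU,
# so the second RTX 3090 on the rented machine goes unused.
response = client.complete(prompt='What is Anarchy?', context='')
print(response)
```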

bug
feat/enhancement
HIGH-PRIORITY
improvement

**Describe the bug** I ran into a bug while trying to use [Bend](https://github.com/HigherOrderCO/Bend). While trying to run Bend in "cuda mode" I get the following error: ``` Error reading result...
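
For context, "cuda mode" here presumably means Bend's CUDA runtime, invoked with the `bend run-cu` subcommand from the Bend README. Below is a minimal sketch of reproducing such a run from Python; the `example.bend` file name is hypothetical and the wrapper is illustrative only.

```python
# Minimal sketch: invoking Bend's CUDA runtime ("cuda mode") on a program.
# Assumptions: the "bend run-cu" subcommand from the Bend README and a
# hypothetical example.bend file in the working directory.
import subprocess

proc = subprocess.run(
    ["bend", "run-cu", "example.bend"],
    capture_output=True,
    text=True,
)
print(proc.stdout)
# On the failing setup described in this report, the "Error reading result..."
# message quoted above would show up in the captured output.
print(proc.stderr)
```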