elkay
Same as previous person, except on Amazon Linux. CUDA is installed correctly and available.

```
(textgen) [ec2-user@ip-10-0-1-88 vllm]$ python3 -c "import torch; print('Using device: ', torch.device('cuda' if torch.cuda.is_available() else 'cpu'))"
```
...
FYI, this process installs a non-CUDA build of torch by default. If you have CUDA and want to use it, you'll need to finish up by uninstalling torch and installing an...
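The torch swap described above can be sketched like this (a sketch only; the `cu121` index URL assumes CUDA 12.1, so pick the URL matching your installed CUDA version):

```shell
# Sketch: replace the default (CPU-only) torch with a CUDA build.
# Run inside the same virtualenv/conda env the project uses.
pip uninstall -y torch
pip install torch --index-url https://download.pytorch.org/whl/cu121
```

Afterwards, `python3 -c "import torch; print(torch.cuda.is_available())"` should print `True` on a working GPU host.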
I am having the same issue. Hoping this isn't a dead end for me and a bunch of work I'll have to throw away.
Ok, was able to fix the error. Not sure why everything was working fine without this until suddenly the @auth/express package was added, but after starting a new project and...
The project one. Didn't have to change anything else, it just started working after that.
> Oh bummer.
>
> Supports the following architectures.
>
> https://github.com/Anush008/fasterembed/tree/main/npm

Yeah, that's the issue I was reporting haha. No Linux ARM. So I guess there's no workaround right now?
Ok cool, thanks! :-) The AWS t4g instances are their modern EC2 offerings; they both cost less and outperform their predecessors, so it would be great to be able to...
Hey, just an FYI: when I see builds for those servers, it's usually labeled "aarch64" support. Not sure if there's a difference, maybe not, but thought I would clarify.
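For what it's worth, "arm64" and "aarch64" generally refer to the same 64-bit ARM architecture, just named differently by different platforms. A quick sketch for checking which label your machine reports:

```python
import platform

# platform.machine() reports the OS's name for the CPU architecture:
# Linux on 64-bit ARM typically reports "aarch64", macOS on Apple
# Silicon reports "arm64", and 64-bit x86 Linux reports "x86_64".
print(platform.machine())
```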
I was able to get this built using Python 3.9 in a Conda environment; however, I now cannot get it to install. Does this require Python 3.9 to run? Basically,...
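One likely explanation for the build/install mismatch: a wheel built under CPython 3.9 is tagged `cp39` in its filename, and pip will only install it on a matching interpreter (unless it was built as an abi3 or pure-Python wheel). A small sketch for printing the tag your current interpreter corresponds to:

```python
import sys

# A CPython-specific wheel carries a tag like "cp39" in its filename;
# pip refuses it on interpreters whose tag doesn't match. This prints
# the tag for the interpreter running the script.
tag = f"cp{sys.version_info.major}{sys.version_info.minor}"
print(tag)  # e.g. "cp39" on Python 3.9
```

If this prints something other than `cp39` in the environment you're installing into, that would explain the failure.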
> @elkay - I'm not sure I follow, are you able to use the much newer whl? If so, is there a reason that you are trying to install with...