Yorkie

9 comments by Yorkie

@iEddie-cmd gfx906 is not supported by the ROCm runtime in LM Studio. gfx1030, gfx1100, and gfx1101 are the supported AMD GPUs for ROCm in LM Studio; you will need to...

@gbschenkel The 6800 XT should work with ROCm. Can you open a new issue at https://github.com/lmstudio-ai/lmstudio-bug-tracker with the following LM Studio screenshots: the Hardware page (Ctrl + Shift + H) and the Runtimes...

@timtak the `AMD Phenom II X6 1060T` CPU is quite old and lacks AVX2 instructions, which are a hard requirement for running models in LM Studio: https://lmstudio.ai/docs/system-requirements
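For anyone unsure whether their CPU meets this requirement, a quick way to check on Linux is to look at the feature flags the kernel reports (a minimal sketch; on other operating systems the CPU's spec sheet is the simplest reference):

```shell
# The CPU's feature flags are listed in /proc/cpuinfo on Linux;
# no "avx2" flag means the CPU cannot run models in LM Studio.
if grep -q avx2 /proc/cpuinfo; then
  echo "AVX2 supported"
else
  echo "AVX2 not supported"
fi
```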

@ibehnam `lms get` might be what you're looking for: https://lmstudio.ai/docs/cli/get

@julien-c that would be nice too. It would be nice to be able to "serve on local network", take the address, put it into my mobile browser, and have a chat...

@oliverbmenken What macOS version are you using? For those stumbling across this post later, macOS 14.0 or newer is required for MLX models: https://lmstudio.ai/docs/system-requirements#macos

@htsyclob there are no in-app settings for multi-GPU configuration yet; you will need to look up `CUDA_VISIBLE_DEVICES`
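A minimal sketch of how `CUDA_VISIBLE_DEVICES` works: set it in the shell before launching the application, and only the listed GPU indices are visible to that process (the index `0` here is illustrative; list your actual GPUs with `nvidia-smi -L`):

```shell
# Expose only the first GPU (index 0) to processes launched from this shell.
# Comma-separate indices to expose several, e.g. "0,1".
export CUDA_VISIBLE_DEVICES=0
echo "$CUDA_VISIBLE_DEVICES"
```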

Should be solved by: https://github.com/lmstudio-ai/lms/pull/250

+1 Would be cool to see more TTS options in llama.cpp