
fix: gpu fetch device info

Open · sozercan opened this pull request 1 year ago • 3 comments

Description

This PR fixes #2401

It seems the PCI device info from the ghw library relies on the local filesystem by default; I'm not sure why that isn't working properly inside a container. This PR allows fetching the device info from the network instead: https://github.com/jaypipes/pcidb/blob/d9773c605ac44c478e0ee7e322f31eaa32615010/README.md?plain=1#L21-L26

Notes for Reviewers

I have not tested this with the LocalAI container, but it works for AIKit: https://github.com/sozercan/aikit/actions/runs/9231349393/job/25401086746#step:12:11

Signed commits

  • [x] Yes, I signed my commits.

sozercan avatar May 25 '24 00:05 sozercan

Deploy Preview for localai canceled.

Latest commit: a925e8834b3c01092e3573d53dd4a61763736272
Latest deploy log: https://app.netlify.com/sites/localai/deploys/665132c606ebf400088aec2d

netlify[bot] avatar May 25 '24 00:05 netlify[bot]

Deploy Preview for localai canceled.

Latest commit: 71724ed40d10bdfb1f2c2b09aee1a9a4399a8014
Latest deploy log: https://app.netlify.com/sites/localai/deploys/6652655d25322a00082a985d

netlify[bot] avatar May 25 '24 00:05 netlify[bot]

I wonder if we really need to detect the GPU in containers with CUDA, as we build those containers directly with BUILD_TYPE=cublas: all the binaries produced should already be ready to offload to the GPU, and the llama-cpp-cuda binary should actually be missing.

Besides, that would leave airgapped environments out in the cold: what would happen if there is no network?

mudler avatar May 25 '24 07:05 mudler
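The objection above rests on the CUDA images having the GPU backend compiled in at build time. A sketch of what that looks like, assumed from LocalAI's standard build flags rather than quoted from this thread:

```shell
# Build LocalAI with the cuBLAS (CUDA) backend compiled in; binaries
# produced this way are already able to offload to the GPU, without
# needing runtime device detection.
make BUILD_TYPE=cublas build
```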

Found a better way to handle this with the pciutils package

sozercan avatar May 25 '24 22:05 sozercan
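The pciutils approach works because the package ships the PCI ID database onto the filesystem, so device names can be resolved locally even in airgapped environments. A sketch of what the container change might look like (Debian-style; the actual Dockerfile line from the PR is not shown in this thread):

```shell
# Sketch: installing pciutils in a Debian-based image ships the PCI ID
# database (pci.ids) into the container, so ghw/pcidb can resolve device
# names from the local filesystem without any network access.
apt-get update && apt-get install -y pciutils
# Inspect detected devices, with numeric [vendor:device] IDs:
lspci -nn
```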

Found a better way to handle this with the pciutils package

nice! good catch @sozercan!

mudler avatar May 26 '24 07:05 mudler