
Ability to get a list of loaded models and unload a model by request

Nyralei opened this issue 1 year ago · 4 comments

Nyralei · Aug 25 '24 13:08

https://github.com/mudler/LocalAI/blob/master/core/http/endpoints/localai/backend_monitor.go

Don't these endpoints already show which backends are loaded and allow them to be unloaded?

dave-gray101 · Aug 25 '24 19:08

> https://github.com/mudler/LocalAI/blob/master/core/http/endpoints/localai/backend_monitor.go
>
> Don't these endpoints already show which backends are loaded and allow them to be unloaded?

Thanks for pointing out /backend/shutdown, but it only works when the model name ends with ".bin" (see https://github.com/mudler/LocalAI/blob/master/core/services/backend_monitor.go#L42); otherwise it appends ".bin" to the model name. In my case the model name ends with ".gguf".
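
For illustration, here is a minimal Go sketch of the suffix behavior described above. The function names are hypothetical and this is not the actual LocalAI code, just the pattern the linked line implies and one way it could be made extension-agnostic:

package main

import (
    "fmt"
    "strings"
)

// normalizeModelName mimics the behavior described above (hypothetical sketch,
// not the actual LocalAI implementation): if the requested model name does not
// already end in ".bin", ".bin" is appended before the backend lookup, which
// breaks names like "gemma-2-27b-it-Q5_K_S.gguf".
func normalizeModelName(model string) string {
    if strings.HasSuffix(model, ".bin") {
        return model
    }
    return model + ".bin"
}

// extensionAgnostic shows one possible fix: accept any known model extension
// and only fall back to appending ".bin" for bare names. (Illustrative only.)
func extensionAgnostic(model string) string {
    for _, ext := range []string{".bin", ".gguf"} {
        if strings.HasSuffix(model, ext) {
            return model
        }
    }
    return model + ".bin"
}

func main() {
    fmt.Println(normalizeModelName("ggml-whisper-large-v3.bin"))  // ggml-whisper-large-v3.bin
    fmt.Println(normalizeModelName("gemma-2-27b-it-Q5_K_S.gguf")) // gemma-2-27b-it-Q5_K_S.gguf.bin (the reported problem)
    fmt.Println(extensionAgnostic("gemma-2-27b-it-Q5_K_S.gguf"))  // gemma-2-27b-it-Q5_K_S.gguf
}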

About the /backend/monitor endpoint: it doesn't show which models are currently loaded, it just returns some metrics, and only for the model specified in the request (which also has to end with ".bin"). I tried calling both endpoints with

{
    "model": "ggml-whisper-large-v3.bin"
}
  1. /backend/monitor responds with
{
    "state": 1,
    "memory": {
        "total": 53627682816,
        "breakdown": {
            "gopsutil-RSS": 681861120
        }
    }
}
  2. /backend/shutdown shuts down properly

With "model": "gemma-2-27b-it-Q5_K_S.gguf":

{
    "error": {
        "code": 500,
        "message": "backend gemma-2-27b-it-Q5_K_S.gguf.bin is not currently loaded",
        "type": ""
    }
}
{
    "error": {
        "code": 500,
        "message": "model gemma-2-27b-it-Q5_K_S.gguf.bin not found",
        "type": ""
    }
}
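
For anyone trying to reproduce this, here is a small self-contained Go client that makes the same calls. The base URL and port (http://localhost:8080) are assumptions about a default local setup, and the request bodies mirror the ones shown above:

package main

import (
    "bytes"
    "encoding/json"
    "fmt"
    "io"
    "net/http"
)

// call posts {"model": <name>} to a LocalAI endpoint and prints the raw
// response. The base URL/port is an assumption about a default local setup.
func call(endpoint, model string) {
    body, _ := json.Marshal(map[string]string{"model": model})
    resp, err := http.Post("http://localhost:8080"+endpoint, "application/json", bytes.NewReader(body))
    if err != nil {
        fmt.Println(endpoint, err)
        return
    }
    defer resp.Body.Close()
    data, _ := io.ReadAll(resp.Body)
    fmt.Printf("%s %q -> %s\n", endpoint, model, data)
}

func main() {
    // Succeeds because the model name ends with ".bin".
    call("/backend/monitor", "ggml-whisper-large-v3.bin")
    call("/backend/shutdown", "ggml-whisper-large-v3.bin")

    // Reproduces the errors above: ".bin" gets appended to the ".gguf" name.
    call("/backend/monitor", "gemma-2-27b-it-Q5_K_S.gguf")
    call("/backend/shutdown", "gemma-2-27b-it-Q5_K_S.gguf")
}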

Nyralei · Aug 25 '24 19:08

Thanks for the updated comment!

That sounds like a big bug to me - I'll see if I can investigate this soon.

dave-gray101 · Aug 25 '24 20:08

It seems that it shows the status of one particular requested model (https://github.com/mudler/LocalAI/blob/master/core/http/endpoints/localai/backend_monitor.go#L23C34-L23C39). Maybe listing all models should be a separate endpoint, or the existing one could accept something like '*' instead of a model name? A rough sketch of what such a response could look like is below.
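
As a purely hypothetical illustration of that suggestion (no such endpoint exists today, and these field names are just guesses modeled on the /backend/monitor response shown above), a list-style response could carry one entry per loaded model:

package main

import (
    "encoding/json"
    "fmt"
)

// loadedModel is a hypothetical response shape for a "list all loaded models"
// endpoint, roughly mirroring the per-model /backend/monitor output above.
type loadedModel struct {
    Model  string `json:"model"`
    State  int    `json:"state"`
    Memory struct {
        Total     uint64            `json:"total"`
        Breakdown map[string]uint64 `json:"breakdown"`
    } `json:"memory"`
}

func main() {
    m := loadedModel{Model: "ggml-whisper-large-v3.bin", State: 1}
    m.Memory.Total = 53627682816
    m.Memory.Breakdown = map[string]uint64{"gopsutil-RSS": 681861120}

    // The endpoint would return an array, one entry per loaded model.
    out, _ := json.MarshalIndent([]loadedModel{m}, "", "  ")
    fmt.Println(string(out))
}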

jokerosky · Sep 12 '24 14:09

This issue is stale because it has been open 90 days with no activity. Remove stale label or comment or this will be closed in 5 days.

github-actions[bot] · Jul 21 '25 02:07

This issue was closed because it has been stalled for 5 days with no activity.

github-actions[bot] · Jul 27 '25 02:07