
high memory usage after transcribing is done

HorayNarea opened this issue 1 year ago • 2 comments

When audapolis loads the language model it obviously has to use a lot of memory, but once transcription is done it looks like the model remains in memory:

[screenshot: memory usage stays high after transcription finishes]

Could this be unloaded (or is the high memory usage caused by something else)?

HorayNarea avatar Apr 28 '23 19:04 HorayNarea

Should be fairly easy to do by removing the model caching in https://github.com/audapolis/audapolis/blob/ff91a2c23c31a8179c3d12f348eb91431d0dfb2b/server/app/models.py#L119-L126. Feel free to open a PR, otherwise we'll do it once we get to it.
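For illustration, dropping the cache could look roughly like the sketch below. This is a hypothetical outline, not the actual code at the linked lines: `load_model` and `transcribe` are stand-in callables for whatever models.py really uses.

```python
# Hypothetical sketch: load the model for a single transcription instead of
# keeping it in a long-lived cache, so the last reference is dropped as soon
# as the request is finished.
import gc
from typing import Any, Callable

def transcribe_once(load_model: Callable[[str], Any],
                    transcribe: Callable[[Any, str], str],
                    model_name: str, audio_path: str) -> str:
    model = load_model(model_name)   # fresh load, no module-level cache
    try:
        return transcribe(model, audio_path)
    finally:
        del model                    # drop the last reference ...
        gc.collect()                 # ... and nudge the collector so the memory is freed promptly
```

How much of that memory is returned to the OS right away depends on how the underlying library allocates it, but without a cached reference the model can at least be collected.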

pajowu avatar Apr 28 '23 19:04 pajowu

Some more explanation of the current behaviour: we thought about how we want to handle model loading. Since loading the model takes a while, we didn't want to re-load it every time you open a new document. You might want to transcribe a number of smaller files, in which case model loading would make up a good portion of the total transcription time.

However, we fully agree that it's not reasonable behaviour for the model to stay in memory forever. I would be fine with simply removing it once the transcription is finished (which should also be very easy to implement). But I think a good compromise could also be to evict the model from memory after a certain amount of time (5 minutes? 15 minutes?).
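A rough sketch of what that time-based eviction could look like is below. `ModelCache` and the `load_model` callable are hypothetical names for this example, not audapolis' existing API: the last-used model stays cached, but is dropped once it has been idle for longer than the TTL.

```python
# Hedged sketch of the "evict after a timeout" compromise.
import threading
import time
from typing import Any, Callable

class ModelCache:
    def __init__(self, load_model: Callable[[str], Any], ttl_seconds: float = 15 * 60):
        self._load_model = load_model
        self._ttl = ttl_seconds
        self._lock = threading.Lock()
        self._model = None
        self._key = None
        self._last_used = 0.0

    def get(self, key: str) -> Any:
        """Return the cached model for `key`, loading it if needed."""
        with self._lock:
            if self._model is None or self._key != key:
                self._model = self._load_model(key)
                self._key = key
            self._last_used = time.monotonic()
            return self._model

    def evict_if_stale(self) -> None:
        """Drop the model if it has been idle for longer than the TTL."""
        with self._lock:
            if self._model is not None and time.monotonic() - self._last_used > self._ttl:
                self._model = None
                self._key = None
```

`evict_if_stale` would need to be called periodically, e.g. from a `threading.Timer` or wherever the server already runs background work.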

In conclusion: if you want to implement it quickly, just remove the caching code. If you want to spend some more time on it, feel free to explore other options.

pajowu avatar Apr 28 '23 20:04 pajowu