whisper.cpp
iOS App loading model memory usage issue
When I run a model on iOS, the app's memory usage becomes larger than expected. For an app that needs to run for a long time, a large model increases the probability of an OOM kill, even though I am already using models under 100 MB.
Are there any ways to reduce memory usage? For example, which parameters or models can I configure to lower memory usage while affecting the results as little as possible? Thank you very much!
A related issue from another demo
The first whisper initialization takes too long; try loading it asynchronously:
dispatch_queue_t globalConcurrentQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
dispatch_async(globalConcurrentQueue, ^{
    // Initialize whisper off the main thread so the UI stays responsive.
    [self initWhisper:[index intValue]];
});
I also encountered the same problem: memory usage keeps increasing as whisper.cpp transcribes in the iOS Simulator. I haven't tested this on an iOS device yet. Any tips?