
MiniCPM-V 2.6 memory leak occurred!!!

Status: Open · Liwx1014 opened this issue 11 months ago · 5 comments

Prerequisites

Please answer the following questions for yourself before submitting an issue.

  • [ ] I am running the latest code. Development is very rapid so there are no tagged versions as of now.
  • [x] I carefully followed the README.md.
  • [x] I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
  • [x] I reviewed the Discussions, and have a new bug or useful enhancement to share.

Expected Behavior

Memory usage should stay flat across repeated MiniCPM-V 2.6 inferences (no leak).

Current Behavior

I am testing minicpm-v-2.6. Here is the relevant part of my test code (the original screenshot is omitted; a rough sketch follows).
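This is only a reconstruction of the loop, not my exact script: the model paths, prompt, image files, and the `MiniCPMv26ChatHandler` arguments are placeholders/assumptions rather than my real values.

```python
import base64
from llama_cpp import Llama
from llama_cpp.llama_chat_format import MiniCPMv26ChatHandler

def image_to_data_uri(path: str) -> str:
    # Encode a local image as a base64 data URI for the chat handler.
    with open(path, "rb") as f:
        return "data:image/jpeg;base64," + base64.b64encode(f.read()).decode()

chat_handler = MiniCPMv26ChatHandler(clip_model_path="./mmproj-model-f16.gguf")
llm = Llama(
    model_path="./MiniCPM-V-2_6-Q4_K_M.gguf",  # placeholder path
    chat_handler=chat_handler,
    n_ctx=4096,
    n_gpu_layers=-1,
)

image_paths = ["./test1.jpg", "./test2.jpg"]  # placeholder images

for i in range(100):
    # A different image is sent on each call, so a new image embed is built every time.
    image_uri = image_to_data_uri(image_paths[i % len(image_paths)])
    response = llm.create_chat_completion(
        messages=[
            {
                "role": "user",
                "content": [
                    {"type": "image_url", "image_url": {"url": image_uri}},
                    {"type": "text", "text": "Describe this image."},
                ],
            }
        ],
    )
    print(i, response["choices"][0]["message"]["content"][:60])
```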

When I run this code, resident memory grows by roughly 10 MB per inference. Using memory_profiler, I traced the growth to this line: https://github.com/abetlen/llama-cpp-python/blob/2bc1d97c9672320828e70dc8293d5f8754682109/llama_cpp/llama_chat_format.py#L2839

(memory_profiler output screenshot omitted.)
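For reference, this is roughly how I confirmed the per-call growth (a sketch only; it reuses `llm` and `image_to_data_uri` from the loop above, and the paths are again placeholders):

```python
from memory_profiler import memory_usage

def run_one_inference(path: str) -> None:
    # One create_chat_completion call, same shape as in the loop above.
    llm.create_chat_completion(
        messages=[
            {
                "role": "user",
                "content": [
                    {"type": "image_url", "image_url": {"url": image_to_data_uri(path)}},
                    {"type": "text", "text": "Describe this image."},
                ],
            }
        ],
    )

for i in range(50):
    # Sample the current process RSS (in MiB) before and after each inference.
    rss_before = memory_usage(-1, interval=0.05, timeout=0.1)[0]
    run_one_inference(image_paths[i % len(image_paths)])
    rss_after = memory_usage(-1, interval=0.05, timeout=0.1)[0]
    print(f"iteration {i}: RSS {rss_before:.1f} -> {rss_after:.1f} MiB "
          f"({rss_after - rss_before:+.1f} MiB)")
```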

At first I thought the `embed` variable was not being released, so I added code to manually free it around line 2856, but the program then raised an error. I also tried replacing `embed` with `self._last_image_embed`; the memory leak still occurs.
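For completeness, this is roughly what my manual release attempt looked like (a sketch; `llava_image_embed_free` and the `_last_image_embed` / `_last_image_bytes` attributes are library internals, so I may be misusing them, and in my runs the leak persisted either way):

```python
import llama_cpp.llava_cpp as llava_cpp

def free_cached_image_embed(chat_handler) -> None:
    # Free the llava image embed the handler cached from the previous call,
    # then clear the cache attributes so the handler does not reuse a freed pointer.
    embed = getattr(chat_handler, "_last_image_embed", None)
    if embed is not None:
        llava_cpp.llava_image_embed_free(embed)
        chat_handler._last_image_embed = None
        chat_handler._last_image_bytes = None

# Called between inferences in my test loop:
# free_cached_image_embed(chat_handler)
```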

Environment and Context

Please provide detailed information about your computer setup. This is important in case the issue is not reproducible except under certain specific conditions.

Environment info:


Python 3.10.10
Ubuntu 18.04
CUDA Toolkit 12.1
llama-cpp-python 0.2.90

Liwx1014 · Dec 30 '24, 07:12