grpc-go
For unary methods, received messages are put into a buffer from the pool, but the buffer is never returned to the pool.
What version of gRPC are you using?
1.57.0
What version of Go are you using (go version)?
1.20.6
What operating system (Linux, Windows, …) and version?
MacOS
What did you do?
Added buffer pool support to the server, expecting the number of allocations in the application to decrease.
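For context, a minimal sketch of the kind of setup involved, assuming the experimental grpc.RecvBufferPool server option and grpc.NewSharedBufferPool added in v1.57.0 (service registration omitted):

```go
package main

import (
	"log"
	"net"

	"google.golang.org/grpc"
)

func main() {
	lis, err := net.Listen("tcp", ":50051")
	if err != nil {
		log.Fatalf("failed to listen: %v", err)
	}

	// Take receive buffers from the built-in shared pool instead of
	// allocating a fresh slice for every incoming message.
	srv := grpc.NewServer(grpc.RecvBufferPool(grpc.NewSharedBufferPool()))

	// Service registration omitted, e.g. pb.RegisterGreeterServer(srv, &server{}).

	if err := srv.Serve(lis); err != nil {
		log.Fatalf("failed to serve: %v", err)
	}
}
```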
What did you expect to see?
I expected to see a decrease in allocations per request (processUnaryRPC -> recvAndDecompress -> recvMsg).
What did you see instead?
After adding the buffer pool, the number of allocations did not change (objects are always taken from the pool, but they are never returned to it anywhere; see line 616 in figure 2).
figure 1:
figure 2 (after adding the buffer pool):
How did you do that?
The buffer is retrieved from the pool here and is put back to the pool here.
@hueypark : Do you see any issues at all with the implementation? I remember that you wanted this functionality quite urgently. Have you been using this buffer pool with your application? How has your experience been? Thanks.
How did you do that?
I added a SharedBufferPool to my server and set breakpoints in the Get and Put methods. When debugging, the application stopped in Get but never in Put. I then implemented my own buffer pool, a structure satisfying SharedBufferPool, and again set breakpoints in Get and Put. The result was the same: Put is never called.
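For illustration, a hypothetical counting wrapper (not the exact code from my repository) that satisfies grpc.SharedBufferPool and makes the Get/Put imbalance visible without a debugger:

```go
package poolcheck

import (
	"sync/atomic"

	"google.golang.org/grpc"
)

// CountingPool wraps the built-in shared pool and counts Get/Put calls.
type CountingPool struct {
	inner grpc.SharedBufferPool
	gets  atomic.Int64
	puts  atomic.Int64
}

func NewCountingPool() *CountingPool {
	return &CountingPool{inner: grpc.NewSharedBufferPool()}
}

func (p *CountingPool) Get(length int) []byte {
	p.gets.Add(1) // hit on every unary request
	return p.inner.Get(length)
}

func (p *CountingPool) Put(buf *[]byte) {
	p.puts.Add(1) // never reached for unary RPCs in v1.57.0
	p.inner.Put(buf)
}

// Stats reports how many buffers were handed out versus returned.
func (p *CountingPool) Stats() (gets, puts int64) {
	return p.gets.Load(), p.puts.Load()
}
```

Passing this via grpc.RecvBufferPool and printing Stats() after a burst of unary traffic shows gets growing while puts stays at zero.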
Please try to do the same and call any unary method (you can write one yourself). If there are problems, I can share my repository for you to debug.
Judging by Figure 1, I am not the only one who has faced such a problem.
@pstibrany Have you also encountered this problem? Can you describe it? Thank you!
@easwars This is where the buffer pool is passed in and where the buffer is taken from it, but nothing is ever returned to the pool (Put is never called).
I assume that after this function is called, the buffer could be returned to the pool.
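As a rough sketch of that idea (illustrative only, not the actual grpc-go code; decodeAndRecycle and its parameters are made up for the example), assuming the codec copies what it needs during Unmarshal, as the proto codec does:

```go
package sketch

import (
	"google.golang.org/grpc"
	"google.golang.org/grpc/encoding"
)

// decodeAndRecycle illustrates the idea: once the codec has unmarshaled the
// wire bytes into req, the pooled buffer is no longer needed by the unary
// handler and can be handed back via Put.
func decodeAndRecycle(pool grpc.SharedBufferPool, codec encoding.Codec, data []byte, req any) error {
	buf := pool.Get(len(data))
	copy(buf, data) // stand-in for the real receive/decompress step

	err := codec.Unmarshal(buf, req) // req now owns its own copies of the fields

	pool.Put(&buf) // the step that is currently missing on the unary path
	return err
}
```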
@easwars
@hueypark : Do you see any issues at all with the implementation? I remember that you wanted this functionality quite urgently. Have you been using this buffer pool with your application? How has your experience been? Thanks.
I had already solved the memory problem by reducing the number of RPC calls, so I hadn't introduced a buffer pool yet. And when I was developing this PR, I was mainly thinking about stream RPCs, so it might not work well with unary calls.
+1 for https://github.com/grpc/grpc-go/pull/5862#issuecomment-1582154054
@hueypark
...I was mainly thinking about stream RPCs, so it might not work well with unary calls.
You can add your own pool that holds its memory permanently, so it will still work well, unlike sync.Pool. The problem is that, with unary calls, nothing is ever returned to the pool at all, regardless of which pool is used: SharedBufferPool (provided by the maintainers) or a self-written one.
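For example, a minimal sketch of such a self-written pool (names are illustrative), backed by a fixed-size channel so its buffers are held for the lifetime of the process rather than being dropped by the GC like idle sync.Pool entries:

```go
package pool

import "google.golang.org/grpc"

// FixedPool keeps a fixed set of buffers in a channel for the lifetime of
// the process.
type FixedPool struct {
	buffers chan []byte
	size    int
}

var _ grpc.SharedBufferPool = (*FixedPool)(nil)

func NewFixedPool(count, size int) *FixedPool {
	p := &FixedPool{buffers: make(chan []byte, count), size: size}
	for i := 0; i < count; i++ {
		p.buffers <- make([]byte, size)
	}
	return p
}

func (p *FixedPool) Get(length int) []byte {
	if length > p.size {
		return make([]byte, length) // larger than pooled buffers, allocate directly
	}
	select {
	case buf := <-p.buffers:
		return buf[:length]
	default:
		return make([]byte, length) // pool exhausted
	}
}

func (p *FixedPool) Put(buf *[]byte) {
	if cap(*buf) < p.size {
		return // not one of ours
	}
	select {
	case p.buffers <- (*buf)[:p.size]:
	default: // pool already full, drop it
	}
}
```

Even with such a pool plugged in via grpc.RecvBufferPool, the unary path never calls Put, which is exactly the gap described above.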
@efremovmi
You can add your own pool that holds its memory permanently, so it will still work well, unlike sync.Pool. The problem is that, with unary calls, nothing is ever returned to the pool at all, regardless of which pool is used: SharedBufferPool (provided by the maintainers) or a self-written one.
I agree. Unfortunately, I won't be able to allocate time for this improvement in the immediate future. Would you consider creating a pull request based on your current findings?
@hueypark
I agree. Unfortunately, I won't be able to allocate time for this improvement in the immediate future. Would you consider creating a pull request based on your current findings?
Yes, I will try to do it.
Added a change: https://github.com/grpc/grpc-go/pull/6605
Seems like #6605 might be closed soon due to inactivity. @hueypark Are you interested in taking this up since you made the initial PR if you have some bandwidth?
Sure, I'll take a look and pick it up.
Any news? I want to minimize memory allocations, but I have both unary and streaming RPCs.
I approved #6766; we need a second review before merging it. Sorry for the delays.
Hi! Have you noticed any changes in memory consumption now that the change has been merged into the main branch?
@efremovmi The gRPC benchmarks (https://grafana-dot-grpc-testing.appspot.com/?orgId=1) unfortunately don't collect memory usage statistics. Maybe another user has some data they can share?
In that case, I will try to create a load test for my pet project in my free time and come back with the results.
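One lightweight way to compare runs, sketched here with runtime.ReadMemStats (loadTest is a placeholder for whatever RPC loop drives the test):

```go
package loadtest

import (
	"fmt"
	"runtime"
)

// measureAllocs runs the supplied RPC loop and reports how many heap objects
// and bytes were allocated while it ran.
func measureAllocs(loadTest func()) {
	var before, after runtime.MemStats
	runtime.GC()
	runtime.ReadMemStats(&before)

	loadTest()

	runtime.GC()
	runtime.ReadMemStats(&after)
	fmt.Printf("mallocs: %d, bytes: %d\n",
		after.Mallocs-before.Mallocs, after.TotalAlloc-before.TotalAlloc)
}
```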