StackExchange.Redis
High Memory usage from StackExchange.Redis.RawResult[]
We're seeing high memory usage in production to the point where we have to cycle the instances. Looking at memory dumps, I'm seeing that StackExchange.Redis.RawResult[] is by far the biggest consumer of memory. For example, dumpheap -stat shows:
Count   Memory          Type
5738    1,316,297,200   StackExchange.Redis.RawResult[]
We are running within a Docker container on an Azure App Service P1V3 instance, which provides 8 GB of memory, but 1.3 GB for RawResult[] seems a bit high?
Well, without detailed analysis it is hard to speculate about what might be happening there, but by coincidence I started work today on an overhaul of the transport layer, so I will add this data/feedback to the factors to consider and test as that work happens, so we can try to make sure we aren't doing anything too weird in the future.
@mgravell Any pointers on what I could look for?
Not really. As I say: I'm literally tearing the guts out of that code right now, so I'm not going to propose any interim investigation or changes - the code has been "as is" for quite a few years, so I don't think a sudden set of effort right before another in-progress set of related changes is a good idea, honestly.
Hi @mgravell, just want to follow up on the issue described in the original question. I have the same problem: my service's memory usage is continuously growing, and the main culprit so far is StackExchange.Redis.RawResult[] instances.
Any feedback and/or workarounds on this would be greatly appreciated. Thanks in advance :)
@combrinckd It's likely something is holding onto those - your best bet is a memory dump and seeing what's holding onto those array references for so long.
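For reference, the rough workflow for that (using the dotnet-dump global tool and the same SOS commands that show up later in this thread) is something like:
dotnet-dump collect -p <pid>        # capture a dump of the running process
dotnet-dump analyze <dump-file>     # open it with the built-in SOS commands
> dumpheap -stat                    # per-type counts/sizes; note the RawResult[] method table (MT)
> dumpheap -mt <method-table>       # list the individual RawResult[] instances
> gcroot <address>                  # see what, if anything, is keeping a given array alive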
@ShaneCourtrille are you using JsonSerializer and creating a new JsonSerializerOptions instance for every request? I had the same issue; once I moved JsonSerializerOptions to a private static field, the issue was resolved.
private static JsonSerializerOptions defaultSerializerSettings = new JsonSerializerOptions()
{
    PropertyNamingPolicy = null,
    AllowTrailingCommas = true,
    DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull,
    ReferenceHandler = ReferenceHandler.IgnoreCycles
};
/// <summary>
/// Serializes <paramref name="value"/> to UTF-8 JSON and stores it in the cache with default entry options.
/// </summary>
/// <typeparam name="T">Type of the value being cached.</typeparam>
/// <param name="cache">The distributed cache to write to.</param>
/// <param name="key">The cache key.</param>
/// <param name="value">The value to serialize and store.</param>
/// <param name="cancellationToken">Token used to cancel the operation.</param>
/// <returns>A task that completes when the value has been written to the cache.</returns>
public static Task SetAsync<T>(this IDistributedCache cache, string key, T value, CancellationToken cancellationToken = default)
{
    return SetAsync(cache, key, value, new DistributedCacheEntryOptions(), cancellationToken);
}
/// <summary>
/// Serializes <paramref name="value"/> to UTF-8 JSON and stores it in the cache with the supplied entry options.
/// </summary>
/// <typeparam name="T">Type of the value being cached.</typeparam>
/// <param name="cache">The distributed cache to write to.</param>
/// <param name="key">The cache key.</param>
/// <param name="value">The value to serialize and store.</param>
/// <param name="options">Cache entry options (expiration, sliding window, etc.).</param>
/// <param name="cancellationToken">Token used to cancel the operation.</param>
/// <returns>A task that completes when the value has been written to the cache.</returns>
public static Task SetAsync<T>(this IDistributedCache cache, string key, T value, DistributedCacheEntryOptions options, CancellationToken cancellationToken = default)
{
    // Reuse the shared static options so no new JsonSerializerOptions is allocated per call.
    var bytes = JsonSerializer.SerializeToUtf8Bytes(value, defaultSerializerSettings);
    return cache.SetAsync(key, bytes, options, cancellationToken);
}
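For context, a typical call site (the cache instance, key, and payload type here are purely illustrative) just reuses the single static options instance on every request rather than allocating a new one:
// Hypothetical usage: "cache" is the app's IDistributedCache (e.g. registered via
// AddStackExchangeRedisCache) and OrderSummary is just a sample payload type.
public sealed record OrderSummary(int Id, decimal Total);

await cache.SetAsync("orders:12345", new OrderSummary(12345, 99.95m), cancellationToken);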
@mgravell I have the same issue: memory just goes up until the service crashes, and I am only using it to persist the data protection key. I am using it for different reasons in other services and all of them have the same issue - memory goes up, slowly but steadily. I have a high volume of requests across all of the services.
builder.Services.AddDataProtection()
    .PersistKeysToStackExchangeRedis(
        ConnectionMultiplexer.Connect("localhost:6379"),
        $"data-protection:some-service-name");
just to compare memory usage with data protection caching and without
@cholexa We can't see any of that code though - from the issue here we only know "something is being done with a connection", which makes it really hard to say anything beyond that. I'd imagine references are being held onto somewhere, given that you have completely linear generation graduation; the best option is to take a memory dump anywhere on that upward line and see how things are rooted.
@NickCraver What I am saying is that it's not even my implementation; I am following Microsoft's suggested approach to store a key in Redis.
this is a package: https://www.nuget.org/packages/Microsoft.AspNetCore.DataProtection.StackExchangeRedis and this is an implementation of how the package uses the Database from ConnectionMultiplexer: https://github.dev/dotnet/aspnetcore/blob/main/src/DataProtection/StackExchangeRedis/src/RedisXmlRepository.cs#L16
@cholexa Gotcha - this would be an issue for that repo I'm pretty sure (don't have a repro here), but ultimately: memory dump saying where things are rooted is the best path for quickly seeing what's causing this.
Looking at that repo (first time hearing about this package, so all new): I see a lot of places that could be building references with escrows and such, and I bet some unexpected path in there is building from one of the many singletons involved in services. If you have a memory dump and can find where things are rooted, I'm happy to help on an issue here, but right now all I can see is someone using our library and then holding onto refs in some unexpected ways, and I have no expertise in that other library.
@NickCraver I have a dump file I'm analyzing with an insane amount of memory usage from RawResult[], but every one I gcroot comes back as no root. It takes a while for each gcroot check, so I'm still slogging through it to see if I can find any. These are the last 3 entries of dumpheap -stat, which show where most of our memory is going and how crazy RawResult[] has gotten.
MT                Count     TotalSize      Class Name
00007fd4f34f7a60  3034398   253215118      System.String
0000555b3607f310  163150    355941952      Free
00007fd4fba45990  4852      1113048800     StackExchange.Redis.RawResult[]
Running dumpheap -stat -dead though returns this..
MT                Count     TotalSize      Class Name
00007fd4f3538ce0  19124     7935829        System.Byte[]
00007fd4f34f7a60  183342    10002854       System.String
00007fd4fba45990  4840      1110296000     StackExchange.Redis.RawResult[]
And running dumpgen loh returns this..
00007fd4fba45990 4852 1113048800 StackExchange.Redis.RawResult[]
So in this latest case it looks like the arrays are just sitting around waiting for the garbage collector to decide that the LOH needs to be collected.
I'm thinking these arrays should probably be pooled in some manner?
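If the arrays really are dead, as the -dead output suggests, one way to confirm this is just a deferred LOH collection rather than a leak is to force a compacting full collection from a diagnostics endpoint and see whether the memory is released. A minimal sketch using standard BCL APIs (nothing specific to this library):
using System;
using System.Runtime;

// Diagnostics only: request LOH compaction on the next blocking gen-2 collection, then force one.
// If the RawResult[] memory disappears afterwards, it was dead but simply not yet collected.
GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce;
GC.Collect(2, GCCollectionMode.Forced, blocking: true, compacting: true);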
@mgravell, @NickCraver Any update on this? We continue to reboot containers weekly and it sounds like others (@cholexa and @combrinckd) are having similar problems. There are two updates from @ShaneCourtrille that have gone unanswered.
I haven't had a chance to look yet. I want to, but I don't have infinite time. I'll make sure it gets into our active backlog so we take a look soon.
@mgravell Do you know if the size of the RawResult[] arrays (229400 in our case) correlates to the size of the data? The data being stored in Redis is from a 3rd party, so I'm wondering if the documents are just too large?
@randaratceridian @ShaneCourtrille Not sure whether it helps you, but in my case it was not the library; some other implementation was holding a reference to the object that was using Redis and it was never disposed, so I managed to find the root problem in the end... it was not StackExchange.Redis.
For now I am using it, storing around 100M+ keys with around 3-4K requests per second, and the issue never appears.
I would suggest looking deeper into other areas that may be holding references to the connection or the results.
We identified a problem with too many connection multiplexer instances being created. Now that we've fixed that we're seeing more acceptable RawResult[] #'s/memory usage.
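For anyone hitting the same thing: the usual guidance is one ConnectionMultiplexer per application, shared everywhere. A rough sketch of the direction we took (connection string, key name, and registration style are illustrative, not our exact code):
// One ConnectionMultiplexer for the whole application; every consumer shares this instance.
var redis = ConnectionMultiplexer.Connect("localhost:6379");
builder.Services.AddSingleton<IConnectionMultiplexer>(redis);

// Reuse that same multiplexer for data protection instead of calling Connect() again here.
builder.Services.AddDataProtection()
    .PersistKeysToStackExchangeRedis(redis, "data-protection:some-service-name");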
Hi @ShaneCourtrille. Can you share more info about the fix? I have the same leak, but in ConnectionMultiplexer.OnHeartbeat.
Here's a dump of memory
Large Object Heap: code that allocates a lot of memory in LOH
Allocated object type: RawResult[]
Last observation: 04.12.2023 16:07  dotnet  Allocated size: 43,3 MB
Max observation in run history: 02.12.2023 00:09  dotnet  Allocated size: 135,4 MB
at GC.AllocateUninitializedArray(int, bool)
at SharedArrayPool<Byte>.Rent(int)
at ArrayPoolAllocator<RawResult>.Allocate(int) in //src/Pipelines.Sockets.Unofficial/Arenas/Allocator.cs:line 51 column 16
at Arena<RawResult>.AllocateAndAttachBlock(Block) in //src/Pipelines.Sockets.Unofficial/Arenas/ArenaT.cs:line 190 column 13
at Arena<RawResult>..ctor(ArenaOptions, Allocator, int) in //src/Pipelines.Sockets.Unofficial/Arenas/ArenaT.cs:line 184 column 13
at Arena<RawResult>..ctor(ArenaOptions, Allocator) in //src/Pipelines.Sockets.Unofficial/Arenas/ArenaT.cs:line 156 column 15
at PhysicalConnection..ctor(PhysicalBridge) in //src/StackExchange.Redis/PhysicalConnection.cs:line 1817 column 9
at PhysicalBridge.TryConnect(ILogger) in //src/StackExchange.Redis/PhysicalBridge.cs:line 1398 column 29
at PhysicalBridge.OnHeartbeat(bool) in //src/StackExchange.Redis/PhysicalBridge.cs:line 634 column 29
at ServerEndPoint.OnHeartbeat() in //src/StackExchange.Redis/ServerEndPoint.cs:line 768 column 21
at ConnectionMultiplexer.OnHeartbeat() in //src/StackExchange.Redis/ConnectionMultiplexer.cs:line 1070 column 21
at ConnectionMultiplexer+TimerToken+<>c.<.cctor>b__14_0(Object) in //src/StackExchange.Redis/ConnectionMultiplexer.cs:line 954 column 21
at Thread.StartCallback()
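Given that stack, each physical (re)connect allocates a new RawResult arena, so if OnHeartbeat keeps appearing it may be worth checking whether the connection is flapping. A small sketch (standard StackExchange.Redis events; the endpoint is illustrative) to log reconnect churn:
// Watch for connection churn: each ConnectionFailed/ConnectionRestored pair means the heartbeat
// reconnected, and every new physical connection allocates a fresh RawResult arena.
var muxer = ConnectionMultiplexer.Connect("localhost:6379");
muxer.ConnectionFailed += (_, e) =>
    Console.WriteLine($"Redis connection failed: {e.EndPoint} {e.FailureType} {e.Exception?.Message}");
muxer.ConnectionRestored += (_, e) =>
    Console.WriteLine($"Redis connection restored: {e.EndPoint} {e.FailureType}");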