Russ Jones
@yinqiwen I updated to the latest version, but there was no improvement — still the same memory leak. The DB has A LOT of keys/hashes; each hash has probably 48 values in it...
@yinqiwen I tried switching to hGetAll to see if fetching all the raw data at once would be faster and incur less overhead, but no luck.
I tried setting redis-compatible-mode to yes, since I read in some other threads that this setting could cause memory issues. Unfortunately that did not do the trick; if anything it hastened...
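For anyone following along, the setting mentioned above is a line in ardb's configuration file; a minimal sketch (the file path and comment are assumptions based on the server output later in this thread, which shows config_file:/install/ardb/ardb.conf):

```
# /install/ardb/ardb.conf (path assumed; adjust to your install)
# Toggle the Redis-compatibility behavior discussed above.
redis-compatible-mode    yes
```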
First, thank you so much for your help!
1. I removed the pipelining and still had the problem.
2. The keys are random.
3. I am using the latest from...
Woohoo! Your changes seem to have paused the growth at somewhere around 58% memory! Here is the `info all` output... =======================
# Server
ardb_version:0.10.0
redis_version:2.8.0
engine:rocksdb
ardb_home:/install
os:Linux 3.13.0-98-generic x86_64
gcc_version:4.8.4
process_id:18207
run_id:2bc2903a5bf449dc3eab36ab947a185db400a24b
tcp_port:6379
listen:0.0.0.0:6379
uptime_in_seconds:24
uptime_in_days:0
executable:/install/ardb/src/ardb-server
config_file:/install/ardb/ardb.conf

# Databases
data_dir:/data/rocksdb
used_disk_space:107968311059
living_iterator_num:0
rocksdb_version:5.14.2
rocksdb.block_table_usage:0
rocksdb.block_table_pinned_usage:0
rocksdb_memtable_total:117442128
rocksdb_memtable_unflushed:117442128
rocksdb_table_readers_total:0
rocksdb.estimate-table-readers-mem:0...