with_advisory_lock
ActiveRecord anti-caching implementation causes out-of-memory error in JRuby
Recently, our production web environments began failing with OOM (out-of-memory) errors. Analyzing the heap dumps, we noticed a large number of 33-character strings, all prefixed with `t`. It turned out this was caused by our use of with_advisory_lock: the gem uses random strings generated by `WithAdvisoryLock::Base#unique_column_name` as aliases for the lock-acquisition result columns, and these aliases are then stored in `Java::ArjdbcUtil::StringCache` [1][2]. Since every lock attempt generates a fresh alias, the cache grows until the process runs out of memory.
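For context, here's a minimal sketch of what we believe is happening (the `unique_column_name` body below is an assumption inferred from the 33-character, `t`-prefixed strings in the heap dump, not copied from the gem):

```ruby
require 'securerandom'

# Assumed shape of WithAdvisoryLock::Base#unique_column_name: "t" plus
# 32 hex characters, which matches the 33-character strings in our heap dump.
def unique_column_name
  "t#{SecureRandom.hex}"
end

# Every lock attempt produces a statement with a never-before-seen column
# alias, so the JDBC adapter's string cache gains one entry per call:
3.times do
  puts "SELECT pg_try_advisory_lock(1234, 0) AS #{unique_column_name}"
end
```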
While this might be considered an issue in activerecord or activerecord-jdbc-adapter, it would be preferable for with_advisory_lock to use a gentler strategy to "Prevent AR from caching results improperly". Any thoughts?
Environment:
- rails 6.1.7
- jruby 9.3.15.0 (2.6.8) 2024-06-26 28bea01242 OpenJDK 64-Bit Server VM 25.312-b07 on 1.8.0_312-b07 +jit [arm64-darwin]
[1] https://github.com/jruby/activerecord-jdbc-adapter/blob/master/src/java/arjdbc/jdbc/RubyJdbcConnection.java
[2] https://github.com/jruby/activerecord-jdbc-adapter/blob/master/src/java/arjdbc/util/StringCache.java
For anyone looking for a workaround, here's a patch that fixes the issue for PostgreSQL:
```ruby
module WithAdvisoryLock
  class PostgreSQL < Base
    # Prevent column name aliasing from ending up in the
    # `Java::ArjdbcUtil::StringCache` and causing out-of-memory issues.
    # See: https://github.com/ClosureTree/with_advisory_lock/issues/106
    def execute_successful?(pg_function)
      # Strip comment delimiters from the lock name so it can't break out
      # of the SQL comment below.
      comment = lock_name.gsub(/(\/\*)|(\*\/)/, '--')
      sql = "SELECT #{pg_function}(#{lock_keys.join(',')}) /* #{unique_column_name}; #{comment} */"
      result = connection.select_value(sql)
      # MRI returns 't', jruby returns true. YAY!
      (result == 't' || result == true)
    end
  end
end
```
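To apply the workaround, load the patch after the gem, e.g. from an initializer (the file name config/initializers/with_advisory_lock_oom_patch.rb is just a suggestion). The idea is to keep the result column name constant (it becomes the function name, e.g. pg_try_advisory_lock), so nothing new enters the adapter's string cache, while the unique string and lock name move into a SQL comment, keeping each statement textually distinct so ActiveRecord still can't cache the result improperly.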