`numericality` validator returns `valid?` as `true` regardless of value if a `default` option is provided
When using `store_attribute_register_attributes = true` together with a `numericality` validation, we run into a bug where the `default` option we pass when registering the attribute overrides the actual value we get from our store. Below is an example where the bug exposed itself:
```ruby
class Debug < ApplicationRecord
  self.store_attribute_register_attributes = true

  store_attribute :data, :num, :integer, default: 10

  validates :num,
            numericality: { only_integer: true, greater_than_or_equal_to: 0 }
end
```
And in a console, regardless of the value of `num`, the record would always report that it was valid:
```
dev@(main)> a = Debug.new
{
  :id => nil,
  :data => {
    "num" => 10
  },
  :created_at => nil,
  :updated_at => nil,
  :num => 10
}
dev@(main)> a.num = -1
-1
dev@(main)> a.valid?
true
dev@(main)> a.save!
true
dev@(main)> a_from_db = Debug.find(a.id)
{
  :id => 1,
  :data => {
    "num" => -1
  },
  :created_at => Sat, 26 Jul 2025 00:29:38.602588000 UTC +00:00,
  :updated_at => Sat, 26 Jul 2025 00:29:38.602588000 UTC +00:00,
  :num => -1
}
dev@(main)> a_from_db.valid?
true
dev@(main)> a_from_db.attributes
{
  "id" => 1,
  "data" => {
    "num" => -1
  },
  "created_at" => Sat, 26 Jul 2025 00:29:38.602588000 UTC +00:00,
  "updated_at" => Sat, 26 Jul 2025 00:29:38.602588000 UTC +00:00,
  "num" => 10
}
dev@(main)> a.attributes
{
  "id" => 1,
  "data" => {
    "num" => -1
  },
  "created_at" => Sat, 26 Jul 2025 00:29:38.602588000 UTC +00:00,
  "updated_at" => Sat, 26 Jul 2025 00:29:38.602588000 UTC +00:00,
  "num" => 10
}
dev@(main)>
```
As you can see from the output, there's a discrepancy between what the object instance thinks it has (which comes from the store accessors) and what is saved in `@attributes`. I believe the reason is that the attributes API kicks off its own default handling logic before the store accessors do, so the value held in `@attributes` ends up being the default.
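To make the mismatch easier to see in isolation, here is roughly what the individual reads look like for the record above (the comments are my reading of the console output, not a separately verified session):

```ruby
a = Debug.new
a.num = -1

a.num               # => -1  (store accessor, reads from the :data store)
a.data["num"]       # => -1  (the serialized store itself)
a.attributes["num"] # => 10  (the registered attribute still holds the default)
```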
This only seems to be an issue with the `numericality` validator, as it does some extra work to decide which value it should validate (rather than just checking the value it is handed directly):
```ruby
def prepare_value_for_validation(value, record, attr_name)
  return value if record_attribute_changed_in_place?(record, attr_name)

  came_from_user = :"#{attr_name}_came_from_user?"

  if record.respond_to?(came_from_user)
    if record.public_send(came_from_user)
      raw_value = record.public_send(:"#{attr_name}_before_type_cast")
    elsif record.respond_to?(:read_attribute)
      raw_value = record.read_attribute(attr_name)
    end
  else
    before_type_cast = :"#{attr_name}_before_type_cast"
    if record.respond_to?(before_type_cast)
      raw_value = record.public_send(before_type_cast)
    end
  end

  raw_value || value
end
```
When calling `valid?`, the method above receives the correct value in `value` (in this case, -1), but instead of falling back to that, it ends up calling `record.public_send(:"#{attr_name}_before_type_cast")`, which returns the default of 10. That raw value is what gets returned, and that's what we end up validating against.
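For completeness, this is roughly the path I believe the method takes for the record above; the intermediate values are my assumption from reading the code, not output I captured from a console:

```ruby
a = Debug.new
a.num = -1

# What the validator is handed as `value`:
a.num                  # => -1

# What prepare_value_for_validation prefers instead:
a.num_before_type_cast # => 10 (presumably the registered attribute's default)

# raw_value || value therefore evaluates to 10, so the
# greater_than_or_equal_to: 0 check passes and valid? returns true.
```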