[Bug] Exception thrown when serializing NameError exceptions using Psych
Steps to reproduce:

```ruby
begin
  abc
rescue Exception => ex
  converter = AWS::Flow::YAMLDataConverter.new
  converter.load(converter.dump(ex))
end
```
Output:

```
TypeError: allocator undefined for NameError::message
from /home/arkadiy/.rvm/rubies/ruby-2.0.0-p353/lib/ruby/2.0.0/psych/visitors/to_ruby.rb:303:in `allocate'
from /home/arkadiy/.rvm/rubies/ruby-2.0.0-p353/lib/ruby/2.0.0/psych/visitors/to_ruby.rb:303:in `revive'
from /home/arkadiy/.rvm/rubies/ruby-2.0.0-p353/lib/ruby/2.0.0/psych/visitors/to_ruby.rb:178:in `visit_Psych_Nodes_Mapping'
from /home/arkadiy/.rvm/rubies/ruby-2.0.0-p353/lib/ruby/2.0.0/psych/visitors/visitor.rb:15:in `visit'
from /home/arkadiy/.rvm/rubies/ruby-2.0.0-p353/lib/ruby/2.0.0/psych/visitors/visitor.rb:5:in `accept'
from /home/arkadiy/.rvm/rubies/ruby-2.0.0-p353/lib/ruby/2.0.0/psych/visitors/to_ruby.rb:20:in `accept'
from /home/arkadiy/.rvm/rubies/ruby-2.0.0-p353/lib/ruby/2.0.0/psych/visitors/to_ruby.rb:210:in `block in visit_Psych_Nodes_Mapping'
from /home/arkadiy/.rvm/rubies/ruby-2.0.0-p353/lib/ruby/2.0.0/psych/visitors/to_ruby.rb:210:in `map'
from /home/arkadiy/.rvm/rubies/ruby-2.0.0-p353/lib/ruby/2.0.0/psych/visitors/to_ruby.rb:210:in `visit_Psych_Nodes_Mapping'
from /home/arkadiy/.rvm/rubies/ruby-2.0.0-p353/lib/ruby/2.0.0/psych/visitors/visitor.rb:15:in `visit'
from /home/arkadiy/.rvm/rubies/ruby-2.0.0-p353/lib/ruby/2.0.0/psych/visitors/visitor.rb:5:in `accept'
from /home/arkadiy/.rvm/rubies/ruby-2.0.0-p353/lib/ruby/2.0.0/psych/visitors/to_ruby.rb:20:in `accept'
from /home/arkadiy/.rvm/rubies/ruby-2.0.0-p353/lib/ruby/2.0.0/psych/visitors/to_ruby.rb:240:in `visit_Psych_Nodes_Document'
from /home/arkadiy/.rvm/rubies/ruby-2.0.0-p353/lib/ruby/2.0.0/psych/visitors/visitor.rb:15:in `visit'
from /home/arkadiy/.rvm/rubies/ruby-2.0.0-p353/lib/ruby/2.0.0/psych/visitors/visitor.rb:5:in `accept'
from /home/arkadiy/.rvm/rubies/ruby-2.0.0-p353/lib/ruby/2.0.0/psych/visitors/to_ruby.rb:20:in `accept'
from /home/arkadiy/.rvm/rubies/ruby-2.0.0-p353/lib/ruby/2.0.0/psych/nodes/node.rb:35:in `to_ruby'
from /home/arkadiy/.rvm/rubies/ruby-2.0.0-p353/lib/ruby/2.0.0/psych.rb:130:in `load'
from /home/arkadiy/code/cardspring/core/vendor/bundle/ruby/2.0.0/gems/aws-flow-1.0.5/lib/aws/decider/data_converter.rb:30:in `load'
from (irb):5:in `rescue in irb_binding'
from (irb):1
```
This is actually a bug in Psych (see https://github.com/tenderlove/psych/issues/85), but since it's been open for more than a year, it would be nice if the aws-flow gem worked around it.
Are there any known workarounds? I'm a little leery of special-casing a fix by, say, replacing NameError::message with a string, as it may then leave us unable to take a fix for https://github.com/tenderlove/psych/issues/85 without a breaking change. The most recent comment on the issue seems to suggest there is no workaround for more recent versions of Ruby (=~ 1.9.3p362).
After playing around with it, the only solutions I was able to get working were:

- Replacing NameError::message with a string, as you suggested.
- Creating a copy of the exception, which forces the message object on the exception to actually be a String. Essentially a safer version of the first option:
```ruby
begin
  abc
rescue Exception => ex
  ex_copy = ex.class.new(ex.message)
  ex_copy.set_backtrace(ex.backtrace)
  converter = AWS::Flow::YAMLDataConverter.new
  converter.load(converter.dump(ex_copy)) # No exception
end
```
- Serializing the exception's message and backtrace separately, the workaround from https://github.com/JoshCheek/seeing_is_believing/issues/11#issuecomment-22396617:
```ruby
class YAMLDataConverter
  def dump(object)
    # ...
    return YAML.dump_stream(object.class.name, object.message, object.backtrace)
    # ...
  end

  def load(source)
    # ...
    klass, message, backtrace = YAML.load_stream(source)
    exception = get_const(klass).new(message)
    exception.set_backtrace(backtrace)
    # ...
  end
end
```
The downside of that approach is that it requires extra special handling for exceptions that carry other attributes (like reason and details).
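To sketch how the serialize-separately approach might be extended to cover such attributes, here's a hypothetical round-trip that also dumps the exception's instance variables. The helper names (`dump_exception`, `load_exception`) are made up, and it assumes the extra attributes are plain YAML-friendly data and that the exception class can be constructed from the message alone:

```ruby
require 'yaml'

# Hypothetical sketch: dump class name, message, backtrace, and instance
# variables as separate YAML documents, sidestepping Psych's attempt to
# allocate internal objects like NameError::message.
def dump_exception(ex)
  extras = ex.instance_variables.each_with_object({}) do |ivar, h|
    h[ivar.to_s] = ex.instance_variable_get(ivar)
  end
  YAML.dump_stream(ex.class.name, ex.message, ex.backtrace, extras)
end

def load_exception(source)
  klass_name, message, backtrace, extras = YAML.load_stream(source)
  ex = Object.const_get(klass_name).new(message)
  ex.set_backtrace(backtrace)
  # Restore any extra attributes captured at dump time.
  extras.each { |name, value| ex.instance_variable_set(name, value) }
  ex
end
```

This still breaks for exception classes whose constructors require more than a message, so it's a sketch rather than a general fix.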
We have a couple of candidate fixes for this, one of which will hopefully go out soon. Just letting you know we're still working on it.
Great, thanks for the update!
Any update on this issue?