Retrying within rescue will chain errors and fail to send to Sentry
Issue Description
I have a method that tries to perform an operation; if it fails, the rescue block retries it, up to 20 times, and only then reports the exception.
What ends up happening is that Sentry captures the chain of 20 exceptions, and each exception includes the entire stack trace from when it happened, producing an enormous payload. The SDK then tries to crop the stack traces, but fails: even after cutting each exception down to 250 frames, there are still over 1000 frames in total and the envelope is too large to send.
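A standalone sketch (no Sentry required) of why the chain grows: in Ruby, raising inside a rescue block implicitly links the exception currently being handled as the new exception's #cause, so 20 retries produce 20 linked exceptions, each carrying its own backtrace.

```ruby
# Recursive retry in the same shape as the reproduction below: every raise
# that happens while a rescue is active gets the handled exception as #cause.
def chain_errors(n)
  raise StandardError, "attempt #{n}"
rescue StandardError => e
  return e if n.zero?
  chain_errors(n - 1)
end

top = chain_errors(3)

# Walk the #cause chain, which is what an SDK also has to do when it
# serializes the exception: each retry adds one more linked exception.
chain_length = 0
err = top
while err.cause
  chain_length += 1
  err = err.cause
end
```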
Reproduction Steps
class Chainer
  def chain_errors(n = 20)
    raise StandardError.new("Been launched #{n} times")
  rescue StandardError => e
    if n == 0
      Sentry.capture_exception(e)
    else
      chain_errors(n - 1)
    end
  end
end
Then I launch a Ruby console and call Chainer.new.chain_errors, which fails to send the event, and Chainer.new.chain_errors 2, which succeeds.
Expected Behavior
I would expect Sentry to still deliver the event, either by dropping most of the chained exceptions or by cropping all stack traces further so the error would end up on Sentry (maybe this could be a setting). I would also expect that, if that fails, the SDK would log an error that explains the problem better.
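Until the SDK truncates more aggressively, one possible workaround is to report a copy of the exception with the cause chain stripped, so the envelope contains a single exception instead of the whole retry chain. This is only a sketch; without_cause is a hypothetical helper, not part of the Sentry API.

```ruby
# Hypothetical helper (not part of the Sentry SDK): rebuild the exception
# without its #cause chain.
def without_cause(error)
  # Assumes the error class's constructor accepts a message; a freshly
  # constructed exception has a nil #cause until it is raised.
  copy = error.class.new(error.message)
  copy.set_backtrace(error.backtrace) # keep the final backtrace
  copy
end

# Build a chained exception the same way the reproduction does.
begin
  begin
    raise StandardError, "inner failure"
  rescue StandardError
    raise StandardError, "final failure" # implicitly linked via #cause
  end
rescue StandardError => e
  chained = e
end

stripped = without_cause(chained)
```

In the reproduction above, the capture call would then become Sentry.capture_exception(without_cause(e)), at the cost of losing the earlier attempts' backtraces.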
Actual Behavior
Sentry gives up after removing the breadcrumbs and cropping the stack traces, and writes a message with little information about the error to the Rails logs: Envelope item [event] is still oversized after size reduction: {event_id: 34, level: 7, timestamp: 22, release: 48, environment: 12, server_name: 14, modules: 11411, message: 2, user: 2, tags: 52, contexts: 993, extra: 2, fingerprint: 2, transaction: 52, transaction_info: 17, platform: 6, sdk: 40, type: 7, threads: 94, exception: 283677}
Ruby Version
2.7.5
SDK Version
5.5.0
Integration and Its Version
No response
Sentry Config
No response
thx @joromero, we generally need some love on better truncation logic, will try to fit this in with the general plan.