ObjC analysis crashes on AwemeCore binary
Discussed in https://github.com/Vector35/binaryninja-api/discussions/5656
Originally posted by dawn-breaking June 23, 2024. Binary Ninja commercial stable (latest version) crashes when saving AwemeCore.bndb on completion of analysis. The latest commercial dev version also crashes when loading AwemeCore. Example binary download URL: https://dawnvip.oss-cn-shenzhen.aliyuncs.com/AwemeCore. OS version: Ubuntu 22.04; total memory: 256 GB; analysis mode: "full".
Crash info
This is likely due to the ObjC analysis; I was able to produce a crash related to deallocating an already-deallocated AnalysisInfo. This is not trivially reproducible: if it crashes at all, it crashes almost immediately after "Phase 1" analysis starts, and sometimes it does not crash at all, which makes sense for this double free.
12 libworkflow_objc.dylib 0x122a481c4 std::__1::unordered_map<unsigned long long, std::__1::vector<unsigned long long, std::__1::allocator<unsigned long long>>, std::__1::hash<unsigned long long>, std::__1::equal_to<unsigned long long>, std::__1::allocator<std::__1::pair<unsigned long long const, std::__1::vector<unsigned long long, std::__1::allocator<unsigned long long>>>>>::~unordered_map[abi:ue170006]() + 28 (unordered_map:1256) [inlined]
13 libworkflow_objc.dylib 0x122a481c4 AnalysisInfo::~AnalysisInfo() + 128 (GlobalState.h:24) [inlined]
14 libworkflow_objc.dylib 0x122a481c4 AnalysisInfo::~AnalysisInfo() + 128 (GlobalState.h:24) [inlined]
15 libworkflow_objc.dylib 0x122a481c4 std::__1::__shared_ptr_emplace<AnalysisInfo, std::__1::allocator<AnalysisInfo>>::__on_zero_shared() + 148 (shared_ptr.h:324)
16 libworkflow_objc.dylib 0x122a4d4bc std::__1::__shared_count::__release_shared[abi:ue170006]() + 32 (shared_ptr.h:173) [inlined]
17 libworkflow_objc.dylib 0x122a4d4bc std::__1::__shared_weak_count::__release_shared[abi:ue170006]() + 32 (shared_ptr.h:214) [inlined]
18 libworkflow_objc.dylib 0x122a4d4bc std::__1::shared_ptr<AnalysisInfo>::~shared_ptr[abi:ue170006]() + 40 (shared_ptr.h:773) [inlined]
19 libworkflow_objc.dylib 0x122a4d4bc std::__1::shared_ptr<AnalysisInfo>::~shared_ptr[abi:ue170006]() + 40 (shared_ptr.h:771) [inlined]
20 libworkflow_objc.dylib 0x122a4d4bc Workflow::inlineMethodCalls(BinaryNinja::Ref<BinaryNinja::AnalysisContext>) + 1200 (Workflow.cpp:217)
@dawn-breaking Hi, I went ahead and made an issue regarding your crash. I have been trying to reproduce it locally, and I was wondering if you have a stack trace.
We have also had a few reports of new Intel CPUs being unstable, specifically the i9-13900K/14900K, so knowing which CPU you are running would also be useful information for us. (Not to say that is your issue; it likely is not, considering you are running Ubuntu, and no reports indicate that instability affects Linux.)
Hi, the CPU is an AMD Threadripper PRO 7975WX. Crash info: "Segmentation fault (core dumped)"; "free(): invalid pointer / Aborted (core dumped)". With "basic" analysis mode, the latest stable or dev version works fine.
FYI, I had an M3 MBP running all night on this and it's still going. No crash yet, but analysis still isn't done. (Not using basic analysis).
Do you have the full stack trace of the crash showing pointer offsets?
Did you have any other Objective-C files open and analyzing when the crash occurred?
1) I also have an M3 MBP; the same file works fine with full analysis on it.
2) Stack trace:
Thread 106 "W T A core.func" received signal SIGSEGV, Segmentation fault.
[Switching to Thread 0x75b3c036d640 (LWP 1479292)]
0x000075b9aac8d993 in std::_Sp_counted_ptr_inplace<AnalysisInfo, std::allocator<AnalysisInfo>, (__gnu_cxx::_Lock_policy)2>::_M_dispose() () from ~/binaryninja/plugins/libworkflow_objc.so
No other Objective-C files were open; only this file crashed, on Ubuntu with full analysis.
Thanks for reporting this!
Resolved in dev builds >=4.1.5667
Analysis of AwemeCore on Windows 10 still crashes. The crash occurs in the first phase, when memory usage surges.
System: Windows 10; Binary Ninja: latest dev version.
How much memory do you have on the machine you're using @c0618? This is a very large file, so one potential source of crashes is an OOM. Alternatively, can you run it under a debugger or capture the stack trace of the crash so we can confirm whether it's the same issue?
@psifertex My basic system information: CPU i7-13700K, 64 GB of memory. I can't capture the crash stack information; the process suddenly closes. Rolling back to the old 4.0 release avoids this problem, but when using the ObjC analysis plugin on 4.0, a crash occurs.
4.0 doesn't automatically run the Objective-C workflow; are you manually enabling it there? 64 GB of RAM is quite possibly not enough for a file of this size. Having it suddenly terminate is consistent with an OOM condition.
@psifertex Yes. On the 4.0 release, I waited for analysis to complete, then clicked the ObjC plugin to analyze, and it crashed. I really like Binary Ninja; the analysis is very detailed, but the memory consumption is really too high. I hope the relevant data structures can be optimized to solve this problem. Right now, using Binary Ninja to analyze Mach-O files larger than 100 MB is a very bad experience.
Do you mean you used Open with Options and selected the Objective-C workflow on 4.0? Sorry, I'm not sure I understand your previous reply.
There are several other settings that can help with memory utilization as well, we've started a section of the documentation to cover this, though it's fairly small now:
https://docs.binary.ninja/guide/troubleshooting.html#working-with-large-or-complex-binaries
Even if the 4.1 changes have increased memory utilization, that's really a distinct issue from the previous one, even though it's the same file. In fact, the previous reporter having no issue with 4x the available memory leads me to believe this is indeed a separate issue.
Another suggestion: try creating a swap file to see if that alleviates the additional memory pressure.
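For the swap-file suggestion above, a typical setup on Linux (the original reporter's platform) looks like the following. This is a sketch: the 64G size and the `/swapfile` path are illustrative, the commands require root, and on Windows the equivalent is increasing the page-file size instead.

```shell
# Allocate a 64 GiB swap file (size is illustrative; pick one for your machine)
sudo fallocate -l 64G /swapfile
# Restrict permissions so only root can read the file
sudo chmod 600 /swapfile
# Format it as swap and enable it
sudo mkswap /swapfile
sudo swapon /swapfile
# Confirm the new swap space is active
swapon --show
```

On filesystems where `fallocate` is unsuitable for swap files (e.g. some copy-on-write setups), `dd if=/dev/zero of=/swapfile bs=1M count=65536` can be used to create the file instead. To make the swap file persist across reboots, add a corresponding entry to `/etc/fstab`.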