Rust bindings leak emulators
Steps to reproduce the bug:
- Create an emulator
- Add a hook to it
- Drop the emulator (just let it go out of scope)
Expected behaviour: all resources related to the emulator are freed.
Actual behaviour: the emulator is not freed.
The following code showcases the problem:
```rust
use std::thread;
use std::time::Duration;

use unicorn_engine::unicorn_const::{Arch, Mode, Permission};
use unicorn_engine::Unicorn;

pub fn code_hook(_: &mut Unicorn<()>, _: u64, _: u32) {}

fn main() {
    const MEM_SIZE: usize = 0x10000000; // 256 MiB
    const PAGE_SIZE: usize = 0x1000;
    const START_ADDRESS: u64 = 0x10000;

    for i in 0..0x100 {
        println!("{i}");
        let mut emu = Unicorn::new(Arch::X86, Mode::MODE_64)
            .expect("failed to initialize Unicorn instance");
        emu.mem_map(START_ADDRESS, MEM_SIZE, Permission::ALL)
            .expect("failed to map code page");
        // Touch all pages to actually map the memory
        for address in (START_ADDRESS..START_ADDRESS + MEM_SIZE as u64).step_by(PAGE_SIZE) {
            emu.mem_write(address, &[0xcc]).unwrap();
        }
        emu.add_code_hook(0, u64::MAX, code_hook).unwrap();
        thread::sleep(Duration::from_secs(1));
        // emu is dropped here
    }
}
```
At each point in time, only one emulator should exist, so the memory usage should stay roughly constant, around `MEM_SIZE`. Instead, you will see that after each second the memory usage increases by `MEM_SIZE`, indicating that the emulator is not freed.
The root cause of the problem is that the `UnicornInner` does not get dropped when `Unicorn` is dropped. This happens because `inner` is an `Rc`, whose contents are only freed once the last strong reference goes away, i.e. when the reference count drops to zero.
https://github.com/unicorn-engine/unicorn/blob/ca81d46ad53e2f324780bbf4e60f49483cb98fef/bindings/rust/src/lib.rs#L131-L155
However, the `add_code_hook` function clones `inner`, increasing the reference count. This introduces a circular reference: `inner` holds a reference to the hooks, but the hooks also hold a reference to `inner`; hence `inner` will never get dropped.
https://github.com/unicorn-engine/unicorn/blob/ca81d46ad53e2f324780bbf4e60f49483cb98fef/bindings/rust/src/lib.rs#L626
The same issue holds for all `add_*_hook` functions, as well as `mmio_map`.
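To illustrate the mechanism outside of unicorn, here is a minimal, self-contained sketch of the same kind of `Rc` cycle. The types are hypothetical stand-ins for `UnicornInner` and its hook table, not the actual binding code:

```rust
use std::cell::RefCell;
use std::rc::Rc;

// Hypothetical stand-in for UnicornInner and its hook table.
struct Inner {
    hooks: RefCell<Vec<Box<dyn Fn()>>>,
}

impl Drop for Inner {
    fn drop(&mut self) {
        println!("Inner dropped");
    }
}

fn main() {
    let inner = Rc::new(Inner {
        hooks: RefCell::new(Vec::new()),
    });

    // Adding a hook clones the Rc, so the hook keeps `inner` alive...
    let inner_clone = Rc::clone(&inner);
    inner.hooks.borrow_mut().push(Box::new(move || {
        let _ = &inner_clone;
    }));

    // ...while `inner` keeps the hooks alive: a cycle. Dropping our
    // handle only takes the strong count from 2 to 1, so "Inner dropped"
    // is never printed and the allocation leaks.
    drop(inner);
}
```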
Acknowledged, but I'm not a Rust expert and not confident enough to post a fix. Could you draft a PR?
@domenukk can you fix it?
This issue is stale because it has been open 60 days with no activity. Remove stale label or comment or this will be closed in 15 days.
Yeah, I also noticed this; with the following code it leaks roughly 1 GB at a time.
```rust
use unicorn_engine::unicorn_const as ucc;
use unicorn_engine::Unicorn;

fn leaks() {
    let uni = Unicorn::new(ucc::Arch::X86, ucc::Mode::MODE_64);
    if uni.is_err() {
        println!("Unable to create unicorn instance");
        return;
    }
    let mut emu = uni.unwrap();
    emu.add_mem_hook(
        ucc::HookType::MEM_UNMAPPED,
        0,
        u64::MAX,
        |_uc, _access, _addr, _size, _value| true,
    )
    .unwrap();
}

fn main() {
    for i in 0..30 {
        leaks();
        println!("Iteration {}, check ram usage...", i);
        // sleep for 1 second
        std::thread::sleep(std::time::Duration::from_secs(1));
    }
}
```
> Yeah, I also noticed this; with the following code it leaks roughly 1 GB at a time. […]
Actually `remove_hook()` solves the problem for me, could you also try @nemarci?
![demo](https://user-images.githubusercontent.com/36543551/188592558-161c1776-192b-4a72-96cd-00c2012a49a4.gif)
You're right, explicitly removing all hooks before dropping the emulator solves the issue. However, I think this is a workaround rather than a proper solution.
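For reference, the workaround looks roughly like this, assuming (as the thread above suggests) that `add_code_hook` returns a hook handle that `remove_hook` accepts:

```rust
use unicorn_engine::unicorn_const::{Arch, Mode};
use unicorn_engine::Unicorn;

fn code_hook(_: &mut Unicorn<()>, _: u64, _: u32) {}

fn main() {
    let mut emu = Unicorn::new(Arch::X86, Mode::MODE_64)
        .expect("failed to initialize Unicorn instance");

    // Keep the handle that add_code_hook returns...
    let hook = emu.add_code_hook(0, u64::MAX, code_hook).unwrap();

    // ...and remove the hook explicitly before `emu` goes out of scope.
    // This drops the hook's clone of `inner`, breaking the Rc cycle so
    // UnicornInner can actually be freed.
    emu.remove_hook(hook).unwrap();
}
```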
@expend20 @nemarci Could you check if https://github.com/unicorn-engine/unicorn/pull/1632 solves this problem? If so, you can approve this PR.
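The usual way to break such a cycle in Rust is to have the hooks hold a `Weak` reference back to the inner state, so they no longer contribute to the strong count. A minimal sketch of that idea, again with hypothetical stand-in types (I'm not claiming the PR does exactly this):

```rust
use std::cell::RefCell;
use std::rc::{Rc, Weak};

// Same hypothetical stand-ins as before.
struct Inner {
    hooks: RefCell<Vec<Box<dyn Fn()>>>,
}

impl Drop for Inner {
    fn drop(&mut self) {
        println!("Inner dropped");
    }
}

fn main() {
    let inner = Rc::new(Inner {
        hooks: RefCell::new(Vec::new()),
    });

    // The hook holds only a Weak reference, which does not count
    // towards the strong count, so no cycle is formed.
    let weak: Weak<Inner> = Rc::downgrade(&inner);
    inner.hooks.borrow_mut().push(Box::new(move || {
        if let Some(strong) = weak.upgrade() {
            // Use the inner state only while it is still alive.
            let _ = strong;
        }
    }));

    // Strong count goes 1 -> 0: "Inner dropped" is printed and the
    // hooks are freed along with it.
    drop(inner);
}
```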
Hey @bet4it, thanks for the fix. I can confirm it's working (no longer leaking memory on my sample).
This issue is stale because it has been open 60 days with no activity. Remove stale label or comment or this will be closed in 15 days.
Not fixed yet
This issue is stale because it has been open 60 days with no activity. Remove stale label or comment or this will be closed in 15 days.
Just hit this problem, found this issue, switched to the dev branch, and it seems to have gone away. Thanks!