Memory Leak in Large Dataset Processing
Description:
When processing large iOS extractions (especially with many photos/videos), iLEAPP can consume excessive memory and potentially crash. This is particularly evident in media-heavy artifacts like photosMetadata.py and BeReal.py.
Technical Details:
- Memory usage grows linearly with dataset size without proper cleanup
- Large SQLite databases are loaded entirely into memory
- Media file processing doesn't use streaming/chunked processing
- No memory usage monitoring or limits
Evidence from Code:
```python
# In photosMetadata.py and similar modules
all_rows = cursor.fetchall()  # loads the entire result set into memory
```
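The upfront allocation can be measured directly; a minimal sketch using the standard-library tracemalloc module (the database path and query are hypothetical stand-ins for what the module actually runs):

```python
import sqlite3
import tracemalloc

# Hypothetical stand-ins for the database path and query the module uses
db_path = 'Photos.sqlite'
query = 'SELECT * FROM ZASSET'

tracemalloc.start()
conn = sqlite3.connect(db_path)
cursor = conn.cursor()
cursor.execute(query)

all_rows = cursor.fetchall()  # entire result set materialized as Python objects

current, peak = tracemalloc.get_traced_memory()
print(f'peak allocation: {peak / (1024 * 1024):.1f} MiB for {len(all_rows)} rows')

conn.close()
tracemalloc.stop()
```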
Hi @Zedoman! It's been a habit to write artifacts that interact with SQLite databases using cursor.fetchall(), but we can also iterate over the cursor directly. For example, in photosMetadata.py, lines 277 and onwards look like this today:
```python
all_rows = cursor.fetchall()
usageentries = len(all_rows)
data_list = []
counter = 0
if usageentries > 0:
    for row in all_rows:
        # ... process row ...
```
And if we changed the idiom to directly iterate over the cursor, we'd write it like this:
```python
data_list = []
counter = 0
for row in cursor:
    # ... process row ...
```
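For context, sqlite3 cursors are themselves iterators: each step of the loop pulls the next row from the underlying prepared statement, so the full result set is never resident at once (only data_list grows, and only with what you choose to keep). A self-contained sketch against a throwaway in-memory database:

```python
import sqlite3

# Throwaway in-memory database standing in for a real artifact database
conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE media (id INTEGER, name TEXT)')
conn.executemany('INSERT INTO media VALUES (?, ?)',
                 ((i, f'IMG_{i:04d}.JPG') for i in range(100_000)))

cursor = conn.cursor()
cursor.execute('SELECT id, name FROM media')

data_list = []
counter = 0
for row in cursor:  # rows are fetched one at a time, never all at once
    data_list.append((row[0], row[1]))
    counter += 1

print(f'processed {counter} rows')
conn.close()
```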
If you can change the code, give it a try and see if it fixes the out-of-memory issue you're running into. I wouldn't call it a memory leak, because the garbage collector cleans everything up once execution finishes, but it is a (massive?) upfront allocation that can overwhelm your memory when there's a lot of data at once.
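If you'd rather have an explicit, tunable cap on how many rows are resident at a time, fetchmany() gives the same effect; a minimal sketch (the helper name and batch size are mine, not from the codebase):

```python
def iter_rows(cursor, batch_size=10_000):
    """Yield rows from an already-executed cursor in fixed-size batches."""
    while True:
        batch = cursor.fetchmany(batch_size)
        if not batch:
            break
        yield from batch

# Usage: for row in iter_rows(cursor): ...
```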
Anyway, if you've found other artifacts that are giving you trouble with this, can you give this fix a try and tell us how it goes? We're open to feedback, and even a pull request, if it effectively solves your issue.
Yes, sure sir, I will check into this and update you, and I will also raise a PR if it's fixed.
@bconstanzo I fixed it: https://github.com/abrignoni/iLEAPP/pull/1318