Implement on-disk cache during download
While 1 GiB is enough for almost any title, World War Z defies that trend by requiring up to 5.3 GiB of cache to successfully reconstruct its files. This is due to heavy duplication within the game itself: it ships with both a client and a server that share large portions of their files. The resulting heavy deduplication on Epic's part results in a lot of cache being required for successful reconstruction. There are basically three ways this could be addressed:
- Optimize file processing order to discard cache sooner (i.e. grouping files that share lots of chunks)
- Throw away cache when it gets full and redownload later (inefficient and would require redownload logic)
- Add on-disk cache to temporarily store excess cache on disk during the download (annoying with mechanical drives)
Since the first two approaches are both complicated and neither is a cure-all or particularly efficient, we will have to add on-disk caching capability.
This issue is now partially fixed by a simple implementation of No. 1, but an on-disk cache is ultimately still required.
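A rough sketch of what the on-disk cache (No. 3) could look like: chunks are kept in memory up to a byte budget, and any overflow is written to a temporary directory. All names here (`SpilloverCache` etc.) are made up for illustration; this is not the final design, and a real implementation would also need eviction and cleanup logic.

```python
import os
import tempfile

class SpilloverCache:
    """Illustrative chunk cache: keeps chunks in RAM up to a byte budget
    and spills anything beyond that to files on disk (approach No. 3)."""

    def __init__(self, memory_limit=1024 * 1024 * 1024, cache_dir=None):
        self.memory_limit = memory_limit          # in-memory budget in bytes
        self.memory_used = 0
        self.mem = {}                             # chunk guid -> bytes in RAM
        self.disk = {}                            # chunk guid -> file path
        self.cache_dir = cache_dir or tempfile.mkdtemp(prefix='chunk_cache_')

    def put(self, guid, data):
        if self.memory_used + len(data) <= self.memory_limit:
            self.mem[guid] = data
            self.memory_used += len(data)
        else:
            # memory budget exhausted: write the new chunk to disk instead
            path = os.path.join(self.cache_dir, str(guid))
            with open(path, 'wb') as f:
                f.write(data)
            self.disk[guid] = path

    def get(self, guid):
        if guid in self.mem:
            return self.mem[guid]
        with open(self.disk[guid], 'rb') as f:
            return f.read()

    def remove(self, guid):
        # called once a chunk is no longer referenced by any remaining file
        if guid in self.mem:
            self.memory_used -= len(self.mem.pop(guid))
        elif guid in self.disk:
            os.unlink(self.disk.pop(guid))
```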
Hi, similar issue for "Ark: Survival Evolved". The cache requirement is much lower, but it still causes an error:
MemoryError: Current shared memory cache is smaller than required! 1024.0 MiB < 1319.0 MiB. Try running legendary with the --enable-reordering flag to reduce memory usage.
Adding the flag doesn't change anything.
Any other suggestions?
Increase the size of the shared memory with --max-shared-memory, e.g. --max-shared-memory 1536
Perfect! Thanks for this and the fast response!
Downloading to shared memory is a bit scary. I've got 16 GB of RAM and I'm just watching it fill up while waiting for Pillars of Eternity to download, which apparently needs 23 GB of shared memory. It magically reduced itself to 12 GB, but that still leaves only 4 GB to run a browser to write this GitHub comment. This isn't how shared memory should be used, surely?
[DLManager] INFO: - Cache usage: 7894.0 MiB, active tasks: 16
This is literally using up all the cache I have.
This is using the flag --max-shared-memory 23385, which is logically what I should have done given the above comment. It won't install without the flag. I did it knowing it would be a mess, and it was.
Tried again, and it needed less shared memory than before: legendary install bcc75c246fe04e45b0c1f1c3fd52503a --enable-reordering --max-shared-memory 15587. Still too much.
Yeah, the developers fucked up. They essentially included multiple copies of the game, so there's a lot of duplication that legendary's algorithm can't deal with (the reorder optimization would fix that, but it is disabled for games with too many files, as it gets rather slow).
What does work is downloading the game with --prefix win to ignore the duplicated files, but currently that disables the installation, so the game wouldn't be launchable via legendary.
Edit: I ran the optimization process with the limit disabled, and it got the memory requirement down to less than 2 GiB, which is probably similar in size to the biggest duplicated file. Still not great that they messed up the upload though. I'll have to provide some workaround, I guess.
For the time being I have reworked the optimizer; it's now significantly faster and can handle larger file numbers.
Just as an example: the previous version took over 500 seconds (nearly 9 minutes) for Pillars of Eternity, the new version takes around 7 seconds (on my machine).
The optimizer is also now enabled by default for Pillars. Unfortunately that still leaves it above the default limit, so it will require a manual increase to 2 GiB to work, at least until Obsidian or Paradox fix their uploaded version. I did not want to implement a workaround that manually adds a prefix filter, as that would have required a bit more work for something that is really only required for a single game.
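For reference, the reordering optimization is conceptually a greedy grouping pass along these lines. The data layout (file name mapped to a set of chunk GUIDs) and the function name are simplified assumptions for illustration; the actual optimizer in legendary is implemented differently, and far more efficiently than this quadratic toy version.

```python
def reorder_files(files):
    """Greedy sketch: after each scheduled file, pick the remaining file
    that shares the most chunks with the chunks still alive in the cache,
    so shared chunks can be dropped as early as possible.

    `files` maps file name -> set of chunk GUIDs (an assumed layout).
    """
    remaining = dict(files)
    order = []
    active = set()  # chunks referenced by files scheduled so far

    while remaining:
        # largest overlap with the active chunk set wins; ties go to the
        # smaller file, which seeds a new group cheaply
        name = max(remaining, key=lambda n: (len(remaining[n] & active),
                                             -len(remaining[n])))
        active |= remaining.pop(name)
        order.append(name)
        # chunks that no remaining file needs can be evicted from the cache
        still_needed = set().union(*remaining.values()) if remaining else set()
        active &= still_needed

    return order
```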
That is a LOT better. You could maybe get around the manual increase of shared memory by stopping and resuming the download once the cache reaches the specified limit. I noticed it uses a lot less shared memory the second time you start the process. It's a hacky way of doing it, and you could leave a warning that the limit should be increased. For now though, a change from a required 15-20 GB down to 2 GB is good.
That is effectively just doing 2) from the issue description: you're downloading the duplicated data twice and throwing it away between duplicated files instead of keeping it in the cache.
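To put a number on that: a toy simulation of the flush-and-redownload behavior, counting how many bytes end up being fetched twice. Everything here (the names, the flat task list) is illustrative, not how legendary models downloads internally.

```python
def simulate_flush_strategy(tasks, cache_limit):
    """tasks: list of (chunk_guid, size) in processing order (assumed layout).
    Returns (total_bytes_downloaded, redundant_bytes)."""
    cached = set()            # guids currently held in the cache
    cache_used = 0            # bytes currently cached
    seen = set()              # guids downloaded at least once
    downloaded = redundant = 0

    for guid, size in tasks:
        if guid in cached:
            continue                  # still cached, no download needed
        downloaded += size
        if guid in seen:
            redundant += size         # re-downloading a chunk we threw away
        seen.add(guid)
        if cache_used + size > cache_limit:
            cached.clear()            # cache full: flush, like a stop/resume
            cache_used = 0
        cached.add(guid)
        cache_used += size

    return downloaded, redundant
```

With heavily duplicated files the redundant share grows with every flush, which is exactly the bandwidth cost traded away for the lower memory footprint.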