
keep thumbnails on disk for only the last ≈N visited images

tpapp opened this issue 1 year ago

Is your feature request related to a problem? Please describe.

High-resolution thumbnails are very useful, e.g. for culling images. As RAM is usually the limiting factor, darktable's ability to save thumbnails to disk is a great feature that speeds up browsing. One downside of the current implementation is that thumbnails are never deleted from the cache automatically, so the user must either clear the thumbnail cache from time to time or accept that a lot of disk space is consumed.

Describe the solution you'd like

It would be great to have an option that makes darktable keep thumbnails only for the N most recently visited images, with a default of, say, N = 1000. This would be sufficient for most use cases, yet would save a lot of disk space for large collections.

The implementation can be approximate, e.g. only remove items from the cache when the number of cached images grows above some threshold N + T; T would not need to be exposed in the preferences GUI.
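The approximate scheme above could be sketched roughly as follows (a pure illustration: the `prune_cache` name, the in-memory access-time dict, and the bookkeeping are all hypothetical, not darktable internals):

```python
import heapq

def prune_cache(access_times: dict, n: int, t: int) -> set:
    """Approximate eviction: do nothing until the cache holds more than
    N + T entries, then drop the least recently visited until N remain.
    `access_times` maps a cache key (e.g. an image id) to its last-visit
    timestamp -- hypothetical bookkeeping, since darktable keeps no such log."""
    if len(access_times) <= n + t:
        return set()          # still under the slack threshold, do nothing
    # keep the N most recently visited entries, evict the rest
    keep = set(heapq.nlargest(n, access_times, key=access_times.get))
    return set(access_times) - keep
```

The slack T means the (relatively expensive) pruning pass only runs occasionally instead of on every cache insertion.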

Alternatives

Manual cleaning is currently available, but it is not very convenient. This feature would automate it.

Additional context

See the discussion where it was suggested that I open an issue.

tpapp avatar Aug 22 '24 17:08 tpapp

What should be considered as a visited image?

  • opened in darkroom?
  • included in a collection opened in lighttable?
  • selected image in lighttable?
  • accessed during culling?
  • had metadata applied to it?
  • a whole bunch more I haven't thought of.....

Would time-based cleaning work, i.e. removing the cache for

  • any collection with a creation date older than N days?
  • any thumbnail older than N days?

darktable doesn't track when images were accessed. It tracks when they were imported, changed, exported, and printed.
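Since no visit log exists, a cleaner would probably have to fall back on the thumbnail files' own modification times. A minimal sketch, assuming the on-disk cache is a directory tree of .jpg files (roughly how darktable's mipmaps-*.d cache is laid out, though the function name and layout details here are assumptions):

```python
import time
from pathlib import Path

def remove_stale_thumbnails(cache_dir: Path, max_age_days: int) -> list:
    """Delete cached thumbnail files whose modification time is older than
    max_age_days. The file's own mtime stands in for 'last visited',
    since no real access log exists."""
    cutoff = time.time() - max_age_days * 86_400
    removed = []
    for f in sorted(cache_dir.rglob("*.jpg")):
        if f.stat().st_mtime < cutoff:
            f.unlink()
            removed.append(f)
    return removed
```

Note the approximation: mtime reflects when the thumbnail was *generated*, not when it was last *viewed*, so frequently revisited but unedited images would still lose their cache.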

wpferguson avatar Aug 22 '24 18:08 wpferguson

I was thinking of a dual limit: X days, or deleting the cache of older imports to keep the size below XX GB. The latter sounds more ideal, but may be more complex to track/implement.
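The size-based half of that dual limit could look something like this sketch, which deletes oldest-first (by file mtime) until the total drops under the cap; the `enforce_size_cap` name and the .jpg-only layout are assumptions:

```python
from pathlib import Path

def enforce_size_cap(cache_dir: Path, max_bytes: int) -> list:
    """Delete the oldest cache files (by mtime) until the total size of
    the cache directory drops to max_bytes or below."""
    files = sorted(cache_dir.rglob("*.jpg"), key=lambda f: f.stat().st_mtime)
    total = sum(f.stat().st_size for f in files)
    removed = []
    for f in files:                 # oldest first
        if total <= max_bytes:
            break
        total -= f.stat().st_size
        f.unlink()
        removed.append(f)
    return removed
```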

gi-man avatar Aug 22 '24 20:08 gi-man

This issue has been marked as stale due to inactivity for the last 60 days. It will be automatically closed in 300 days if no update occurs. Please check if the master branch has fixed it and report again or close the issue.

github-actions[bot] avatar Oct 22 '24 00:10 github-actions[bot]

This issue was closed because it has been inactive for 300 days since being marked as stale. Please check if the newest release or nightly build has it fixed. Please create a new issue if the issue is not fixed.

github-actions[bot] avatar Aug 18 '25 00:08 github-actions[bot]

@tpapp is this still an issue?

After reading the discussion that was referenced, I have a better understanding of the whole problem.

Problem 1 - cache generation

The tools available for generating the cache create all sizes for all images. That takes up a LOT of disk space and gives you far more than you need, which led to Problem 2.

Problem 2 - get rid of cache because I'm out of disk space

i.e. this PR


My solution for me:

My requirements

  • only generate the cache I need. I have a 4K screen, so that's size 3 for thumbnails and size 6 for full preview. I don't use the filmstrip, so I don't need small thumbnails. I seldom middle-click to zoom to 100% while culling, so I can wait a second or two for that image to be created versus the time it would take to generate a full-size cache.
  • generate the cache as soon as I import the images. Also allow me to find an old film roll and generate its cache on demand.
  • when I process an image, update the lighttable thumbnail when I change to the next image. I tend to edit images one after another, using the space bar to move between them. I hate returning to the lighttable and then having to wait while the thumbnail cache is updated.

Disk space isn't much of an issue because I'm not generating a full cache set for each image. But I could simply delete the full-preview cache directory (6) after culling and keep just the lighttable thumbnails (3).
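If the cache really is laid out as one subdirectory per mipmap size, dropping just the full-preview level while keeping the lighttable thumbnails is a few lines (a sketch under that layout assumption; `drop_mip_level` is a hypothetical helper):

```python
import shutil
from pathlib import Path

def drop_mip_level(cache_root: Path, level: int) -> None:
    """Remove the on-disk cache for one mipmap size (e.g. 6 = full preview),
    leaving the other size directories untouched."""
    target = cache_root / str(level)
    if target.is_dir():
        shutil.rmtree(target)
```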

My solution

I wrote a script called responsive_cache, which generates the cache I need and only the cache I need.

I could also write a cache cleaner script, probably cleaning based on time, that could also be triggered manually.

Do you think this would work for you?

wpferguson avatar Aug 18 '25 18:08 wpferguson

@wpferguson: thanks for looking into this. Yes, I would still be interested in a solution.

I would prefer a large-resolution cache image generated automatically upon import (preferably after a default style has been applied, otherwise it is wasted work) and each time the image is edited (e.g. when the user leaves the darkroom or switches to another image).

Cache culling based on generation time would be fine, e.g. remove all cache files older than 3 months.

tpapp avatar Aug 19 '25 07:08 tpapp

This issue has been marked as stale due to inactivity for the last 60 days. It will be automatically closed in 300 days if no update occurs. Please check if the master branch has fixed it and report again or close the issue.

github-actions[bot] avatar Oct 19 '25 00:10 github-actions[bot]