Why has the "USE_CACHE_WHEN_FULL" option disappeared?
When I searched the internet, I found legacy code for intel/caffe. That code has a use_cache_when_full option which can reduce network overhead. Why does this option not exist in the current Intel Caffe?
Below is the README of an old version of Caffe. Maybe that version isn't an official version?
https://libraries.io/github/intel/caffe
Regards, Yeo sangho
That's an old version; it was last updated in 2016. Please refer to the latest README/wiki for multi-node training.
@ftian1 I mean that it is a very useful option when training Caffe in a multi-node setting. When we load images with shuffling, this option can help reduce the disk I/O bottleneck, but it does not exist in the current version. Why has this option disappeared?
Why is it a useful option for your case? It just preallocates blobs, the same as the prefetch of the new data layer. Did you compare the performance of the old multi-node code against the latest implementation?
Actually, I haven't run either Intel implementation of Caffe. But I assumed that when using a shared file system, it would be a good option for training.
The link below says: https://github.com/amrege/caffecombine
"It can also be used to cache data from the server in order to reduce the network traffic. Use only tcp protocol with data server. In the case of choosing caching policy USE_CACHE_WHEN_FULL, it will first download cache_size batches and then will randomized the cached data for actual training."
As they say, I think local shuffling in each node would reduce the network/disk I/O overhead.
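The quoted behavior (first download `cache_size` batches over the network, then train from randomized local data) can be sketched as follows. This is a minimal illustration of the assumed semantics, not the actual Intel Caffe implementation; `fetch_batch` and `CachingBatchReader` are hypothetical names:

```python
import random

class CachingBatchReader:
    """Sketch of a USE_CACHE_WHEN_FULL-style reader (assumed semantics):
    fill a local cache with `cache_size` batches fetched from the remote
    data server, then serve randomized batches from the cache so no
    further network/disk reads are needed."""

    def __init__(self, fetch_batch, cache_size):
        self.fetch_batch = fetch_batch  # callable returning the next remote batch
        self.cache_size = cache_size
        self.cache = []

    def next_batch(self):
        if len(self.cache) < self.cache_size:
            batch = self.fetch_batch()   # remote read only while the cache fills
            self.cache.append(batch)
            return batch
        return random.choice(self.cache) # local shuffle: no further remote reads

# Usage: simulate a remote source of 3 batches, cache them, then read locally.
source = iter([[1, 2], [3, 4], [5, 6]])
reader = CachingBatchReader(lambda: next(source), cache_size=3)
first_pass = [reader.next_batch() for _ in range(3)]  # fills the cache
later = [reader.next_batch() for _ in range(4)]       # served from cache only
```

After the cache is full, every `next_batch()` call is a local lookup, which is why this pattern reduces traffic to a shared file system or data server during long training runs.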
The latest data reader already accounts for this; no such option is needed.