intel-extension-for-pytorch
Caching JIT-compiled kernels to local disk to speed up startup
Describe the issue
If IPEX is not built with AOT enabled for the current device, the runtime JIT-compiles kernels for the current GPU (XPU). However, even when I rerun the same code with the same parameters after restarting Python, everything is JIT-compiled again, which takes a lot of time. Is there a way to store the compiled objects in a local disk cache?
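For context, what I have in mind is something like the persistent kernel cache of the DPC++/Level Zero runtime. A minimal sketch, assuming the `SYCL_CACHE_PERSISTENT` / `SYCL_CACHE_DIR` environment variables (oneAPI runtime settings, not an IPEX API) also cover the kernels IPEX JIT-compiles for XPU, and that `/tmp/sycl_cache` is just a placeholder path:

```python
import os

# Assumption: these are DPC++/Level Zero runtime settings, not an IPEX API;
# they must be set before torch/IPEX initialize the XPU runtime.
os.environ.setdefault("SYCL_CACHE_PERSISTENT", "1")         # enable on-disk kernel cache
os.environ.setdefault("SYCL_CACHE_DIR", "/tmp/sycl_cache")  # hypothetical cache location

import torch
import intel_extension_for_pytorch as ipex  # noqa: F401

x = torch.randn(1024, 1024, device="xpu")
y = x @ x                  # first run still JIT-compiles; later runs would hit the disk cache
torch.xpu.synchronize()
```

If something like this is already supported (or planned) for IPEX-generated kernels, a pointer to the documentation would be enough.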
BTW, it would also be very helpful if I could determine whether AOT binaries, a JIT cache, or only JIT compilation is available for the current device without actually running a function.
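Right now the only approach I can think of is to run a small kernel and time it, which is exactly what I'd like to avoid. A rough sketch of that workaround (the 1-second threshold is an arbitrary guess on my part):

```python
import time

import torch
import intel_extension_for_pytorch as ipex  # noqa: F401

def probably_needs_jit(threshold_s: float = 1.0) -> bool:
    """Guess whether kernels still need JIT compilation by timing a warm-up op."""
    x = torch.ones(8, 8, device="xpu")
    start = time.time()
    (x @ x).cpu()          # forces compilation and execution of at least one kernel
    return time.time() - start > threshold_s

print("JIT compilation likely needed:", probably_needs_jit())
```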