Aruna K
Hi, for inference, how can we get op-level timings in TensorFlow? Thanks
@penpornk, I wanted to understand whether the latest TensorFlow supports the eltwise oneDNN flow, and also whether there is any document I can refer to for the oneDNN ops that are supported...
Hi, I am trying to understand how torch.jit helps call the oneDNN Graph APIs. I have run this example code: ```python3 import torch import torchvision torch.jit.enable_onednn_fusion(True) model =...
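A runnable sketch of the torch.jit oneDNN-fusion flow the snippet above starts. The tiny model here is a stand-in assumption (the original presumably used a torchvision model); the enable/trace/freeze/warm-up sequence is the part that matters:

```python
import torch
import torch.nn as nn

# Enable oneDNN Graph fusion for JIT-traced models (CPU inference path).
torch.jit.enable_onednn_fusion(True)

# Stand-in model; the original snippet likely used a torchvision model instead.
model = nn.Sequential(
    nn.Conv2d(3, 8, 3), nn.ReLU(),
    nn.Conv2d(8, 8, 3), nn.ReLU(),
).eval()
example = torch.randn(1, 3, 32, 32)

with torch.no_grad():
    traced = torch.jit.trace(model, example)
    traced = torch.jit.freeze(traced)  # freezing is required for oneDNN Graph fusion
    # Warm-up runs let the JIT fuser rewrite the graph before measuring anything.
    traced(example)
    traced(example)
    out = traced(example)

print(out.shape)  # torch.Size([1, 8, 28, 28])
```

Inspecting `traced.graph_for(example)` after the warm-up runs is one way to check whether oneDNN Graph fusion groups actually appeared.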
In torch 2.3.0, with reference to https://github.com/pytorch/pytorch/commit/0ae952db76e5da1ebfbcaff39897a5b42606e67b, I want to use the mkldnn bf32 matmul on Sapphire Rapids, but the default path is MKL. How can I enable this on SPR? cc: @zhuhaozhe...
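The linked commit ties the mkldnn bf32 matmul path to PyTorch's float32 matmul precision setting. A minimal sketch, assuming that mapping holds in your build (whether the bf32 kernels are actually dispatched also depends on the CPU and how PyTorch was built):

```python
import torch

# "medium" allows fp32 matmuls to use reduced internal precision; per the
# linked commit this is the knob that routes eligible matmuls to the mkldnn
# bf32 kernels on supported CPUs (assumption based on the commit description).
torch.set_float32_matmul_precision("medium")
print(torch.get_float32_matmul_precision())  # medium

a = torch.randn(256, 256)
b = torch.randn(256, 256)
c = a @ b  # may now dispatch to the mkldnn bf32 path on SPR
print(c.shape)  # torch.Size([256, 256])
```

Running under `DNNL_VERBOSE=1` is one way to confirm which oneDNN kernel the matmul actually hits.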
Hello, I wanted to use TensorBoard for inference: tf.profiler.experimental.start('logs'); output = model.predict(encoded_input, verbose=False)[0]; tf.profiler.experimental.stop(). I have tried this way, but I am not getting op-level details of the...
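A sketch of capturing op-level traces with the TF profiler. The model and log directory here are stand-ins; raising `host_tracer_level` is what surfaces per-op host events, and the result is then viewed in TensorBoard's Profile tab:

```python
import tensorflow as tf

# Tiny stand-in model; the original question profiles model.predict().
model = tf.keras.Sequential([tf.keras.layers.Dense(4)])
x = tf.random.normal([16, 8])
_ = model(x)  # build the model outside the profiling window

# host_tracer_level=3 records the most detailed per-op host events.
options = tf.profiler.experimental.ProfilerOptions(host_tracer_level=3)
tf.profiler.experimental.start("logs", options=options)
out = model(x)  # run inference inside the profiling window
tf.profiler.experimental.stop()

# Op-level timings appear under TensorBoard's "Profile" tab:
#   tensorboard --logdir logs
print(out.shape)  # (16, 4)
```

The trace viewer and the "TensorFlow Stats" page inside the Profile tab are the views that break timings down per op.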
Hi, I wanted to understand how we can enable the graph compiler from a framework like TensorFlow or PyTorch. cc: @ZhennanQin Thank you
I have tried integrating libxsmm with PyTorch via LD_PRELOAD, but I am unable to see the lib in perf. Is there any other method by which I can integrate this library...
I have built OpenBLAS on Graviton3E with `make USE_OPENMP=1 NUM_THREADS=256 TARGET=NEOVERSEV1`. MKL is built on an Ice Lake machine. I have used OpenBLAS sgemm as `cblas_sgemm(CblasRowMajor, CblasNoTrans, CblasNoTrans, M, N, K, 1.0,...
I want to run the UNet model on CPU. I have tried `net = torch.hub.load('milesial/Pytorch-UNet', 'unet_carvana', pretrained=True, scale=0.5)`, but I am getting an error asking me to map this to CPU when I...
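When a checkpoint was saved on a GPU, loading it on a CPU-only machine needs `map_location`. A minimal sketch of the technique with a synthetic checkpoint (not the Carvana weights; whether `torch.hub.load` forwards `map_location` to the underlying load depends on that repo's hubconf and would need checking):

```python
import torch
import torch.nn as nn

# Synthetic stand-in for a downloaded checkpoint.
model = nn.Linear(4, 2)
torch.save(model.state_dict(), "ckpt.pth")

# map_location='cpu' remaps any CUDA-saved tensors onto the CPU at load time.
state = torch.load("ckpt.pth", map_location=torch.device("cpu"))
model.load_state_dict(state)
model.eval()

out = model(torch.randn(1, 4))
print(out.shape)  # torch.Size([1, 2])
```

If the hub entry point does not accept the keyword, an alternative is loading the model with `pretrained=False` and applying the state dict yourself with `map_location='cpu'` as above.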
Is there any support for int8/bf16 sgemv in MKL? In the mkl_blas.h file I was not able to find any APIs for bf16/int8 sgemv. Is there any reason for not having...