LacaK
Same issue in discussions: https://github.com/orgs/utmstack/discussions/599#discussioncomment-9260240
Thanks. I am asking because I want to do an element-wise update of both weights using `weights += learning_rate * diff_weights`. So I need to ensure that the memory layouts (dimensions, layout (format), element type)...
Thanks. I did experiments using:

- `fw_weights_layer_md = dnnl_primitive_desc_query_md(`**lstm_fw_pd**`, dnnl_query_exec_arg_md, DNNL_ARG_WEIGHTS_LAYER);`
- `bw_weights_layer_md = dnnl_primitive_desc_query_md(`**lstm_bw_pd**`, dnnl_query_exec_arg_md, DNNL_ARG_WEIGHTS_LAYER);`
- `bw_diff_weights_layer_md = dnnl_primitive_desc_query_md(lstm_bw_pd, dnnl_query_exec_arg_md, DNNL_ARG_DIFF_WEIGHTS_LAYER);`

And found a strange thing:

- `dnnl_memory_desc_equal(fw_weights_layer_md, bw_diff_weights_layer_md)`...
Yes, that makes sense in relation to weight updates. But now I face a problem: when I execute the **backward** primitive I get an **access violation** exception in dnnl.dll. So I am...