PaDiM-Anomaly-Detection-Localization-master

It would also be good to save conv_inv.

Open · dhkdnduq opened this issue 4 years ago · 1 comment

main.py

Original code:

```python
for i in range(H * W):
    # cov[:, :, i] = LedoitWolf().fit(embedding_vectors[:, :, i].numpy()).covariance_
    cov[:, :, i] = np.cov(embedding_vectors[:, :, i].numpy(), rowvar=False) + 0.01 * I
# save learned distribution
train_outputs = [mean, cov]
```

Change to:

```python
for i in range(H * W):
    # cov[:, :, i] = LedoitWolf().fit(embedding_vectors[:, :, i].numpy()).covariance_
    cov[:, :, i] = np.cov(embedding_vectors[:, :, i].numpy(), rowvar=False) + 0.01 * I
# save learned distribution, including the precomputed inverse covariance
conv_inv = np.linalg.inv(cov.T).T
train_outputs = [mean, cov, conv_inv]
```
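The `cov.T` trick works because `cov` is stacked as `(C, C, H * W)`: transposing gives `(H * W, C, C)`, `np.linalg.inv` inverts each `(C, C)` slice in a batch, and transposing back restores the original layout. A quick self-contained check with toy shapes (not the real PaDiM tensors):

```python
import numpy as np

# toy sizes just for this check, not the real PaDiM dimensions
C, HW = 4, 6
rng = np.random.default_rng(0)

cov = np.zeros((C, C, HW))
I = np.identity(C)
for i in range(HW):
    X = rng.standard_normal((20, C))                  # fake per-pixel embeddings
    cov[:, :, i] = np.cov(X, rowvar=False) + 0.01 * I

# cov.T has shape (HW, C, C); np.linalg.inv inverts every (C, C) slice at once
conv_inv = np.linalg.inv(cov.T).T

# same result as inverting each pixel's covariance one by one
for i in range(HW):
    assert np.allclose(conv_inv[:, :, i], np.linalg.inv(cov[:, :, i]))
```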

How to use it:

```python
dist_list = []
for i in range(H * W):
    mean = train_outputs[0][:, i]
    # conv_inv = np.linalg.inv(train_outputs[1][:, :, i])
    dist = [mahalanobis(sample[:, i], mean, train_outputs[2][:, :, i])
            for sample in embedding_vectors]
    dist_list.append(dist)
```

Just my opinion.
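A small end-to-end sketch with dummy data (shapes and values are made up, only to show that passing the precomputed inverse to scipy's `mahalanobis` gives the same distances as inverting at test time):

```python
import numpy as np
from scipy.spatial.distance import mahalanobis

# dummy shapes: B test samples, C channels, H * W pixel positions
B, C, H, W = 3, 4, 2, 3
rng = np.random.default_rng(0)
embedding_vectors = rng.standard_normal((B, C, H * W))

# "training": per-pixel mean, covariance, and precomputed inverse covariance
mean = rng.standard_normal((C, H * W))
cov = np.zeros((C, C, H * W))
I = np.identity(C)
for i in range(H * W):
    cov[:, :, i] = np.cov(rng.standard_normal((20, C)), rowvar=False) + 0.01 * I
conv_inv = np.linalg.inv(cov.T).T
train_outputs = [mean, cov, conv_inv]

# "inference": the precomputed inverse matches inverting per pixel at test time
for i in range(H * W):
    for sample in embedding_vectors:
        d_old = mahalanobis(sample[:, i], train_outputs[0][:, i],
                            np.linalg.inv(train_outputs[1][:, :, i]))
        d_new = mahalanobis(sample[:, i], train_outputs[0][:, i],
                            train_outputs[2][:, :, i])
        assert np.isclose(d_old, d_new)
```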

dhkdnduq · Mar 12 '21 01:03

You do not need to save cov as well. Save only:

```python
train_outputs = [mean, conv_inv]
```
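The inference loop from above would then take the inverse from position 1 instead of position 2. An untested sketch, reusing the variable names from the snippets above:

```python
from scipy.spatial.distance import mahalanobis

# assumes train_outputs = [mean, conv_inv] as suggested above,
# and embedding_vectors, H, W as in the earlier snippets
dist_list = []
for i in range(H * W):
    mean = train_outputs[0][:, i]
    dist = [mahalanobis(sample[:, i], mean, train_outputs[1][:, :, i])
            for sample in embedding_vectors]
    dist_list.append(dist)
```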

DeepKnowledge1 · Mar 12 '21 07:03