mahotas
Zernike moments - getting weird values above degree 20
I'm trying to extract Zernike moments from two images and compare them with OpenCV's norm function to measure similarity. It works well for degrees below 20, but at that level it doesn't capture detailed image information, so for some images I get the wrong similarity.
I tried increasing the degree to 60 and found that comparing the same image against a different one gives a negative number like -88835.23, when the score should be between 0 and 1.
Can you please look into it? Code below:
desc = mahotas.features.zernike_moments(imres, 200, degree=60, cm=(200, 200))
desc1 = mahotas.features.zernike_moments(imres1, 200, degree=60, cm=(200, 200))
score = cv2.norm(desc, desc1, cv2.NORM_L2)
Thanks in advance
Thanks. Would it be possible for you to upload the corresponding images?
It may be that numerically, it just blows up.
I am so sorry for the late answer. I was playing with SIFT and Bag-of-Visual-Words. I will check and confirm whether it is the numbers. Also, can the computations be done in parallel to save time? CPU cores and memory are cheap, but time is not.
Hi @luispedro ,
Thanks for this very useful package!
I have a similar question on the magnitude of ZMs at higher degrees. At high degrees, the moments seem to blow up.
Below is a simple example for a binary image of a circle, with plots showing ZM against index. In this example, after degree 50, the ZMs at certain indices grow orders of magnitude larger.
Another example, for a star. The moments blow up around degree 60, so this isn't consistent across inputs.
Is this behaviour due to some numerical issue?
Thanks!
Code and package versions:
- python 3.7.9
- mahotas 1.4.13
- numpy 1.21.6
- matplotlib 3.5.3
- skimage 0.19.3
import numpy as np
import matplotlib.pyplot as plt
import mahotas as mh
from skimage.draw import polygon2mask
# Create an image of a circle
theta = np.linspace(0, 2*np.pi, 50)
x = 100 * np.sin(theta) + 128
y = 100 * np.cos(theta) + 128
circ = np.array([x,y]).T
im = polygon2mask((256, 256), circ)*1.
# Generate Zernike moments for degrees 10, 20, ..., 80
deg = list(range(10,81,10))
plt.figure(figsize=(40,5))
plt.subplot(1, len(deg)+1, 1)
plt.imshow(im, cmap='gray')
plt.title('Input im')
for i in range(len(deg)):
    print(i)
    z = mh.features.zernike_moments(im, degree=deg[i], radius=100)
    plt.subplot(1, len(deg)+1, i+2)
    plt.title(f'Degree {deg[i]}')
    plt.xlabel('Index')
    plt.ylabel('ZM')
    plt.plot(z)
plt.savefig('zernike1.png', bbox_inches='tight')
plt.close()
Thanks. It's been a while since I looked at this, but I think the numerics of these computations are such that they would have to be implemented much more carefully at higher moments for them to not blow up.
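For what it's worth, here is a sketch of that blow-up using the standard radial-polynomial formula (my own transcription of the textbook formula, not mahotas's code): evaluating R_60^0 at r=0.9 with the direct factorial sum in double precision loses all significant digits, while exact rational arithmetic confirms the true value lies in [-1, 1]:

```python
from fractions import Fraction
from math import factorial

def radial_coeffs(n, l):
    """Integer coefficients c_s of the Zernike radial polynomial
    R_n^l(r) = sum_s c_s * r**(n - 2*s)  (standard formula)."""
    return [
        (-1) ** s * factorial(n - s)
        // (factorial(s)
            * factorial((n + l) // 2 - s)
            * factorial((n - l) // 2 - s))
        for s in range((n - l) // 2 + 1)
    ]

def radial(n, l, r):
    """Evaluate R_n^l at r; exact if r is a Fraction, lossy if r is a float."""
    return sum(c * r ** (n - 2 * s) for s, c in enumerate(radial_coeffs(n, l)))

exact = radial(60, 0, Fraction(9, 10))   # exact rational arithmetic
approx = radial(60, 0, 0.9)              # double precision
print("exact :", float(exact))
print("double:", approx)
```

The alternating terms reach roughly 1e21 before cancelling down to a value of magnitude at most 1, so double precision (about 16 significant digits) cannot recover the result.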
Yeah, it seems more likely to happen when `n` is large and `l` is small:
https://github.com/luispedro/mahotas/blob/1c84c258c1e4167d4e6864abb3d399703894464d/mahotas/features/zernike.py#L97
which is when the factorials become large:
https://github.com/luispedro/mahotas/blob/1c84c258c1e4167d4e6864abb3d399703894464d/mahotas/features/_zernike.cpp#L60-L64
But even then, for `n=50, l=0`, these computations are well below the limit of a double.
Do you have any intuition of where the blow up is occurring?
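One data point (again from the textbook radial formula rather than mahotas's code): the factorials themselves stay far below the double overflow threshold (~1.8e308), but the alternating terms of the sum grow past 2**53, the largest integer a double's mantissa represents exactly, so the sum can lose every significant digit to cancellation even though nothing overflows:

```python
from math import factorial

def term_magnitudes(n, l):
    """|term_s| in the Zernike radial coefficient sum (textbook formula)."""
    return [
        factorial(n - s)
        // (factorial(s)
            * factorial((n + l) // 2 - s)
            * factorial((n - l) // 2 - s))
        for s in range((n - l) // 2 + 1)
    ]

terms = term_magnitudes(60, 0)
print(f"largest factorial used   : {float(factorial(60)):.3e}")
print(f"largest |term| (n=60,l=0): {float(max(terms)):.3e}")
print(f"exceeds 2**53            : {max(terms) > 2**53}")
```

So the suspect would be cancellation in the alternating sum, not the size of any individual factorial.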