`pole_density_function()` plots do not appear correct in the center
The centers of plots produced by pole_density_function() do not appear to be correct, for example https://orix.readthedocs.io/en/stable/examples/plotting/subplots.html#sphx-glr-examples-plotting-subplots-py. The center of the plot appears distorted, and I see the same issue in my own pole density plots.
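For reference, this is roughly how I produce mine (random vectors stand in for my data; keyword defaults may differ between orix versions):

import matplotlib.pyplot as plt
import numpy as np
from orix import plot  # registers the "stereographic" matplotlib projection
from orix.vector import Vector3d

# random unit vectors stand in for my measured poles
v = Vector3d(np.random.randn(100_000, 3))
fig, ax = plt.subplots(subplot_kw=dict(projection="stereographic"))
ax.pole_density_function(v)  # the center of the resulting plot looks distorted
plt.show()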
Hi @yencal,
I believe they are correct, and behave as @harripj, who added that functionality, intended. He recently suggested that we add an option to plot the densities as contours (https://github.com/pyxem/orix/discussions/440#discussioncomment-5470704) instead of the current histogram bins. I'm not sure when that will be added, though...
Hi @hakonanes, that would be great as most people are used to seeing the plots as contours. Thanks
@yencal thanks for opening an issue about this. To properly measure the pole distribution on the unit sphere we use equal area sampling on S2 rather than equal angle sampling. This ensures that the probability that a given pole intersects a bin on the S2 sphere is constant. As you point out, this means that the bins near the poles have a larger latitudinal dimension when using equal area sampling, which appears to distort the polar region of the distribution.
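To put numbers on this, the solid angle of an (azimuth, polar) bin is

$$\Delta\Omega = \int_{\phi_1}^{\phi_2}\!\int_{\theta_1}^{\theta_2} \sin\theta \,\mathrm{d}\theta \,\mathrm{d}\phi = (\phi_2 - \phi_1)\,(\cos\theta_1 - \cos\theta_2),$$

so an equal-angle grid (uniform steps in $\theta$) gives bins that shrink towards the pole, while the equal-area grid spaces the polar bin edges uniformly in $\cos\theta$ so every bin has the same $\Delta\Omega$. The price is that the bins nearest the pole span a larger range of $\theta$, which is what you are seeing in the center of the stereographic plot.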
I've written a short example highlighting the two sampling methods, and the resulting distribution for a set of random vectors (which should then give a uniform distribution):
from orix.vector import Vector3d
from orix import plot
from orix.sampling.S2_sampling import _sample_S2_uv_mesh_coordinates, _sample_S2_equal_area_coordinates
from matplotlib import pyplot as plt
import numpy as np
from orix.projections import StereographicProjection

v = Vector3d(np.random.randn(1_000_000, 3))
azimuth, polar, _ = v.to_polar()

resolution = 2  # deg.
sp = StereographicProjection()

# equal angle grid
ac, pc = _sample_S2_uv_mesh_coordinates(resolution, hemisphere='upper', azimuth_endpoint=True)
hist_uv, *_ = np.histogram2d(azimuth, polar, bins=(ac, pc), density=False)
uv_grid = Vector3d.from_polar(*np.meshgrid(ac, pc, indexing='ij'))
x_uv, y_uv = sp.vector2xy(uv_grid)

# equal area grid
ac, pc = _sample_S2_equal_area_coordinates(resolution, hemisphere='upper', azimuth_endpoint=True)
hist_ea, *_ = np.histogram2d(azimuth, polar, bins=(ac, pc), density=False)
ea_grid = Vector3d.from_polar(*np.meshgrid(ac, pc, indexing='ij'))
x_ea, y_ea = sp.vector2xy(ea_grid)

fig, ax = plt.subplots(nrows=2, ncols=2, figsize=(10, 10), subplot_kw=dict(projection='stereographic'))
ax[0, 0].scatter(uv_grid, s=0.5)
ax[0, 1].scatter(ea_grid, s=0.5)
ax[0, 0].set_title('Equal angle grid points')
ax[0, 1].set_title('Equal area grid points')
pc_uv = ax[1, 0].pcolorfast(x_uv.reshape(uv_grid.shape), y_uv.reshape(uv_grid.shape), hist_uv / hist_uv.mean())
pc_ea = ax[1, 1].pcolorfast(x_ea.reshape(ea_grid.shape), y_ea.reshape(ea_grid.shape), hist_ea / hist_ea.mean())
plt.colorbar(pc_uv, ax=ax[1, 0], label='MRD')
plt.colorbar(pc_ea, ax=ax[1, 1], label='MRD')
ax[1, 0].set_title('Distribution on equal angle grid')
ax[1, 1].set_title('Distribution on equal area grid')

From the two lower plots it's clear that the distribution is uniform as expected when using equal area sampling, but is underrepresented at the poles when using equal angle sampling for the reason discussed above. This is discussed in further detail in the paper by Rohrer (RSD+04 in the orix bibliography).
On the topic of contour plots this would definitely be a nice addition to orix, and please feel free to contribute if you already have an implementation written!
Hi @harripj, thanks for the explanation and example code.
I've recently been seeing several examples of these pole figures with artifacts in the center in the wild (including published scientific papers!), and they were driving me crazy, so I tracked them down to here!
Suppose I fix this by doing the histogramming on a better projection, for example a gnomonic cubed-sphere grid instead of an equal-area longitude-latitude grid, and then return six separate histograms from orix.measure.pole_density_function. Does anyone have an overview of how many dependent functions I would have to touch?
And would you merge it?
> And would you merge it?
Always. If you write something that passes the unit tests, follows the project guidelines, and vaguely fits within orix's goals, we will always happily include it. Here's a link to the contributor guidelines if you choose to go that way: https://orix.readthedocs.io/en/stable/dev/index.html
A few thoughts if you go down this route:
orix already has several S2 sampling methods; you can see an example comparing several of them here.
I think you have the right idea: the problem is that the binning is done in an equal-area polar-azimuth projection, which creates the elongated bins. Almost a year ago I opened this issue trying to solve the problem by binning using a KD-tree, which worked but never turned into a proper PR (a rough sketch of the idea is below). Your projection approach seems potentially cleaner, though; I like it.
If you want to try fixing this, go for it. You can also start a PR as a draft if you get stuck, and one of us can help you. Also, feel free to steal what you want from my PR, or ignore it entirely.
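For reference, a rough sketch of the KD-tree idea (not the exact code from that issue; it reuses the private _sample_S2_equal_area_coordinates helper from earlier in this thread as stand-in bin centers):

import numpy as np
from scipy.spatial import cKDTree
from orix.sampling.S2_sampling import _sample_S2_equal_area_coordinates
from orix.vector import Vector3d

# bin "centers": an (approximately) equal-area grid on the upper hemisphere
azimuth, polar = _sample_S2_equal_area_coordinates(2, hemisphere="upper", azimuth_endpoint=True)
centers = Vector3d.from_polar(*np.meshgrid(azimuth, polar, indexing="ij")).flatten()

# toy data restricted to the upper hemisphere to match the grid
xyz = np.random.randn(100_000, 3)
xyz /= np.linalg.norm(xyz, axis=1, keepdims=True)
xyz = xyz[xyz[:, 2] >= 0]
v = Vector3d(xyz)

# assign each data vector to its nearest grid center and count occupancy
tree = cKDTree(centers.data)
_, idx = tree.query(v.data)
counts = np.bincount(idx, minlength=centers.size)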
Also, to your question:
> Does anyone have an overview of how many dependent functions I would have to touch?
If you make changes in orix.sampling it can get complicated fast, but if you are just updating InversePoleFigurePlot, that class is mostly standalone, so it would just be a matter of updating the examples and tutorials. We can also help with that if you want.
Best of luck, and feel free to ask for clarification
Oh, I completely forgot I did this, but I guess I wrote a method for binning in the equal-area stereographic projection as part of #532 as well. It might be of interest to you; it works, but as I mention there, it has its own flaws.
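The gist of that approach (roughly, from memory; not a line-for-line copy of #532) is to histogram in the Lambert azimuthal equal-area plane, where equal planar areas correspond to equal solid angles on the sphere:

import numpy as np

# toy data: random unit vectors on the upper hemisphere
v = np.random.randn(100_000, 3)
v /= np.linalg.norm(v, axis=1, keepdims=True)
v = v[v[:, 2] >= 0]

# Lambert azimuthal equal-area projection about +z: X = x*sqrt(2/(1+z)), Y = y*sqrt(2/(1+z))
f = np.sqrt(2 / (1 + v[:, 2]))
X, Y = v[:, 0] * f, v[:, 1] * f

# Cartesian bins in the Lambert plane have equal solid angle on the sphere;
# the upper hemisphere maps to a disk of radius sqrt(2)
r = np.sqrt(2)
hist, xedges, yedges = np.histogram2d(X, Y, bins=64, range=[[-r, r], [-r, r]])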
I think I have a concept where I don't have to change so much, but I'm trying to figure out how/where/why you do the smoothing of the histograms. Is this some setting of pyplot.pcolormesh, or is there a smoothing function called on the longitude-latitude grid somewhere that I cannot find?
Eg. in: https://orix.readthedocs.io/en/stable/tutorials/pole_density_function.html
That still would not work with an "atlas" approach like I'm suggesting. For that you really should be doing kernel smoothing, but it seems like Wigner functions (or really any spherical harmonic routines?) aren't implemented yet. Would it be OK to only have non-smoothed histograms? They look like this:
import numpy as np
import matplotlib.pyplot as plt
def assign_cube_face_index(unit_vectors):
    """Assign an index (1 to 6) to an array of N 3-vectors (`shape = (N, 3)`),
    assigning each to a face of a cube in the following way:

    Index 1 is the positive x face. Includes +x+y and +x+z edges.
    Index 2 is the negative x face. Includes -x-y and -x-z edges.
    Index 3 is the positive y face. Includes +y+z and +y-x edges.
    Index 4 is the negative y face. Includes -y-z and -y+x edges.
    Index 5 is the positive z face. Includes -y+z and -x+z edges.
    Index 6 is the negative z face. Includes +y-z and +x-z edges.
    Corners get assigned to the face with the higher index. (Let's just say undefined behaviour...)
    """
    face_index = np.zeros(unit_vectors.shape[0], dtype=int)
    indx = np.all([unit_vectors[:, 0] >= unit_vectors[:, 1],
                   unit_vectors[:, 0] > -unit_vectors[:, 1],
                   unit_vectors[:, 0] >= unit_vectors[:, 2],
                   unit_vectors[:, 0] > -unit_vectors[:, 2]], axis=0)
    face_index[indx] = 1
    indx = np.all([unit_vectors[:, 0] < unit_vectors[:, 1],
                   unit_vectors[:, 0] <= -unit_vectors[:, 1],
                   unit_vectors[:, 0] < unit_vectors[:, 2],
                   unit_vectors[:, 0] <= -unit_vectors[:, 2]], axis=0)
    face_index[indx] = 2
    indx = np.all([unit_vectors[:, 1] > unit_vectors[:, 0],
                   unit_vectors[:, 1] >= -unit_vectors[:, 0],
                   unit_vectors[:, 1] >= unit_vectors[:, 2],
                   unit_vectors[:, 1] > -unit_vectors[:, 2]], axis=0)
    face_index[indx] = 3
    indx = np.all([unit_vectors[:, 1] <= unit_vectors[:, 0],
                   unit_vectors[:, 1] < -unit_vectors[:, 0],
                   unit_vectors[:, 1] < unit_vectors[:, 2],
                   unit_vectors[:, 1] <= -unit_vectors[:, 2]], axis=0)
    face_index[indx] = 4
    indx = np.all([unit_vectors[:, 2] > unit_vectors[:, 0],
                   unit_vectors[:, 2] >= -unit_vectors[:, 0],
                   unit_vectors[:, 2] > unit_vectors[:, 1],
                   unit_vectors[:, 2] >= -unit_vectors[:, 1]], axis=0)
    face_index[indx] = 5
    indx = np.all([unit_vectors[:, 2] <= unit_vectors[:, 0],
                   unit_vectors[:, 2] < -unit_vectors[:, 0],
                   unit_vectors[:, 2] <= unit_vectors[:, 1],
                   unit_vectors[:, 2] < -unit_vectors[:, 1]], axis=0)
    face_index[indx] = 6
    return face_index
def make_square_gnomonic_histograms(unit_vectors, bins=10, friedel_sym=False, MRD_units=False):
    # assert bins % 2 == 0
    bin_edges = np.linspace(-np.pi/4, np.pi/4, bins + 1)
    bin_middles = np.linspace(-np.pi/4 + np.pi/4/bins, np.pi/4 - np.pi/4/bins, bins)
    y_ang, z_ang = np.meshgrid(bin_middles, bin_middles)
    # Radius factor, perspective angle factor, oblique coordinates.
    # TODO: check if the last factor is correct or a small-angle approximation.
    solid_angle_term = 1 / (np.tan(y_ang)**2 + np.tan(z_ang)**2 + 1) / (np.cos(y_ang)*np.cos(z_ang)) / (1 - 0.5*(np.sin(z_ang) * np.sin(y_ang))**2)
    solid_angle_term *= 4 * np.pi / 6 / np.sum(solid_angle_term)
    if MRD_units:
        solid_angle_term *= unit_vectors.shape[0] / 4 / np.pi
    # bin edges on the unit sphere (cube-face vertices)
    dim_1, dim_2 = np.meshgrid(np.tan(bin_edges), np.tan(bin_edges))
    norm_term = np.sqrt(dim_1**2 + dim_2**2 + 1)
    dim_1 /= norm_term
    dim_2 /= norm_term
    dim_3 = np.ones(dim_2.shape) / norm_term
    face_indexes = assign_cube_face_index(unit_vectors)
    histograms = []
    # Face 1
    y_gnom = np.arctan(unit_vectors[face_indexes == 1, 1] / unit_vectors[face_indexes == 1, 0])
    z_gnom = np.arctan(unit_vectors[face_indexes == 1, 2] / unit_vectors[face_indexes == 1, 0])
    hist, _ = np.histogramdd((y_gnom, z_gnom), bins=(bin_edges, bin_edges))
    vertex_vectors = np.stack([dim_3, dim_2, dim_1], axis=-1)
    histograms.append((hist / solid_angle_term, vertex_vectors))
    # Face 2
    y_gnom = np.arctan(-unit_vectors[face_indexes == 2, 1] / unit_vectors[face_indexes == 2, 0])
    z_gnom = np.arctan(-unit_vectors[face_indexes == 2, 2] / unit_vectors[face_indexes == 2, 0])
    hist, _ = np.histogramdd((y_gnom, z_gnom), bins=(bin_edges, bin_edges))
    vertex_vectors = np.stack([-dim_3, dim_2, dim_1], axis=-1)
    histograms.append((hist / solid_angle_term, vertex_vectors))
    # Face 3
    x_gnom = np.arctan(unit_vectors[face_indexes == 3, 0] / unit_vectors[face_indexes == 3, 1])
    z_gnom = np.arctan(unit_vectors[face_indexes == 3, 2] / unit_vectors[face_indexes == 3, 1])
    hist, _ = np.histogramdd((x_gnom, z_gnom), bins=(bin_edges, bin_edges))
    vertex_vectors = np.stack([dim_2, dim_3, dim_1], axis=-1)
    histograms.append((hist / solid_angle_term, vertex_vectors))
    # Face 4
    x_gnom = np.arctan(-unit_vectors[face_indexes == 4, 0] / unit_vectors[face_indexes == 4, 1])
    z_gnom = np.arctan(-unit_vectors[face_indexes == 4, 2] / unit_vectors[face_indexes == 4, 1])
    hist, _ = np.histogramdd((x_gnom, z_gnom), bins=(bin_edges, bin_edges))
    vertex_vectors = np.stack([dim_2, -dim_3, dim_1], axis=-1)
    histograms.append((hist / solid_angle_term, vertex_vectors))
    # Face 5
    x_gnom = np.arctan(unit_vectors[face_indexes == 5, 0] / unit_vectors[face_indexes == 5, 2])
    y_gnom = np.arctan(unit_vectors[face_indexes == 5, 1] / unit_vectors[face_indexes == 5, 2])
    hist, _ = np.histogramdd((x_gnom, y_gnom), bins=(bin_edges, bin_edges))
    vertex_vectors = np.stack([dim_2, dim_1, dim_3], axis=-1)
    histograms.append((hist / solid_angle_term, vertex_vectors))
    # Face 6
    x_gnom = np.arctan(-unit_vectors[face_indexes == 6, 0] / unit_vectors[face_indexes == 6, 2])
    y_gnom = np.arctan(-unit_vectors[face_indexes == 6, 1] / unit_vectors[face_indexes == 6, 2])
    hist, _ = np.histogramdd((x_gnom, y_gnom), bins=(bin_edges, bin_edges))
    vertex_vectors = np.stack([dim_2, dim_1, -dim_3], axis=-1)
    histograms.append((hist / solid_angle_term, vertex_vectors))
    if friedel_sym:
        histograms[0] = (0.5 * (histograms[0][0] + np.flip(histograms[1][0])), histograms[0][1])
        histograms[1] = (np.flip(histograms[0][0]), histograms[1][1])
        histograms[2] = (0.5 * (histograms[2][0] + np.flip(histograms[3][0])), histograms[2][1])
        histograms[3] = (np.flip(histograms[2][0]), histograms[3][1])
        histograms[4] = (0.5 * (histograms[4][0] + np.flip(histograms[5][0])), histograms[4][1])
        histograms[5] = (np.flip(histograms[4][0]), histograms[5][1])
    return histograms
def plot_histograms(histograms, ax, vmin=None, vmax=None):
    if vmin is None:
        vmin = np.min(np.stack([hist_tuple[0] for hist_tuple in histograms]))
    if vmax is None:
        vmax = np.max(np.stack([hist_tuple[0] for hist_tuple in histograms]))
    imgs = []
    for hist_tuple in histograms[:5]:  # Skip negative z plane (index 5), as it diverges.
        vecs = hist_tuple[1]
        polar_angle = np.arccos(vecs[:, :, 2])
        azim_angle = np.arctan2(vecs[:, :, 1], vecs[:, :, 0])
        img = ax.pcolormesh(azim_angle, np.tan(polar_angle / 2), hist_tuple[0], vmin=vmin, vmax=vmax)
        imgs.append(img)
    ax.set_ylim([0, np.tan(np.pi/4)])
    return imgs
if __name__ == "__main__":
    vectors = np.random.randn(int(1e6), 3)
    norm = np.linalg.norm(vectors, axis=1)
    # assert np.all(norm != 0.0)
    vectors = vectors / norm[:, np.newaxis]
    histograms = make_square_gnomonic_histograms(vectors, bins=10, MRD_units=True, friedel_sym=True)
    fig = plt.figure()
    ax = plt.subplot(1, 1, 1, polar=True)
    imgs = plot_histograms(histograms, ax)
    plt.colorbar(imgs[0])
    plt.show()
EDIT: Fixed some wrong array flips of the sub-histograms.
Try playing with the shading parameter.
https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.pcolormesh.html
Matplotlib is actually a bit annoying with its smoothing. It usually has an "auto" setting, which can be confusing and inconsistent. I think if you increase the grid density by 3-4x it might do the smoothing automatically.
Also, setting antialiased=True might help make the plot look nicer.
Looks great though!
More on shading:
https://matplotlib.org/stable/gallery/images_contours_and_fields/pcolormesh_grids.html#gouraud-shading
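And a minimal standalone example of the two shading modes (plain numpy data, nothing orix-specific):

import numpy as np
import matplotlib.pyplot as plt

x, y = np.meshgrid(np.linspace(0, 1, 30), np.linspace(0, 1, 30))
c = np.sin(6 * x) * np.cos(6 * y)

fig, (ax0, ax1) = plt.subplots(ncols=2, figsize=(8, 4))
# flat: one constant color per cell, C is one smaller than X/Y in each dimension
ax0.pcolormesh(x, y, c[:-1, :-1], shading="flat")
ax0.set_title("shading='flat'")
# gouraud: colors interpolated between vertices, C matches the shape of X/Y
ax1.pcolormesh(x, y, c, shading="gouraud", antialiased=True)
ax1.set_title("shading='gouraud'")
plt.show()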
> Is this some setting of pyplot.pcolormesh, or is there a smoothing function called on the longitude-latitude grid somewhere that I cannot find?...
Plotting in ORIX typically begins with something like:
import matplotlib.pyplot as plt
import orix.plot
from orix.vector import Vector3d
The line `import orix.plot` registers the class StereographicPlot as a custom matplotlib projection, which is handy because then you can do things like:
fig, ax = plt.subplots(subplot_kw=dict(projection="stereographic"))
ax.scatter(Vector3d.random(100))
and just directly plot vectors without projecting them yourself. It also means EVERYTHING plotted uses an identical projection, which has been a majorly useful feature.
So, to answer your actual question: the method StereographicPlot.pole_density_function does both the plotting and the projection. It in turn calls orix.measure.pole_density_function, which does the binning and smoothing and is the function I think you want to change. Blurring is not done at the colormesh stage.
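In practice the two levels look something like this (random vectors; exact keyword names and return values differ between orix versions, so check the docs):

import matplotlib.pyplot as plt
import numpy as np
from orix import measure, plot  # importing orix.plot registers the "stereographic" projection
from orix.vector import Vector3d

v = Vector3d(np.random.randn(10_000, 3))

# plot level: StereographicPlot.pole_density_function projects and draws
fig, ax = plt.subplots(subplot_kw=dict(projection="stereographic"))
ax.pole_density_function(v)

# measure level: the binning and Gaussian smoothing happen in here;
# this is the function to change
out = measure.pole_density_function(v)
plt.show()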
Bringing this around to @CSSFrancis's comment:
> Try playing with the shading parameter.
I tried this when writing #532, but:
- pcolormesh-level blurring is very slow for ungridded data with a mesh resolution of 2 degrees or less.
- It's not mathematically valid for our equal-angle stereographic projections, because they have a radially dependent Jacobian.
You can potentially do this blurring in an equal-area projection that you then re-project to the equal-angle stereographic projection, which is what I did in #532 and which solves both problems.
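To put a number on that Jacobian point: for a unit sphere the stereographic projection maps polar angle $\theta$ to planar radius $\rho = \tan(\theta/2)$, so the ratio of planar to spherical area elements is

$$\frac{\mathrm{d}A_\mathrm{plane}}{\mathrm{d}A_\mathrm{sphere}} = \frac{\rho\,\mathrm{d}\rho\,\mathrm{d}\phi}{\sin\theta\,\mathrm{d}\theta\,\mathrm{d}\phi} = \frac{1}{4\cos^4(\theta/2)}.$$

That changes by a factor of four between the center and the rim of the pole figure (a factor of two in linear scale), so a fixed-width blur in the projected plane corresponds to an angular blur on the sphere that is roughly twice as wide at the center as at the edge.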
> ...for that you really should be doing kernel smoothing, but it seems like Wigner functions (or really any spherical harmonic routines?) aren't implemented yet.
Yup, and you've just hit on part of why pole_density_function still lives in a single class. The assumption in the past was that orix.measure would eventually be replaced with a proper convolution method, but that never happened, and orix.measure.pole_density_function staying as-is for so long is partly a side effect of that.
> Would it be OK to only have non-smoothed histograms?
Unfortunately, no. Dropping histogram smoothing at this point would break functionality for potentially thousands of workflows.
Thanks, I found the blurring operation now.
The only solution concept I could come up with that avoids spherical harmonics and isn't super expensive (like the pcolormesh commands) is to run a simulated diffusion on the cube grid. It's still not completely isotropic, but it's a lot closer than the longitude-latitude approach. Proper Gaussian blurring on the cube faces would also work, but then the boundary conditions become even more bothersome.
The result looks like this:
It runs in a fraction of a second for reasonable parameters (1° grid steps and a ~10° blurring kernel).
If that seems reasonable, I will try to implement it.
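As a back-of-the-envelope calibration of the kernel width (assuming each step mixes a fraction $\alpha$ of the 4-neighbour mean into every bin, as in the code below): one step is a convolution with weight $1-\alpha$ at the center and $\alpha/4$ on each neighbour, whose variance along one axis is $\alpha h^2/2$ for a grid spacing $h$. The variances of repeated steps add, so after $n$ iterations the effective Gaussian width is roughly

$$\sigma \approx h\sqrt{n\alpha/2},$$

which gives a quick way to pick the step parameter and the number of iterations for a target blur.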
And code dump just in case:
def diffusion_step(histograms, step_parameter, iterations=1):
    sub_histogram_shape = histograms[0][0].shape
    print(sub_histogram_shape)
    output_histograms = []
    for face_index in range(6):
        output_histograms.append(np.copy(histograms[face_index][0]))
    for n in range(iterations):
        diffused_weight_maps = []
        # Diffuse on faces
        for output_histogram in output_histograms:
            diffused_weight_map = np.zeros(sub_histogram_shape)
            diffused_weight_map[1:, :] += output_histogram[:-1, :]
            diffused_weight_map[:-1, :] += output_histogram[1:, :]
            diffused_weight_map[:, 1:] += output_histogram[:, :-1]
            diffused_weight_map[:, :-1] += output_histogram[:, 1:]
            diffused_weight_maps.append(diffused_weight_map)
        ### Diffuse across edges ###
        # +y+z edge
        diffused_weight_maps[2][:, -1] += output_histograms[4][:, -1]
        diffused_weight_maps[4][:, -1] += output_histograms[2][:, -1]
        # -y+z edge
        diffused_weight_maps[3][:, -1] += output_histograms[4][:, 0]
        diffused_weight_maps[4][:, 0] += output_histograms[3][:, -1]
        # +y-z edge
        diffused_weight_maps[2][:, 0] += output_histograms[5][:, -1]
        diffused_weight_maps[5][:, -1] += output_histograms[2][:, 0]
        # -y-z edge
        diffused_weight_maps[3][:, 0] += output_histograms[5][:, 0]
        diffused_weight_maps[5][:, 0] += output_histograms[3][:, 0]
        # +x+z edge
        diffused_weight_maps[0][:, -1] += output_histograms[4][-1, :]
        diffused_weight_maps[4][-1, :] += output_histograms[0][:, -1]
        # -x+z edge
        diffused_weight_maps[1][:, -1] += output_histograms[4][0, :]
        diffused_weight_maps[4][0, :] += output_histograms[1][:, -1]
        # +x-z edge
        diffused_weight_maps[0][:, 0] += output_histograms[5][-1, :]
        diffused_weight_maps[5][-1, :] += output_histograms[0][:, 0]
        # -x-z edge
        diffused_weight_maps[1][:, 0] += output_histograms[5][0, :]
        diffused_weight_maps[5][0, :] += output_histograms[1][:, 0]
        # +x+y edge
        diffused_weight_maps[0][-1, :] += output_histograms[2][-1, :]
        diffused_weight_maps[2][-1, :] += output_histograms[0][-1, :]
        # -x+y edge
        diffused_weight_maps[1][-1, :] += output_histograms[2][0, :]
        diffused_weight_maps[2][0, :] += output_histograms[1][-1, :]
        # -x-y edge
        diffused_weight_maps[1][0, :] += output_histograms[3][0, :]
        diffused_weight_maps[3][0, :] += output_histograms[1][0, :]
        # +x-y edge
        diffused_weight_maps[0][0, :] += output_histograms[3][-1, :]
        diffused_weight_maps[3][-1, :] += output_histograms[0][0, :]
        # Add to output
        for face_index, output_histogram in enumerate(output_histograms):
            output_histograms[face_index] = (1 - step_parameter) * output_histogram + diffused_weight_maps[face_index] / 4 * step_parameter
    # Wrap up into list of tuples format
    output = []
    for face_index, hist_tuple in enumerate(histograms):
        output.append((output_histograms[face_index], hist_tuple[1]))
    return output
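For example, to get roughly a 10° blur on a 1° grid in the `__main__` block above (using the σ ≈ h·sqrt(nα/2) estimate above):

if __name__ == "__main__":
    vectors = np.random.randn(int(1e6), 3)
    vectors /= np.linalg.norm(vectors, axis=1)[:, np.newaxis]
    histograms = make_square_gnomonic_histograms(vectors, bins=90, MRD_units=True, friedel_sym=True)
    # step_parameter=0.5, 400 iterations -> sigma ≈ 1 deg * sqrt(400 * 0.5 / 2) = 10 deg
    histograms = diffusion_step(histograms, step_parameter=0.5, iterations=400)
    fig = plt.figure()
    ax = plt.subplot(1, 1, 1, polar=True)
    imgs = plot_histograms(histograms, ax)
    plt.colorbar(imgs[0])
    plt.show()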
Hey Hey, that looks pretty solid to me!
My biggest concern with this approach was that you would see a slight gridding effect from the projected cube, but that image looks really good. I tried this myself on some data and can confirm it runs in a reasonable time.
This is excellent, a definite improvement over the current default. If you open a pull request, I'll review it and we can merge it. It doesn't need to be perfect to be a PR either; feel free to put up a rough draft and we can go from there.
In terms of the least painful way to add this to orix, I'd suggest refactoring it all into orix.measure.pole_density_function. As long as you keep the same inputs and outputs, everything should mesh nicely with the existing functions.
Also, if you get stuck on the implementation, I can start a PR with you as the co-author, and we can go from there. I'm indifferent to the method, as long as this gets added and you get properly credited.
Hi @MACarlsen, thanks for wanting to contribute!
As @argerlt says, the pole_density_function function is the one to add to. The grid of (polar, azimuth) intensities is smoothed with SciPy's Gaussian filter: https://github.com/pyxem/orix/blob/a6ff478dc063abcd51808cbe604e1e0d4779a41e/orix/measure/pole_density_function.py#L131-L142
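Roughly the kind of call made there (toy array, not copied verbatim from that file; see the link for the real thing):

import numpy as np
from scipy.ndimage import gaussian_filter

resolution, sigma = 1.0, 5.0      # grid step and blur width, both in degrees
hist = np.random.rand(90, 360)    # toy (polar, azimuth) histogram
# sigma in degrees is converted to a width in bins; "wrap" on the azimuth axis avoids a seam at 0/360 degrees
smoothed = gaussian_filter(hist, sigma=sigma / resolution, mode=("nearest", "wrap"))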
Thanks for the tips! I always get confused by these GitHub message boards (what goes in the issue vs. what goes in the PR).
I have a working version over in #588. Feel free to take it over from here.
I did have to break some assumptions of the existing tests, which I tried to explain in the PR. The short version is that the equal-area polar grid is bad for plotting and even worse for smoothing.
I was struggling a bit with the tests because the custom projections don't get loaded on my machine without some manual tinkering with matplotlib.projections. I think you're aware of the problem.
But otherwise the package seems nice. The multi-dimensional arrays of orientations are a big convenience over scipy.spatial.transform.Rotation. I might use this if I ever start a new project!