PRo3D
Panoramic Texture Projection onto DTM Surface
Add the respective panoramic texture view, rendered from the HiRISE ortho image, as an output option. (In the case of a priority-rendered photosphere, that one is rendered instead.)
- Input: Spherically (or, in future, cylindrically) projected panorama (photosphere) with N texture channels (RGB, multispectral, accuracy, ...), plus projection center and resolution;
- Output: Similar to an oriented shot. OPC, image, and pose are combined in the shader. Start from the existing oriented-shot implementation. It may be necessary to tile the spherical projection (180° as limit?).
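The tiling idea above could be sketched as follows. This is purely illustrative: the 180° limit comes from the question in this issue, and the function name is an assumption, not existing PRo3D code.

```python
import math

def tile_azimuth(total_deg: float = 360.0, max_tile_deg: float = 180.0):
    """Split an azimuth range into equal tiles of at most max_tile_deg each.

    Hypothetical helper: returns (start, end) azimuth ranges in degrees.
    """
    n = math.ceil(total_deg / max_tile_deg)
    step = total_deg / n
    return [(i * step, (i + 1) * step) for i in range(n)]
```

Under this sketch, a full 360° spherical panorama would be rendered as two 180° tiles.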
This issue is related to #411.
Also related to #353
How large are the panorama textures? The current GPU limit is 32k (in our case along the x-axis).
The import size is limited to 32k per coordinate axis; resampling is done on import, if necessary. Alternative: use array textures. Further handling of this is done as an additional temporary (texture) layer.
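A minimal sketch of the import-time clamp described above. The 32k limit matches the text; the helper name and the uniform-scaling policy are assumptions.

```python
GPU_LIMIT = 32768  # texels per axis ("32k"); assumed hard GPU texture limit

def import_scale(width: int, height: int) -> float:
    """Uniform scale factor applied on import (1.0 means no resampling).

    Scales the longest axis down to GPU_LIMIT, preserving aspect ratio.
    """
    longest = max(width, height)
    return 1.0 if longest <= GPU_LIMIT else GPU_LIMIT / longest
```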
@gpaar can we get an example panorama for some known location to use it for testing?
Small ones:
- 0382 3335 (a multispectral one onto the crater rim of the current traverse)
- 0397 8420 (Moro rock)

Larger one:
- 1357 9423 (lookout point onto the full Jezero traverse)
All these products are in spherical geometry, delivered as Job_xxxx_yyyy-zzz-rad_mosaic.zip, containing metadata in the .json, mainly in the map_transformation entry. Liaise with @ArnoldBauer on units and meaning; he will also re-process the products until the end of this week to additionally provide geographic and Mars-centered sphere positions. Presently, angles are given in the geographic coordinate system with a position presumably only in the site frame; the pointer to the site is missing and will be added in the newly processed products.
The mosaic.zip includes a metadata file Texture.tif.json. It applies to both Texture.tif (3-channel float texture) and Texture_uint8.jpg (which is a uint8 JPG conversion).
The relevant metadata in this case are map_transformation and geometry_information.
map_transformation describes the geo-coding, i.e. the transformation between image and map coordinates. In this case the affine transformation maps image coordinates (column, row) to spherical map coordinates (horizontal angle / azimuth, vertical angle / polar, in gradians = gon). Documentation from JR::DibGeometry::Geometry::CLinear2DTrafo:

The transformation is defined by 6 parameters via the formulas

x_u = A0 + A1*x + A2*y
y_u = B0 + B1*x + B2*y

where (x, y) is a point in the input coordinate system (mostly a location given as column/line coordinates in an image) and (x_u, y_u) is the same point in the user coordinate system, i.e. the coordinate system defined by the transformation parameters (A0, A1, A2, B0, B1, B2).

Format: comma-separated list of float values <A0>, <A1>, <A2>, <B0>, <B1>, <B2>, with:
- A0: shift in x-direction
- A1: spacing in x-direction
- A2: spacing in x-direction when changing the input coordinate in y-direction (shear)
- B0: shift in y-direction
- B1: spacing in y-direction when changing the input coordinate in x-direction (shear)
- B2: spacing in y-direction
"map_transformation": [
-19.0, -> shifting in x, column offset
0.001, -> scaling in x, column spacing
0.0, -> shearing in x
87027.62, -> shifting in y, row offset
0.0, -> shearing in y
0.001 -> scaling in y, row spacing
],
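Applied to the example values above, the 6-parameter transformation can be sketched as follows (the function name is illustrative, not from the codebase):

```python
def pixel_to_map(x, y, A0, A1, A2, B0, B1, B2):
    """Affine geo-coding: x_u = A0 + A1*x + A2*y ; y_u = B0 + B1*x + B2*y.

    (x, y) are image coordinates (column, row); the result is
    (azimuth, polar) in gon for this spherical mosaic.
    """
    return (A0 + A1 * x + A2 * y, B0 + B1 * x + B2 * y)

# map_transformation from the example: [-19.0, 0.001, 0.0, 87027.62, 0.0, 0.001]
azimuth, polar = pixel_to_map(0, 0, -19.0, 0.001, 0.0, 87027.62, 0.0, 0.001)
# pixel (0, 0) -> azimuth -19.0 gon, polar 87027.62 gon
```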
geometry_information: In this case we always have a spherical coordinate system, which is defined as follows:
- (r, θ, φ) with radial distance r (distance to origin), polar angle θ (theta; angle with respect to the polar axis), and azimuthal angle φ (phi; angle of rotation from the initial meridian plane). See Wikipedia: Spherical coordinate system.
- IMPRO notation: (h, v, d) with azimuthal angle h = φ, polar angle v = θ, and radial distance d = r.
- The spherical coordinate system is associated with the site frame.
- Units are gradians (gon) for azimuth and polar angle, and meters for the radial distance.
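To make these conventions concrete, a sketch converting IMPRO (h, v, d) in gon/meters to Cartesian site-frame coordinates. The axis convention (z along the polar axis, x toward the initial meridian) is an assumption and not taken from the metadata.

```python
import math

GON_TO_RAD = math.pi / 200.0  # a full circle is 400 gon

def impro_to_cartesian(h_gon: float, v_gon: float, d: float):
    """(h, v, d) -> (x, y, z): h = azimuth phi, v = polar theta, d = radius.

    Standard spherical-to-Cartesian conversion under the assumed axes.
    """
    h = h_gon * GON_TO_RAD
    v = v_gon * GON_TO_RAD
    return (d * math.sin(v) * math.cos(h),
            d * math.sin(v) * math.sin(h),
            d * math.cos(v))
```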
Komyo is currently adding the following (still missing) information to the GPC metadata:
- Coordinate system
- Units
- Rover site & frame
- Site-to-Mars transformation
Actually the Site-to-Mars transformation is available, but not added to the resulting product zips. For Job_0382_3335-110-rad it is as follows: Local2Mars.sop.json
This one can be realized by exporting a panorama depth map for a particular viewpoint and using the depth map in the processing pipeline directly. However, the transformation of a photosphere adjustment needs to be exported. @gpaar This adjustment could be done using an OPC photosphere, right? Or are we talking about a "live" projection of a small panorama texture?
Yes, the adjustment will be done manually (and, in future, by horizon matching) on the already existing, standard PRoViP-produced photosphere.
Dependent on #518.