mesa-lima
Implement depth support
Info about the polygon offset calculation which I've found:
About the integer part of the offset:
- the minimum value is -32 and the maximum is 31
- it's calculated in the following way:
  (value << 18) & ~0xff000000;
About the fractional part of the offset:
- precision is one decimal place (a value of 1.1 is treated the same as 1.15)
First decimal digit of the polygon offset -> target value in hex:
0 - 2 -> 0x00000000
3 - 4 -> 0x00010000
5 - 7 -> 0x00020000
8 - 9 -> 0x00030000
If the offset is < 0, then result = integer part - fractional part; otherwise result = integer part + fractional part.
Test data:
( 1.000000) -> 0x00040000
( 2.000000) -> 0x00080000
( 3.000000) -> 0x000c0000
( 16.000000) -> 0x00400000
( 30.000000) -> 0x00780000
( 31.000000) -> 0x007c0000
( 32.000000) -> 0x007c0000
( -1.000000) -> 0x00fc0000
( -16.000000) -> 0x00c00000
( -30.000000) -> 0x00880000
( -31.000000) -> 0x00840000
( -32.000000) -> 0x00800000
( 1.000000) -> 0x00040000
( 1.100000) -> 0x00040000
( 1.200000) -> 0x00040000
( 1.300000) -> 0x00050000
( 1.400000) -> 0x00050000
( 1.500000) -> 0x00060000
( 1.600000) -> 0x00060000
( 1.700000) -> 0x00060000
( 1.800000) -> 0x00070000
( 1.900000) -> 0x00070000
( -1.000000) -> 0x00fc0000
( -1.100000) -> 0x00fc0000
( -1.200000) -> 0x00fc0000
( -1.300000) -> 0x00fb0000
( -1.400000) -> 0x00fb0000
( -1.500000) -> 0x00fa0000
( -1.600000) -> 0x00fa0000
( -1.700000) -> 0x00fa0000
( -1.800000) -> 0x00f90000
( -1.900000) -> 0x00f90000
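For what it's worth, a minimal C sketch that reproduces the whole table above (my own reconstruction, not driver code; the function name is made up):

   #include <stdint.h>

   static uint32_t encode_polygon_offset_scale(float scale)
   {
      /* clamp to the representable range [-32, 31] */
      if (scale < -32.0f)
         scale = -32.0f;
      else if (scale > 31.0f)
         scale = 31.0f;
      /* signed fixed point with 2 fractional bits; the C cast truncates
       * toward zero, matching "integer part - fractional part" for
       * negative offsets */
      int fixed = (int)(scale * 4.0f);
      /* keep the 8-bit two's complement value in bits 23:16 */
      return ((uint32_t)fixed << 16) & ~0xff000000u;
   }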
TODO: find out where the polygon offset unit is stored.
@PabloPL are you still working on it?
@anarsoul At the moment I don't have time to work on this any more. In the first post I tried to describe what I found when comparing Mali binary dumps. I've also added some test data (which I gathered and used to verify my calculation). There is only one thing missing: the polygon offset unit is stored in some different place in memory (not in depth_test).
If you want, feel free to continue this.
Also, I had to add https://github.com/PabloPL/mesa-lima/commit/20ac8f025525bc3bc4b46cb68dcb0da5a9621d22 (so we can have a format which supports depth).
So it looks like offset_scale is calculated like this:
   int offset_scale;

   /* clamp to the representable range; the clamped branches also need
    * the * 4, otherwise e.g. -32 would encode as 0xE0 instead of the
    * 0x80 seen in the test data above */
   if (rst->offset_scale < -32)
      offset_scale = -32 * 4;
   else if (rst->offset_scale > 31)
      offset_scale = 31 * 4;
   else
      offset_scale = rst->offset_scale * 4;

   /* two's complement wrap for negative values */
   if (offset_scale < 0)
      offset_scale = 0x100 + offset_scale;
According to limare, polygon_offset_units is used to adjust viewport->translate[2], but it does some weird cast from float to int and then just subtracts polygon_offset_units from viewport->translate[2]. Unfortunately I don't understand how that's supposed to work.
The viewport transform is applied to each vertex at the end of the GP shader: V * viewport_scale + viewport_translate. viewport->translate[2] is the Z value.
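In pseudo-C (illustrative only; the names are made up, this is not the actual GP shader code):

   /* the per-vertex viewport transform described above; index 2 is Z */
   static void viewport_transform_vertex(const float v[3],
                                         const float scale[3],
                                         const float translate[3],
                                         float out[3])
   {
      for (int i = 0; i < 3; i++)
         out[i] = v[i] * scale[i] + translate[i];
   }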
@yuq you misunderstood me, I don't understand how this is supposed to work:
https://github.com/limadriver-ng/lima/blob/master/limare/lib/limare.c#L848
Note that state->viewport_transform[6] is a float. So basically the code casts a float pointer to an int pointer and then does some math on the int.
The least significant bits (22 down to 0) of a float contain the fraction part, but the code doesn't check the exponent value:
https://en.wikipedia.org/wiki/Single-precision_floating-point_format
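To spell out the operation in question (my paraphrase, not the verbatim limare source; the helper and variable names are illustrative):

   #include <stdint.h>
   #include <string.h>

   /* illustrative stand-ins for limare's state, not the real structs */
   static float viewport_transform[8];
   static int polygon_offset_units;

   static void apply_polygon_offset_units(void)
   {
      /* Reinterpret the Z translate float as a 32-bit integer and
       * subtract the unit count straight from the bit pattern (memcpy
       * is the well-defined equivalent of limare's pointer cast). */
      int32_t bits;
      memcpy(&bits, &viewport_transform[6], sizeof(bits));
      bits -= polygon_offset_units;
      memcpy(&viewport_transform[6], &bits, sizeof(bits));
   }

The only reading I can come up with is the ULP trick: for a positive, normal float, subtracting N from its integer representation steps the value down by N representable increments, i.e. "push Z back by N units of depth precision" - but the size of one such step depends on the exponent of translate[2], which is exactly what the code never checks.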
Does anyone know why eglinfo shows now depth/stencil visuals?
# eglinfo
EGL API version: 1.4
EGL vendor string: Mesa Project
EGL version string: 1.4 (DRI2)
EGL client APIs: OpenGL OpenGL_ES
EGL extensions string:
EGL_EXT_buffer_age EGL_EXT_image_dma_buf_import EGL_KHR_cl_event2
EGL_KHR_config_attribs EGL_KHR_create_context
EGL_KHR_create_context_no_error EGL_KHR_fence_sync
EGL_KHR_get_all_proc_addresses EGL_KHR_gl_renderbuffer_image
EGL_KHR_gl_texture_2D_image EGL_KHR_gl_texture_cubemap_image
EGL_KHR_image EGL_KHR_image_base EGL_KHR_image_pixmap
EGL_KHR_no_config_context EGL_KHR_reusable_sync
EGL_KHR_surfaceless_context EGL_EXT_pixel_format_float
EGL_KHR_wait_sync EGL_MESA_configless_context EGL_MESA_drm_image
EGL_MESA_image_dma_buf_export EGL_WL_bind_wayland_display
EGL client extensions string:
EGL_EXT_client_extensions EGL_EXT_platform_base
EGL_KHR_client_get_all_proc_addresses EGL_KHR_debug
EGL_EXT_platform_wayland EGL_MESA_platform_gbm
Configurations:
     bf lv colorbuffer dp st  ms    vis   cav bi  renderable  supported
  id sz  l  r  g  b  a th cl ns b    id   eat nd gl es es2 vg surfaces
---------------------------------------------------------------------
0x01 32  0  8  8  8  8  0  0  0 0 0x34325241--          y  y  y     win
0x02 24  0  8  8  8  0  0  0  0 0 0x34325258--          y  y  y     win
You mean "no" or "now"?
I meant 'no', but I figured that out - I added formats to lima_screen.c, but used stale libraries, so eglinfo showed no depth/stencil visuals.
@yuq do you know how to attach a depth/stencil buffer? Basically if I just enable the depth test it works somehow, but I'm not sure where it stores Z values, because ctx->framebuffer.zsbuf isn't used anywhere.
One way I can think of is creating a gbm_bo and attaching it to the FBO's GL_DEPTH_ATTACHMENT.
@yuq I'm talking about lima. See lima_set_framebuffer_state() in lima_state.c: framebuffer->zsbuf is only used in this function and nowhere else.
Sorry, I don't know either. I just wrote the zsbuf there as a reminder and haven't actually used it. But in lima-ng I also can't find a dedicated depth buffer attached for each draw. Maybe a reverse-engineering dump is needed for this case.
I checked the dump and I don't understand where it stores depth values.
Could this be possible:
- for an app that doesn't use a depth buffer explicitly (attached to the FBO's depth slot), Mali doesn't need a depth buffer at all
- for explicit usage, one drm_lima_pp_wb_reg is used for it; per the type field comment, 1 is for depth
So maybe you can try to dump an app that uses an FBO GL_DEPTH_ATTACHMENT.
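A minimal GLES2 setup for such a test app could look like this (a sketch only; the 256x256 size is arbitrary, and the gbm_bo/EGLImage route mentioned above would work as well):

   /* Render into an FBO with an explicit depth attachment, so any
    * depth write-back shows up in the command stream dump. */
   GLuint fbo, depth_rb;
   glGenFramebuffers(1, &fbo);
   glBindFramebuffer(GL_FRAMEBUFFER, fbo);
   glGenRenderbuffers(1, &depth_rb);
   glBindRenderbuffer(GL_RENDERBUFFER, depth_rb);
   glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, 256, 256);
   glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                             GL_RENDERBUFFER, depth_rb);
   /* ... attach a color buffer, then draw with glEnable(GL_DEPTH_TEST) ... */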
@yuq you're right - mali400 doesn't require an in-memory depth buffer, see http://www.highperformancegraphics.org/previous/www_2010/media/Hot3D/HPG2010_Hot3D_ARM.pdf page 9:
"Z, stencil, MSAA samples never go off-chip"
So it uses the on-chip 16x16 tile buffer for Z and stencil.