bevy_water
Normal calculation in shader
I might be misunderstanding something, but shouldn't `b` be along the x axis and `c` along the y axis?
https://github.com/Neopallium/bevy_water/blob/68889d28b346b7ff47d793bf02e1464742f45127/assets/shaders/water.wgsl#LL109C1-L111C66
I just tried:
```wgsl
let b = get_wave_height(w_pos + vec2<f32>(1.0, 0.0));
let c = get_wave_height(w_pos + vec2<f32>(0.0, 1.0));
```
I don't notice any difference. I am not an expert in shaders or 3d math. To get the normal I just picked two neighboring points (three including the current `frag_coord`) and made sure the normal would point above the surface.
The points currently are:
```
a*******
bc******
********
```
With `(1.0, 0.0)` and `(0.0, 1.0)` the points would be:
```
ab******
c*******
********
```
I am not sure what would be the best method of selecting points to calculate the normals. If there is a more standardized way of picking the points, we can switch to that.
Any help with improving the water is greatly appreciated.
Might need to change the shader to match your Rust `wave_normal`:
https://github.com/Neopallium/bevy_water/blob/e36732668e2077591a8b965bc6d81a66c363c3ed/src/param.rs#LL56C1-L56C1
The ocean example has debug lines (use the `debug` feature) that could be used to help check the Rust normals against the shader normals. I haven't found a good method to debug normals in shaders (can't just draw lines, need an extra shader/material).
Usually the idea when calculating the surface normal is to find a vector perpendicular to the surface at that point. You can approximate this by using finite differences to get the tangents and a cross product to find a perpendicular vector. That's why I was confused. However, I didn't see any difference either, and I also haven't written a lot of shader code either.
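For reference, the finite-difference-plus-cross-product construction can be sketched outside the shader. This is only illustrative: `wave_height` below is a stand-in for the real `get_wave_height`, and the point is that the resulting normal matches the shader's `vec3(height - height_dx, delta, height - height_dz)` form up to scale.

```rust
// Stand-in for get_wave_height; any smooth height field works here.
fn wave_height(x: f32, z: f32) -> f32 {
    (x * 0.7).sin() * 0.5 + (z * 0.9).cos() * 0.5
}

fn cross(a: [f32; 3], b: [f32; 3]) -> [f32; 3] {
    [
        a[1] * b[2] - a[2] * b[1],
        a[2] * b[0] - a[0] * b[2],
        a[0] * b[1] - a[1] * b[0],
    ]
}

fn normalize(v: [f32; 3]) -> [f32; 3] {
    let len = (v[0] * v[0] + v[1] * v[1] + v[2] * v[2]).sqrt();
    [v[0] / len, v[1] / len, v[2] / len]
}

// Approximate the tangents along x and z with forward differences, then
// take their cross product. The ordering (tangent_z x tangent_x) makes
// the y component positive, so the normal points above the surface.
fn surface_normal(x: f32, z: f32, delta: f32) -> [f32; 3] {
    let h = wave_height(x, z);
    let tangent_x = [delta, wave_height(x + delta, z) - h, 0.0];
    let tangent_z = [0.0, wave_height(x, z + delta) - h, delta];
    normalize(cross(tangent_z, tangent_x))
}

fn main() {
    let n = surface_normal(1.0, 2.0, 0.1);
    // Expanding the cross product gives a vector proportional to
    // (h - h_dx, delta, h - h_dz), the same shape as the shader code.
    println!("normal = {:?}", n);
}
```

Working out the cross product by hand gives `(-delta * dhx, delta * delta, -delta * dhz)`, which after dividing by `delta` is exactly the `(height - height_dx, delta, height - height_dz)` vector the shader builds, so the two formulations agree.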
Afaik the usual method is to use the color output to store the normals, but I can't really make any sense of that when looking at it.
I have a couple of minor simplifications for the shader if you want.
> Afaik the usual method is to use the color output to store the normals, but I can't really make any sense of that when looking at it.
Yeah, I have tried that with a primitive mesher (generated mesh), but it isn't very easy to understand.
When I have time I will try creating something like the bevy wireframe system that can render normals as lines over the surface (not just at the vertex points).
Go ahead and switch the normal calculation to use the x/y axis, so it matches your `wave_normal` Rust method.
One downside to using SystemParam is that the settings are global. In the future it would be useful to support different water surfaces (inland lakes/rivers vs open oceans) where the settings will be different.
> When I have time I will try creating something like the bevy wireframe system that can render normals as lines over the surface (not just at the vertex points).
Maybe a quick way to debug them is to use the DebugLines and draw them in a grid across each water quad entity, use the wave_normal method to get the end point for each line.
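A rough sketch of what building that grid could look like, kept independent of any particular debug-line API: it just produces the `(start, end)` pairs the lines would be drawn from. The `wave_height`/`wave_normal` functions here are stand-ins for the Rust-side ones, and the quad layout (`[0, size]` on x/z) is assumed.

```rust
// Stand-ins for the Rust-side wave functions (not the real bevy_water ones).
fn wave_height(x: f32, z: f32) -> f32 {
    ((x + z) * 0.5).sin() * 0.3
}

fn wave_normal(x: f32, z: f32) -> [f32; 3] {
    let d = 0.1;
    let h = wave_height(x, z);
    let v = [h - wave_height(x + d, z), d, h - wave_height(x, z + d)];
    let len = (v[0] * v[0] + v[1] * v[1] + v[2] * v[2]).sqrt();
    [v[0] / len, v[1] / len, v[2] / len]
}

// Build one debug-line segment per grid point over a quad spanning
// [0, size] on x and z: start on the wave surface, end offset along
// the normal. These pairs could then be fed to whatever line-drawing
// facility is available.
fn normal_grid(size: f32, steps: u32, line_len: f32) -> Vec<([f32; 3], [f32; 3])> {
    let mut lines = Vec::new();
    for i in 0..=steps {
        for j in 0..=steps {
            let x = size * i as f32 / steps as f32;
            let z = size * j as f32 / steps as f32;
            let start = [x, wave_height(x, z), z];
            let n = wave_normal(x, z);
            let end = [
                start[0] + n[0] * line_len,
                start[1] + n[1] * line_len,
                start[2] + n[2] * line_len,
            ];
            lines.push((start, end));
        }
    }
    lines
}

fn main() {
    // steps = 4 gives a 5 x 5 grid of sample points.
    let lines = normal_grid(10.0, 4, 0.5);
    println!("{} segments", lines.len());
}
```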
I also made the wave height modifiable.
I just noticed that large wave heights are somewhat ugly. This is fixed by setting `WATER_GRID_SIZE` to 1. After fixing that the mesh has some artifacts when looking towards the sun in the ocean example.
I have a patch that uses the mesh generated by shape::Plane but didn't want to cram it in this PR.
I thought about the `wave_height` change. Maybe setting it only per Material is a better choice. If you want me to remove it just tell me.
> I thought about the `wave_height` change. Maybe setting it only per Material is a better choice. If you want me to remove it just tell me.
For now it is ok to keep it global. Using per-material settings would make the Rust-side wave calculation difficult. Most likely we will need to add a simple ray-cast system (could render a low-res top-down view of just the water entities, with the color being the entity ID) to find the water entity below a 3d point.
The global settings and Rust `wave_height` can be limited to just one "ocean"; any customized inland water will just not support the Rust height for now.
> I also made the wave height modifiable.
Might be better to call it scale or amplitude.
> I just noticed that large wave heights are somewhat ugly. This is fixed by setting `WATER_GRID_SIZE` to 1. After fixing that the mesh has some artifacts when looking towards the sun in the ocean example.
With `wave_height: 5.0` the Rust-side `wave_height()` seems to be way off, but that might just be an amplification of the wave error I was seeing before.
> I have a patch that uses the mesh generated by `shape::Plane` but didn't want to cram it in this PR.
I am fine with including the switch to using a `Plane`; it was added in Bevy 0.10.
The difference between the Rust and shader wave code doesn't seem to come from the time. Pausing the global Bevy time still leaves a difference between the two, even without the `wave_height` change.
I checked it with `shape::UVSphere { radius: 0.1, ..default() }` and `wave_height = 4.0`. Seems to be on the order of ~0.2 on my side, not too noticeable for larger meshes.
Finally found what is causing the difference between the shader & Rust wave height. The mesh doesn't have many vertex points: the shader calculates the height at the vertex, and the GPU smooths the height (`frag_coord`) between vertex points before calling the fragment shader. If we used very small triangles for the mesh, the shader and Rust code would become very close, but that would hurt performance. It should be possible to use LOD meshes to help with this, with dense meshes close to the camera.
Changing the `WATER_QUAD_SIZE` to 1 improves it, but causes some weird breaks in the water surface. This seems to be a render issue. I think the PBR shader code doesn't like something about the generated height/normals.
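The vertex-interpolation error described above can be illustrated with a toy one-dimensional wave (the real `get_wave_height` is more complex, but the effect is the same): the GPU linearly blends heights between vertices, so at the midpoint of a long edge the interpolated value misses the curvature of the wave.

```rust
// Toy stand-in for the wave: a plain sine, evaluated along one axis.
fn wave_height(x: f32) -> f32 {
    x.sin()
}

fn main() {
    // Two mesh vertices 2 units apart, like a coarse water quad edge.
    let (x0, x1) = (0.0_f32, 2.0_f32);
    let mid = (x0 + x1) / 2.0;

    // What the GPU hands the fragment shader at the edge midpoint:
    // a linear blend of the two vertex heights.
    let interpolated = (wave_height(x0) + wave_height(x1)) / 2.0;
    // What the Rust-side analytic calculation returns at that point.
    let exact = wave_height(mid);

    println!("interpolated = {interpolated:.3}, exact = {exact:.3}");
    // With 2-unit spacing the gap is large; repeating the same midpoint
    // check with 0.1-unit spacing makes it tiny -- which is why denser
    // meshes (or LODs near the camera) track the analytic height better.
}
```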
> I checked it with `shape::UVSphere { radius: 0.1, ..default() }` and `wave_height = 4.0`. Seems to be on the order of ~0.2 on my side, not too noticeable for larger meshes.
Cool, I haven't tried the water on a sphere before.
I pushed a few more changes to the water shader. Just updating the `y` position in the fragment shader using the `wave_height`. I am not sure if this is better or not. Maybe the `frag_coord` needs to be updated too, but I am not sure that can be done in the fragment shader.
I also plan on adding more PBR fields (copied from the StandardMaterial code). Opened issue #10 for that.
I thought about two more possible improvements for the normal calculation in the fragment shader.
- ~~Sampling directly instead of using the interpolated value from the vertex shader. Can only see a difference when using the second change.~~ Nevermind.

  ```wgsl
  let height = get_wave_height(in.world_position.xz);
  ```
- Reducing the step size in the normal calculation. Lower values increase 'granularity'. Gives interesting results for different values.

  ```wgsl
  let delta = 0.5;
  let height_dx = get_wave_height(w_pos + vec2<f32>(delta, 0.0));
  let height_dz = get_wave_height(w_pos + vec2<f32>(0.0, delta));
  let normal = normalize(vec3<f32>(height - height_dx, delta, height - height_dz));
  ```
> I pushed a few more changes to the water shader. Just updating the `y` position in the fragment shader using the `wave_height`.
I think using height directly for the world y coordinate might break easily, for example if GlobalTransform is not trivial.
I think there is something wrong with the coordinate frames. For example, we are using the normal in the local frame and passing it to PBR, which expects a normal in the global reference frame. Also, we are using the global position to calculate the normal (which probably should be calculated in the local frame).
I might make an example with how this fails when you modify Transform tomorrow.
By "local frame" you mean local to the mesh? And "global frame" you mean the world?
The normals and wave height should all be done in the global (world) reference. If the wave and normals were calculated using the mesh's local reference, there would be hard seams/breaks between the water tiles.
The normals need to be in the global reference. Normally the vertex shader converts the mesh/vertex normals to global before they are passed to the fragment shader.
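The seam argument can be checked with a tiny sketch (the wave function and tile layout below are made up): two adjacent tiles agree along their shared edge only if both evaluate the wave at the same world coordinate, while sampling in per-tile local coordinates leaves a visible gap at the boundary.

```rust
// Made-up world-space wave function; only its continuity matters here.
fn wave_height_world(x: f32, z: f32) -> f32 {
    (x * 0.3).sin() + (z * 0.4).cos()
}

fn main() {
    let tile_size = 10.0_f32;
    // Tile A's origin is at world x = 0, tile B's at world x = 10.
    // Their shared edge (world x = 10) is local x = 10 for A but
    // local x = 0 for B.
    let a_origin = 0.0_f32;
    let b_origin = tile_size;
    let z = 3.0_f32;

    // Sampling in world space: both tiles compute the same height
    // at the shared edge, so the surface is continuous.
    let from_a = wave_height_world(a_origin + 10.0, z);
    let from_b = wave_height_world(b_origin + 0.0, z);
    assert_eq!(from_a, from_b);

    // Sampling in *local* space instead would evaluate the wave at
    // local x = 10 on one side and local x = 0 on the other,
    // producing a hard seam at the tile boundary.
    let local_a = wave_height_world(10.0, z);
    let local_b = wave_height_world(0.0, z);
    println!("seam gap if sampled locally: {}", (local_a - local_b).abs());
}
```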
> I might make an example with how this fails when you modify `Transform` tomorrow.
If you can provide an example or just bits of code, I will try to help debug it. There could still be some issues with the water shader or how it is passing info to the PBR shader.
Maybe this is caused by my last commits, which update the y position in the fragment shader. We can remove that if needed.
We can revert that change if needed. It didn't make a major improvement.
Maybe I just have the wrong idea about what the shader should do.
If you make it local you could do things like use it for spheres or cubes.
You would not lose anything, and the current variables would just be integrated into the `Transform`.
It also behaves well with regards to Camera transforms, i.e. transforming the camera is equivalent to the inverse transform for the water tile.
I have no idea how increased accuracy in the world position affects PBR rendering. Out of interest, do you have any references regarding PBR I could look at?
> Maybe I just have the wrong idea about what the shader should do.
I had to research water shaders to create the original. I can't remember where the original Godot tutorial is that I first looked at, but here are two I just found that seem very good. They even have some ideas that can be used to improve the water shader.
https://www.youtube.com/watch?v=7L6ZUYj1hs8 https://www.youtube.com/watch?v=XjCh2cN3Mfg
Note that Godot's shader language is GLSL, which is different from WGSL. Also, each engine provides different globals (`TIME` vs Bevy's `globals.time`).
> If you make it local you could do things like use it for spheres or cubes. You would not lose anything and the current variables would just be integrated into `Transform`.
I haven't tried the shader on anything other than a flat plane. I will try it out.
Ah, thinking about it right now, it most likely is a problem with only applying the wave height to the y position (doesn't matter if it is local/global). For 3d shapes, need to apply the wave height in the direction of the vertex normal. Right now we are completely ignoring the mesh normals, since they are always pointing up (along the y-axis).
Something to try:
```wgsl
let world_normal = mesh_normal_local_to_world(vertex.normal);
out.world_position = world_position + (world_normal * height);
```
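The same idea in plain Rust, as a sketch (the function name and types here are illustrative, not the shader's): offset each vertex along its unit world normal by the wave height, instead of only bumping y. For a flat plane (normal = +Y) this reduces to the old y-only offset; for spheres and cubes it pushes the vertex outward.

```rust
// Displace a vertex along its unit normal by the wave height.
fn displace(position: [f32; 3], unit_normal: [f32; 3], height: f32) -> [f32; 3] {
    [
        position[0] + unit_normal[0] * height,
        position[1] + unit_normal[1] * height,
        position[2] + unit_normal[2] * height,
    ]
}

fn main() {
    // Flat plane: same result as adding the height to y directly.
    let p = displace([1.0, 0.0, 2.0], [0.0, 1.0, 0.0], 0.5);
    assert_eq!(p, [1.0, 0.5, 2.0]);

    // A vertex on the +X face of a cube moves outward along x instead.
    let q = displace([1.0, 0.0, 0.0], [1.0, 0.0, 0.0], 0.5);
    assert_eq!(q, [1.5, 0.0, 0.0]);

    println!("plane: {:?}, cube face: {:?}", p, q);
}
```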
> It also behaves well with regards to Camera transforms, i.e. transforming the camera is equivalent to the inverse transform for the water tile.
The water shader should be working ok with the Camera transforms.
Most likely we also need to pass the `world_normal` from the vertex shader to the fragment shader, to be used when calculating the fine-detail normals in the frag shader. The GPU will blend the values passed from the vertex to the fragment shader (useful for a sphere, to smooth the normal over the surface; not sure how it will help cubes at the edges).
> Ah, thinking about it right now, it most likely is a problem with only applying the wave height to the y position (doesn't matter if it is local/global). For 3d shapes, need to apply the wave height in the direction of the vertex normal. Right now we are completely ignoring the mesh normals, since they are always pointing up (along the y-axis).
Yeah that works.
I think I got it working for cubes, but there is still something missing for smooth surfaces. You generally need access to a local (about a point) tangent coordinate system, so you know what the local xz plane is.
How are you getting the 2d position for the `wave_height` function for non-flat meshes? With a cube it should work kind of like the plane tiling, though it still might have issues at the corners.
I just remembered a Youtube video about procedurally generating terrain for a sphere (planet). He had to use "Triplanar Mapping" to smoothly apply the terrain generator to the surface of the sphere. I think it was this series of videos: https://youtu.be/QN39W020LqU Also just found this one that looks interesting and also talks about "Triplanar mapping": https://www.youtube.com/watch?v=rNuDkDhadfU
I will need to watch those videos again. Might have remembered the details wrong.
For non-flat surfaces, we can either use a flag to enable something like the "Triplanar mapping", or provide two water shaders.
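A minimal sketch of the triplanar idea, under stated assumptions: the `wave_height` function is made up, and the blend simply weights the three axis-aligned planar samples by the absolute components of the surface normal (real triplanar implementations often sharpen these weights, and the videos may use a fancier scheme).

```rust
// Made-up 2d wave function; stands in for sampling on one plane.
fn wave_height(u: f32, v: f32) -> f32 {
    (u * 0.5).sin() * (v * 0.5).cos()
}

// Sample the wave on the three axis-aligned planes and blend by the
// absolute normal components, normalized so the weights sum to 1.
fn triplanar_height(pos: [f32; 3], normal: [f32; 3]) -> f32 {
    let w = [normal[0].abs(), normal[1].abs(), normal[2].abs()];
    let sum = w[0] + w[1] + w[2];
    // Plane facing x uses (z, y); facing y uses (x, z); facing z uses (x, y).
    (wave_height(pos[2], pos[1]) * w[0]
        + wave_height(pos[0], pos[2]) * w[1]
        + wave_height(pos[0], pos[1]) * w[2])
        / sum
}

fn main() {
    // A point on top of a sphere (normal straight up) reduces to the
    // plain xz-plane sample, i.e. the same as the flat water plane.
    let flat = triplanar_height([1.0, 5.0, 2.0], [0.0, 1.0, 0.0]);
    assert_eq!(flat, wave_height(1.0, 2.0));
    println!("top-of-sphere height = {flat}");
}
```

The appeal is that faces and smoothly varying normals both fall out of the same formula: on a cube each face picks one planar sample, while on a sphere the weights shift gradually, avoiding hard switches at the corners.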
Other ideas:
- Use a texture to provide the 2d wave position and use the vertex UV coords to look up the 2d point for the wave function. This would be a cheap way to bake in a mapping for 3d surfaces.
- Take multiple nearby wave samples and blend the height. A `get_wave_height_3d(pos1, pos2, pos3)` which calls `get_wave_height` 3 times.
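The hypothetical `get_wave_height_3d` could be as simple as averaging the three samples. The blend below is only a guess at what was meant, since the exact weighting isn't specified, and the `get_wave_height` stand-in is made up.

```rust
// Made-up stand-in for the existing 2d wave sampler.
fn get_wave_height(p: [f32; 2]) -> f32 {
    (p[0] * 0.4).sin() + (p[1] * 0.6).cos()
}

// Sketch of the proposed helper: blend three nearby 2d samples.
// A plain average is the simplest possible blend; weighted variants
// (e.g. by normal components) would slot in the same way.
fn get_wave_height_3d(pos1: [f32; 2], pos2: [f32; 2], pos3: [f32; 2]) -> f32 {
    (get_wave_height(pos1) + get_wave_height(pos2) + get_wave_height(pos3)) / 3.0
}

fn main() {
    let h = get_wave_height_3d([0.0, 0.0], [1.0, 0.0], [0.0, 1.0]);
    println!("blended height = {h}");
}
```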
I think it was this video that I had watched: https://www.youtube.com/watch?v=lctXaT9pxA0 The other series by the same person is for Unity. This video is more general.
Thanks, will have to take a look.
> How are you getting the 2d position for the `wave_height` function for non-flat meshes? With a cube it should work kind of like the plane tiling, though it still might have issues at the corners.
I thought about using the generated tangents (`vertex.tangent` and the bitangent) to somehow get the plane, but I think that might lead nowhere.
I thought it might be easy to implement, but I'm not so sure now. The whole idea doesn't look too useful to me anyway.