Makie.jl
Textured meshscatter
In response to https://discourse.julialang.org/t/3d-bars-makie-colors-and-meshes/70283.
This enables passing textures and per-instance/marker `uv_scale`s to `meshscatter` with GLMakie. Combined with the trick from #1368 you can do:
```julia
using GLMakie, GeometryBasics, ColorSchemes

texture = reshape(get(colorschemes[:Spectral_11], 0:0.01:1), 1, 101)
rectMesh = FRect3D(Vec3f0(-0.5, -0.5, 0), Vec3f0(1, 1, 1))
recmesh = GeometryBasics.normal_mesh(rectMesh)
uvs = [Point2f(p[3], 0) for p in coordinates(recmesh)] # normalize this so zmax = 1
recmesh = GeometryBasics.Mesh(
    meta(coordinates(recmesh); normals = normals(recmesh), uv = uvs),
    faces(recmesh)
)
pos = [Point3f(i, j, 0) for i in 1:10 for j in 1:10]
z = rand(10, 10)
fig = Figure(resolution = (1200, 800), fontsize = 26)
ax = Axis3(fig[1, 1]; aspect = (1, 1, 1), elevation = π / 6, perspectiveness = 0.5)
meshscatter!(
    ax, pos, marker = recmesh, markersize = Vec3f.(1, 1, z[:]),
    uv_scale = Vec2f.(1, z[:]), color = texture, shading = false
)
ax.limits[] = ((0, 11), (0, 11), (0, 1.2))
fig
```

don't forget this PR, it's a nice addition 😄
#1436 would also add this, but it'll take me some time to figure out what each shader/plot primitive should include
since https://github.com/JuliaPlots/Makie.jl/pull/1436 was closed, any chance of having this?
Hey, just wanted to bump this - I'm working on some plots using meshscatter and this would definitely be nice to have!
Does this work in RPRMakie out of the box by any chance? (or can it be made to?)
Hey, wanted to pick this up again. Is there anything stopping this from being merged?
I think this was in a working/mostly done state, so you probably just need to work out the merge conflicts.
bump
The branch should be up to date with master again.
It seems that something about `coordinates(mesh)` broke in GeometryBasics, so the example now needs to be:
```julia
using GeometryBasics, ColorSchemes, GLMakie

texture = reshape(get(colorschemes[:Spectral_11], 0:0.01:1), 1, 101)
prim = FRect3D(Vec3f0(-0.5, -0.5, 0), Vec3f0(1, 1, 1))
rectmesh = GeometryBasics.normal_mesh(prim)
uvs = [Point2f(p[3], 0) for p in coordinates(rectmesh)] # normalize this so zmax = 1
rectmesh = GeometryBasics.Mesh(
    meta(collect(coordinates(prim)); normals = normals(rectmesh), uv = uvs),
    faces(rectmesh)
)
pos = [Point3f(i, j, 0) for i in 1:10 for j in 1:10]
z = rand(10, 10)
fig = Figure(resolution = (1200, 800), fontsize = 26)
ax = Axis3(fig[1, 1]; aspect = (1, 1, 1), elevation = π / 6, perspectiveness = 0.5)
meshscatter!(
    ax, pos, marker = rectmesh, markersize = Vec3f.(1, 1, z[:]),
    uv_scale = Vec2f.(1, z[:]), color = texture, shading = false
)
ax.limits[] = ((0, 11), (0, 11), (0, 1.2))
fig
```
@SimonDanisch Do you think it would be a good idea to generalize uv_scale to a uv_transform::Mat3f? That would encode rotation + translation + scale just like a model matrix, but for uv coordinates.
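For illustration, here is a plain-Julia sketch (Base matrices standing in for `Mat3f`/`Vec2f`, helper names made up) of how a 3x3 homogeneous matrix could encode scale, rotation, and translation for uv coordinates, just like a model matrix:

```julia
# Homogeneous 3x3 uv transforms, acting on (u, v, 1).
# These helpers are illustrative, not Makie API.
rotation(θ)       = [cos(θ) -sin(θ) 0; sin(θ) cos(θ) 0; 0 0 1]
scaling(sx, sy)   = [sx 0 0; 0 sy 0; 0 0 1]
translation(x, y) = [1 0 x; 0 1 y; 0 0 1]

# Apply a transform to a uv pair via the homogeneous coordinate trick.
apply(M, uv) = (M * [uv[1], uv[2], 1.0])[1:2]

# Scale u by 2, then shift u by 0.25, composed like a model matrix:
M = translation(0.25, 0.0) * scaling(2.0, 1.0)
apply(M, (0.5, 0.5))         # == [1.25, 0.5]
apply(rotation(π), (1.0, 0.0))  # ≈ [-1, 0] (rotation about the uv origin)
```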
Compile Times benchmark
Note that these numbers may fluctuate on the CI servers, so take them with a grain of salt. All benchmark results are based on the mean time, and negative percentages mean faster than the base branch. Note that GLMakie + WGLMakie run on an emulated GPU, so the runtime benchmark is much slower. Results are from running:
```julia
using_time = @ctime using Backend
# Compile time
create_time = @ctime fig = scatter(1:4; color = 1:4, colormap = :turbo, markersize = 20, visible = true)
display_time = @ctime Makie.colorbuffer(display(fig))
# Runtime
create_time = @benchmark fig = scatter(1:4; color = 1:4, colormap = :turbo, markersize = 20, visible = true)
display_time = @benchmark Makie.colorbuffer(display(fig))
```
| | using | create (compile) | display (compile) | create (runtime) | display (runtime) |
|---|---|---|---|---|---|
| GLMakie | 4.35s (4.31, 4.37) 0.02+- | 105.42ms (103.33, 109.95) 2.38+- | 538.58ms (531.60, 544.01) 4.94+- | 8.93ms (8.80, 9.05) 0.11+- | 25.76ms (25.57, 25.98) 0.15+- |
| master | 4.34s (4.31, 4.41) 0.04+- | 105.28ms (103.27, 107.98) 1.49+- | 537.01ms (531.16, 552.18) 7.36+- | 8.89ms (8.79, 8.97) 0.06+- | 25.79ms (25.68, 26.01) 0.14+- |
| evaluation | 1.00x invariant, 0.0s (0.08d, 0.88p, 0.03std) | 1.00x invariant, 0.14ms (0.07d, 0.90p, 1.94std) | 1.00x invariant, 1.57ms (0.25d, 0.65p, 6.15std) | 1.00x invariant, 0.04ms (0.46d, 0.41p, 0.09std) | 1.00x invariant, -0.03ms (-0.18d, 0.74p, 0.14std) |
| CairoMakie | 3.96s (3.95, 3.99) 0.01+- | 107.69ms (105.87, 109.22) 1.24+- | 132.82ms (130.67, 136.46) 2.02+- | 8.92ms (8.37, 9.17) 0.27+- | 979.43μs (969.72, 986.73) 5.62+- |
| master | 3.95s (3.92, 3.99) 0.03+- | 107.77ms (106.29, 109.44) 1.11+- | 132.54ms (131.51, 134.86) 1.15+- | 9.17ms (8.86, 9.32) 0.15+- | 987.13μs (975.94, 1003.54) 9.43+- |
| evaluation | 1.00x invariant, 0.01s (0.64d, 0.26p, 0.02std) | 1.00x invariant, -0.07ms (-0.06d, 0.91p, 1.18std) | 1.00x invariant, 0.28ms (0.17d, 0.76p, 1.59std) | 1.03x invariant, -0.25ms (-1.14d, 0.06p, 0.21std) | 1.01x invariant, -7.7μs (-0.99d, 0.09p, 7.52std) |
| WGLMakie | 4.54s (4.51, 4.59) 0.03+- | 106.87ms (104.24, 116.69) 4.37+- | 9.07s (8.99, 9.17) 0.07+- | 9.49ms (9.18, 10.54) 0.48+- | 71.83ms (71.14, 72.62) 0.51+- |
| master | 4.54s (4.51, 4.61) 0.03+- | 106.16ms (104.97, 107.40) 0.89+- | 9.07s (9.03, 9.10) 0.03+- | 9.94ms (9.26, 10.98) 0.76+- | 70.60ms (69.88, 71.61) 0.66+- |
| evaluation | 1.00x invariant, 0.0s (0.16d, 0.77p, 0.03std) | 0.99x invariant, 0.71ms (0.23d, 0.69p, 2.63std) | 1.00x invariant, 0.01s (0.09d, 0.87p, 0.05std) | 1.05x invariant, -0.45ms (-0.71d, 0.21p, 0.62std) | 0.98x slower X, 1.23ms (2.08d, 0.00p, 0.59std) |
consider allowing rotation/generalizing to Mat{2, 3, Float32} and encoding the uv = (1.0 - uv.y, uv.x) here
@SimonDanisch What do you think about this? I.e. replacing the `o_uv = vec2(1.0 - tex_uv.y, tex_uv.x);` we have in a couple of shaders with:

```julia
# attribute default
uv_transform = Mat{2, 3, Float32}(0.0, 1.0, -1.0, 0.0, 1.0, 0.0)
# 2×3 StaticArraysCore.SMatrix{2, 3, Float32, 6} with indices SOneTo(2)×SOneTo(3):
#  0.0  -1.0  1.0
#  1.0   0.0  0.0
```

```glsl
// shader
o_uv = uv_transform * vec3(tex_uv, 1.0);
```
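As a plain-Julia sanity check (a Base matrix standing in for `Mat{2, 3, Float32}`), the proposed default reproduces the existing `(1.0 - v, u)` shader mapping:

```julia
# Base-Julia stand-in for the proposed 2x3 uv_transform default.
# Multiplying by the homogeneous vector (u, v, 1) should reproduce
# the current shader mapping uv -> (1 - v, u).
uv_transform = [0.0 -1.0 1.0;
                1.0  0.0 0.0]

apply(M, uv) = M * [uv[1], uv[2], 1.0]

apply(uv_transform, (0.25, 0.75))  # == [1 - 0.75, 0.25] == [0.25, 0.25]
```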
I thought about adding `uv_remap = uv -> reverse(uv)`, since that's more flexible, and I don't think we have a high need to run this on the shader?
It could also have a few predefined ones like (uv_remap=flipx) or so
Ah I guess you'd need this to upload them per instance for meshscatter... We could allow function and matrix?
The current version of this is effectively the same as uv_offset_width in scatter/text. Some form of this is necessary if you want to do texture mapping with meshscatter like we do for text, e.g. for Tyler. For surface which generates uv's in the shader it'd also be necessary if you want to manipulate the uvs. For mesh... maybe there is a use case in animated textures?
I don't really get the point of passing a `Function`. When would you ever want to do more than a 2x3 matrix allows for (which covers scaling, translation, rotation, mirroring, swapping axes, etc.)? This isn't particularly expensive either: memory is negligible since it's per instance or constant, and the performance difference is negligible as well, roughly 10 ops for the Mat23 version vs ~7 ops for the current version vs ~5 ops on master per vertex shader run (`uv_scale` already exists).
When would you ever want to do more than a 2x3 Matrix allows for
I don't really, but it's much easier to flip a coordinate or reverse it than constructing the corresponding matrix.
So it's a usability feature to have e.g. uv_mapping=reverse.
I'm open to other solutions, but I'd love to have a solution that is both generic and user friendly.
I'm happy to have an enum for the most common operations that either gets converted to a matrix or gets handled in the shader directly (it could be more performant to flip something in the shader, or to do no transformation at all, than to always multiply by a 2x3 matrix).
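To illustrate the alias/enum idea, a plain-Julia sketch that lowers a few named uv operations to 2x3 matrices before they reach the backend; all names here are hypothetical, not Makie API:

```julia
# Hypothetical lowering of named uv operations to 2x3 matrices, so a user
# could write `uv_transform = :flip_x` while the backend only sees a matrix.
const UV_OPS = Dict(
    :identity => [1.0 0.0 0.0; 0.0 1.0 0.0],
    :flip_x   => [-1.0 0.0 1.0; 0.0 1.0 0.0],  # u -> 1 - u
    :flip_y   => [1.0 0.0 0.0; 0.0 -1.0 1.0],  # v -> 1 - v
    :swap_xy  => [0.0 1.0 0.0; 1.0 0.0 0.0],   # (u, v) -> (v, u)
)

to_uv_matrix(op::Symbol) = UV_OPS[op]
to_uv_matrix(M::AbstractMatrix) = M  # already a matrix: pass through

to_uv_matrix(:flip_x) * [0.25, 0.5, 1.0]  # == [0.75, 0.5]
```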
I think worrying about the performance of a matrix multiplication is one of those premature optimization traps. With SIMD, an `if enum == option1 ...` branch has a tough battle to fight...
I also don't think this is a particularly user-facing feature. It seems more like something a power user may rarely need, at least if we handle defaults well. But I also think we can make this fairly simple by just providing some good functions for generating matrices, or maybe by making UVTransform just magical enough (e.g. pre-applying permutations and mirroring).
Also the current solutions to fixing issues with uvs/textures are still just as accessible (manipulating the image or the mesh)
With SIMD, an `if enum == option1 ...` branch has a tough battle to fight...
I thought we'd just {{interpolate}} it into the shader like we do with buffers etc, so that we don't always have to pay the overhead.
Our meshscatter is in fact already quite slow, so I'm happy to add the shader interpolation when performance problems actually come up, which may then need some more tricks anyway.
I also don't think this is a particularly user-facing feature.
I thought you were planning to add this to other plot functions? E.g. flipping images is quite common: `image(img; uv_transform = flipx)`. Although for `image` I probably wouldn't like to call it `uv_transform`...?
Anyways, as I said I'm quite happy to have other solutions, e.g. making UVTransform more magical ;)
I'd add aliases though for the most common operations, just like the enum ;)
If this is ballooning this PR, I'm happy to cut down on usability for now, and just keep those things in mind to be added another time.
If we do switch to meshscatter for Tyler, I highly recommend keeping the current per-tile placing API - I would need individual meshes with a dense tessellation for Tyler to work seamlessly with GeoMakie transforms (which are not translatable to the GPU).
@asinghvi17 this should be discussed at Tyler.jl?
I was planning on mesh and surface because they already had uv_scale. I didn't plan for heatmap and image, but since image relies on the mesh pipeline (I think?) I guess I should add it there too. Shouldn't be much work
In terms of bloating I'm more worried about WGLMakie because that doesn't allow textured meshscatter as far as I can tell.
Fair, will open an issue over there to track whether we want the change or not.
As for UV modification, I agree that complex changes (setting UV based on some vertex computation for example) should be done by directly changing the mesh UVs, and we can point to that in the docs.
I can see some value in having direct functions or enums - I could use UV remappers in GeoMakie's meshimage recipe, to preserve transformation fidelity/locality when zooming in (at least in 2D axes). I'd only need offset + scale for that, though.
The reversing / flipping usecase is also pretty useful if you have a super large image (say 5000x10000) and want to plot that as an image without issues. The current approach means you have to rotr90 such an image, which is already in memory and thus might actually overload your RAM if you do this for a sufficiently large image. You could modify mesh UVs yourself, which is not an end user thing, to display it correctly in Makie.
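The uv-level alternative to `rotr90` can be sanity-checked in plain Julia: a constant 2x3 matrix replaces a full rotated copy of the image. The nearest-neighbour `sample` helper below is a made-up stand-in for a GPU texture lookup:

```julia
# Sampling through a uv transform reproduces rotr90 without copying the image.
img = reshape(1:12, 3, 4)  # stand-in for a large image

# Nearest lookup with uvs in 0..1 (illustrative, not Makie API)
sample(A, u, v) = A[clamp(round(Int, u * (size(A, 1) - 1)) + 1, 1, size(A, 1)),
                    clamp(round(Int, v * (size(A, 2) - 1)) + 1, 1, size(A, 2))]

# (u, v) -> (1 - v, u): the uv-space equivalent of rotr90, as a 2x3 matrix
M = [0.0 -1.0 1.0;
     1.0  0.0 0.0]

rot = rotr90(img)  # the copying approach
for u in 0:0.5:1, v in 0:0.5:1
    u′, v′ = M * [u, v, 1.0]
    @assert sample(rot, u, v) == sample(img, u′, v′)
end
```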
That being said, some of my usecases do have ~300,000 to ~1,000,000 vertices, so I'm not sure how inefficient it would be, relatively, to go with simple matrix multiplication, as opposed to some enum (or subtype + interface) approach. The advantage of a subtype + interface thing is that you could theoretically encode arbitrarily complex transformations so long as they were written in OpenGL - not sure if that's a thing we want to support, but that is a possibility.
@ffreyer I found a few discourse posts indicating that you could use a shader-level UV modification, same as OpenGL, to be able to do this in Three.js:
https://discourse.threejs.org/t/how-to-change-texture-per-instance-in-instancedmesh/36304 https://discourse.threejs.org/t/how-to-set-different-textures-on-each-instancedmesh/29433 https://discourse.threejs.org/t/how-to-apply-offsets-for-texture-atlas-in-instancedmesh/33191/4 https://jsfiddle.net/prisoner849/g2cpv675/
There are a lot more accessible with a quick google, this was just the few I could find which seemed reasonable to me.
ahh... for this one, the most useful thing I was thinking of back then was a 3D colorbar for surface plots using RPRMakie :D - now it's so much more. Hopefully some version of this will land in master now 😄.
Maybe this is also a good point to reassess the default orientation of images in plots. Calling `image(cow)` and equivalents, we have two groups on master:
Image, Heatmap, and Surface (in GLMakie) produce:
while mesh and meshscatter (with a Rect2f) produce:
Should I just reproduce these or do we want to change them?
Should I just reproduce these or do we want to change them?
Lets reproduce them for now so we can get this in the next non breaking release, and then make a small PR for the next breaking release to change the orientation :)
I dropped rotations from uv_transform because it's not clear to me how to handle them. For 0..1 uvs you'd need to translate them to -0.5..0.5, then rotate, then translate back to rotate the texture. For patterns you'd want them to just rotate. If you think about uvs directly, or about combining operations as uv_transform(op2) * uv_transform(op1) it also makes more sense for it to just rotate. So the common use case is rather at odds with all the less common ones.
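The two rotation conventions described above differ only by a conjugation with center translations. A plain-Julia sketch with 3x3 homogeneous matrices (helper names made up):

```julia
# 3x3 homogeneous uv matrices, acting on (u, v, 1).
rot(θ)      = [cos(θ) -sin(θ) 0; sin(θ) cos(θ) 0; 0 0 1]
trans(x, y) = [1 0 x; 0 1 y; 0 0 1]

# 0..1 texture uvs: rotate about the texture center (0.5, 0.5)
rotate_texture(θ) = trans(0.5, 0.5) * rot(θ) * trans(-0.5, -0.5)

# Patterns / composed uv transforms: just rotate about the uv origin
rotate_pattern(θ) = rot(θ)

corner = [1.0, 1.0, 1.0]
rotate_texture(π/2) * corner  # ≈ [0, 1, 1]: stays on the unit square
rotate_pattern(π/2) * corner  # ≈ [-1, 1, 1]: leaves the 0..1 range
```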
I also added a refimg now, testing a couple of transforms for mesh, surface, meshscatter and image:
| GLMakie Previous Version | GLMakie Surface Rotation fix |
|---|---|
| CairoMakie | WGLMakie |
These are:
| mesh | surface |
|---|---|
| meshscatter | image |
Notes:
- CairoMakie still uses nearest interpolation to determine vertex colors from textures, which is why the bottom-left example is off for everything but image
WGLMakie should be working now too.
For per-element uv_transforms in meshscatter I added the piece-by-piece cow as a refimg: