sceneform-android-sdk
Feature request: perfectly render a polygon on a Plane
I'm working on rendering a grid (or any Renderable) that fits perfectly to the edges of a plane. This is how it would work:
- Scan a surface using the Sceneform API. By default, Sceneform finds the boundary of a detected plane, so the edge of a tabletop can easily be found: wherever the scanning spotlight is cut short, a boundary has been detected. A plane always has boundaries; the "edge" in this case is the physical edge of, say, a tabletop.
- Invoke a procedure to render something that fits perfectly within the boundaries on the plane.
As of now, I think an implementation would be possible using RenderableDefinition.
I think this would be a nice feature for building a scene that reflects the real world more faithfully.
I have worked out how to build the RenderableDefinition. However, I have some issues in getPlaneVertices(Plane plane). As of now, I use some magical constants. My question is this: how do I dynamically set the position and the normal? And what about the orientation of planeRenderable?
```java
/* Invoked when a specific plane is tapped */
fragment.setOnTapArPlaneListener(
    (HitResult hitResult, Plane plane, MotionEvent motionEvent) -> {
      // List of vertices that make up the polygon
      List<Vertex> planeVertexList = getPlaneVertices(plane);
      // Dynamically build a new polygon to be rendered on the plane
      RenderableDefinition renderableDefinition =
          RenderableDefinition.builder()
              .setVertices(planeVertexList)
              .build();
      ModelRenderable.builder()
          .setSource(renderableDefinition)
          .build()
          .thenAccept(
              modelRenderable -> {
                planeRenderable = modelRenderable.makeCopy();
                // set material as well
              })
          .thenRun(
              () -> {
                // render planeRenderable here
              })
          .exceptionally(
              throwable -> {
                throw new CompletionException(throwable);
              });
    });
```
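One gap worth noting in the snippet above: as far as I can tell, a RenderableDefinition also needs a Submesh carrying triangle indices and a Material (via RenderableDefinition.Submesh.builder().setTriangleIndices(...).setMaterial(...)), since a vertex list alone does not describe any triangles. Because ARCore plane polygons are convex, a simple fan triangulation around the first vertex should suffice. A minimal, Android-free sketch (the class and method names are mine, not Sceneform's):

```java
import java.util.ArrayList;
import java.util.List;

public class FanTriangulation {
    // Triangulates a convex polygon with vertexCount vertices as a fan
    // around vertex 0, producing (vertexCount - 2) triangles.
    static List<Integer> fanIndices(int vertexCount) {
        List<Integer> indices = new ArrayList<>();
        for (int i = 1; i < vertexCount - 1; i++) {
            indices.add(0);
            indices.add(i);
            indices.add(i + 1);
        }
        return indices;
    }

    public static void main(String[] args) {
        // A quad (4 vertices) splits into 2 triangles: (0,1,2) and (0,2,3).
        System.out.println(fanIndices(4));
    }
}
```

The resulting list would then be handed to the Submesh builder before calling RenderableDefinition.builder().setSubmeshes(...).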
```java
/* Returns a list of vertices that make up the plane */
private List<Vertex> getPlaneVertices(Plane plane) {
  List<Vertex> list = new ArrayList<>();
  FloatBuffer planeFloatBuffer = plane.getPolygon();
  for (int i = 0; i < planeFloatBuffer.limit() - 1; i += 2) {
    Vertex.UvCoordinate uvCoordinate =
        new Vertex.UvCoordinate(planeFloatBuffer.get(i), planeFloatBuffer.get(i + 1));
    Vertex vertex =
        Vertex.builder()
            // magical constants here
            .setPosition(new Vector3(0f, 0f, 0f))
            .setNormal(new Vector3(0.5f, 0.5f, 0.5f))
            .setUvCoordinate(uvCoordinate)
            .build();
    list.add(vertex);
  }
  return list;
}
```
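On the magical constants: Plane.getPolygon() returns (x, z) pairs in the plane's local frame, so one way to get real positions is to lift each pair to (x, 0, z) and transform it with plane.getCenterPose().transformPoint(...); the normal is then the pose's up direction, available via getCenterPose().getYAxis(). Here is a pure-math sketch of what that pose transform does, assuming ARCore's quaternion-plus-translation convention (the class and helper names are hypothetical, for illustration only):

```java
public class PoseMath {
    // Rotates point p by unit quaternion q = (x, y, z, w), then translates
    // by t. This mirrors what a pose's transformPoint conceptually does:
    // v' = v + 2w(q x v) + 2(q x (q x v)).
    static float[] transformPoint(float[] q, float[] t, float[] p) {
        float x = q[0], y = q[1], z = q[2], w = q[3];
        // c = q.xyz x p
        float cx = y * p[2] - z * p[1];
        float cy = z * p[0] - x * p[2];
        float cz = x * p[1] - y * p[0];
        // r = p + 2 * (w*c + q.xyz x c)
        float rx = p[0] + 2f * (w * cx + (y * cz - z * cy));
        float ry = p[1] + 2f * (w * cy + (z * cx - x * cz));
        float rz = p[2] + 2f * (w * cz + (x * cy - y * cx));
        return new float[] { rx + t[0], ry + t[1], rz + t[2] };
    }

    public static void main(String[] args) {
        // Identity rotation, translation (1, 2, 3):
        // the local polygon point (0.5, 0, -0.5) lands at (1.5, 2, 2.5).
        float[] a = transformPoint(
            new float[]{0, 0, 0, 1}, new float[]{1, 2, 3}, new float[]{0.5f, 0, -0.5f});
        System.out.println(a[0] + " " + a[1] + " " + a[2]);
        // 180-degree rotation about Y, no translation: (1, 0, 0) -> (-1, 0, 0).
        float[] b = transformPoint(
            new float[]{0, 1, 0, 0}, new float[]{0, 0, 0}, new float[]{1, 0, 0});
        System.out.println(b[0] + " " + b[1] + " " + b[2]);
    }
}
```

Note that if the Renderable is attached to a node anchored at the plane's center pose, the local (x, 0, z) coordinates could likely be used as vertex positions directly, without any world-space transform.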
In essence, this procedure would render something on the surface the scanning spotlight hovered over. Any guidance would be very much appreciated!
Position is needed if, say, you always wanted the Renderable to be offset from the node. Since you are specifying the vertices, it's unlikely you would need both. It's more for the case where you are creating a cube and want to offset it to sit on, above, or below the surface.
Normal is just a direction perpendicular to the triangle or surface. Since you are creating a plane, all the normals will be the same. It's used to shade the object.
The normal can be calculated by taking the cross product of the vectors between a few points in your polygon. Perhaps this answer will be helpful: https://stackoverflow.com/questions/22838071/robust-polygon-normal-calculation
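To make the cross-product recipe concrete, here is a small self-contained sketch (names are mine, chosen for illustration): take any three non-collinear vertices a, b, c of the planar polygon and compute (b - a) x (c - a). For a horizontal, upward-facing plane the result points along +Y, which is what each Vertex's normal should be.

```java
public class PolygonNormal {
    // Computes the (unnormalized) normal of a planar polygon from three of
    // its non-collinear vertices as (b - a) x (c - a).
    static float[] normal(float[] a, float[] b, float[] c) {
        float[] u = { b[0] - a[0], b[1] - a[1], b[2] - a[2] };
        float[] v = { c[0] - a[0], c[1] - a[1], c[2] - a[2] };
        return new float[] {
            u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]
        };
    }

    public static void main(String[] args) {
        // Three points in the y = 0 plane, ordered so the normal points up.
        float[] n = normal(
            new float[]{0, 0, 0}, new float[]{1, 0, 2}, new float[]{2, 0, 1});
        System.out.println(n[0] + " " + n[1] + " " + n[2]);
    }
}
```

The result here is (0, 3, 0); dividing by its length gives the unit normal (0, 1, 0) to pass to setNormal.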
One other note: in general, our textures for planes tend to blur at the edges. This is because ARCore is always gathering more information about the scene. The initial plane it returns will not cover the whole table, but it will improve over time. Hard edges on the polygon would imply that scanning is finished, which might not be what you want.
Hi @malik-at-work,
Is there a way to show all the planes discovered/detected by ARCore? Right now only the focus area shows the white dots, but for my use case I want to show the whole detected area.