SAPIEN
Does SAPIEN support generating a mesh with texture for an actor?
Hi, I'm trying to get a mesh with texture after loading a URDF object in SAPIEN. I found in the docs that we can generate point clouds. Does SAPIEN support generating meshes with textures? Thanks!
I am not sure what you mean by generating. If you mean getting or exporting a mesh and its texture, there is partial support on the dev branch, and you need to build from source. To get a mesh with texture, you need the vertices, normals, and UVs of the mesh, plus the texture itself. The mesh properties were exposed very recently here. The texture interface is here and here. Texture reading is not fully implemented yet, but if the texture was created from a file, you can get the filename. All these features are on dev and not stable right now; if you choose to build SAPIEN from source, make sure to report any issues you encounter with the new features.
Is there now an easier way of exporting a mesh?
I plan to adjust the lower and upper joint limits in the URDF file and then export the edited mesh. I believe I may have found a way to get the vertices, normals, and UVs. However, the constructor is not defined. https://sapien.ucsd.edu/docs/2.0/apidoc/sapien.core.html#sapien.core.pysapien.RenderGeometry
I do not see how limits are related to meshes; they are completely different things and should be handled independently. But if you want to export meshes, you can use an external library such as trimesh.
I am having trouble getting a reference to the mesh. I was reviewing the camera.py file, and it appeared that the mesh just pops up in the scene. To export it, I need a way of referencing it.
You can construct an external mesh object such as trimesh.Trimesh from the vertices and faces you read from SAPIEN. You mentioned you are able to get the vertices, so you should have what you need. Again, I am not sure why you need to export a mesh. SAPIEN is simulation software, so its purpose is really loading meshes for simulation rather than further mesh processing.
I don't have access to the vertices. I was trying to get them through the RenderGeometry class, but there isn't a constructor or a way of making an instance of it.
I see what you want here. If you load the URDF file into SAPIEN, you can access the render bodies of individual links, and the RenderGeometry will be attached to those render bodies. In SAPIEN you cannot create geometries directly; you should load them from file to create actors and articulations.
I apologize, but I do not see how I can get the RenderGeometry. In the documentation, I cannot find any classes or methods that reference or return an instance of RenderGeometry.
I can see that the links of the mesh are of the class KinematicLink, but I do not see how this will help me. Perhaps I am looking at the wrong individual links. https://sapien.ucsd.edu/docs/2.0/apidoc/sapien.core.html#sapien.core.pysapien.KinematicLink
KinematicLink inherits from the ActorBase class, and here is the method: https://sapien.ucsd.edu/docs/2.0/apidoc/sapien.core.html#sapien.core.pysapien.ActorBase.get_visual_bodies
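The traversal from a link down to the vertex arrays can be sketched as below. This is a hedged sketch of my reading of the SAPIEN 2.x docs (get_visual_bodies → get_render_shapes → a mesh with a .vertices array); attribute names may differ between versions:

```python
import numpy as np


def collect_link_vertices(link):
    """Gather the vertex array of every render shape attached to a link.

    Assumes the SAPIEN 2.x API: ActorBase.get_visual_bodies() returns
    render bodies, each body exposes get_render_shapes(), and each shape
    carries a mesh with a .vertices array. Treat the exact attribute
    names as assumptions, not a definitive API reference.
    """
    arrays = []
    for body in link.get_visual_bodies():
        for shape in body.get_render_shapes():
            arrays.append(np.asarray(shape.mesh.vertices))
    return arrays
```

Calling this on each link returned by `articulation.get_links()` gives you the per-link geometry, expressed in the link's local frame.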
However, I still cannot understand your use case. If you can describe what you are trying to achieve more clearly, I may be able to give suggestions. For example, if you are trying to use SAPIEN as a URDF mesh loader, maybe you should consider a better suited library such as urdfpy.
I will try out the linked method tomorrow. We just want to use the PartNet dataset from SAPIEN. However, the downloaded meshes are all in a particular state, e.g. a “closed oven”. We also wanted the mesh of an “opened oven”, which required editing the URDF, re-rendering, and exporting the mesh.
I see. In that case, you should note that the vertices of RenderGeometry will always be the same no matter what joint angles you give it, since SAPIEN uses meshes in the local frame. To get the desired "combined mesh" in the global frame, you need to read the poses of the links in SAPIEN and then apply the corresponding transformations to the vertices. (You can get the vertices from SAPIEN, or by using another geometry-processing library.)
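The "apply the transformation" step is just v' = R v + t per link. A self-contained numpy sketch, assuming SAPIEN's Pose convention of a position .p and a wxyz quaternion .q (so the rotation matrix below uses wxyz ordering):

```python
import numpy as np


def quat_to_rot(q):
    """Convert a wxyz quaternion (SAPIEN's Pose.q convention) to a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y)],
        [2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
        [2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y)],
    ])


def transform_vertices(vertices, position, quaternion):
    """Map local-frame vertices (N x 3) into the world frame: v' = R v + t."""
    R = quat_to_rot(quaternion)
    return vertices @ R.T + np.asarray(position)


# Identity rotation with a translation simply shifts the vertices.
verts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
world = transform_vertices(verts, position=[1.0, 2.0, 3.0],
                           quaternion=[1.0, 0.0, 0.0, 0.0])
# world -> [[1, 2, 3], [2, 2, 3]]
```

Running this over every link's pose and concatenating the results gives the combined mesh in the global frame.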
Hi @llamcpp, I am also interested in extracting a mesh for the "open state" of PartNet Mobility objects from SAPIEN. Were you able to figure out how to do that?
No, I am not that good at coding and did not figure it out. Also, the direction of my project changed, so downloading the meshes was no longer needed.
Okay, got it! @fbxiang do you have a pointer to an example of the approach that you described? i.e. apply the corresponding transformations of links to the vertices in a PartNet mesh. In theory this has to happen somewhere in the SAPIEN code to get the final articulated rendering, right? Thank you!
Here is a use case where the render shapes of a link are converted to a mesh in world space. https://github.com/haosulab/ManiSkill/blob/a3065dda38c01a1cfb9688b12060e85c22b4fe8b/mani_skill/env/open_cabinet_door_drawer.py#L75-L83 Note that if you are using the latest SAPIEN built from source (dev branch), there are API changes here. The changes are described in the changelog in the README.
Thanks, this pointer was super helpful! I ended up just using the function get_mesh_for_art from this code snippet (very similar to what you linked) after applying the motion to my object of interest, and was able to very easily write out the mesh of the articulated object. Thanks again!
@jazcollins I am trying to extract a mesh with appropriate textures that I can save as an .obj file after performing random articulations on a shape. Do you know how to go about this? Or does SAPIEN already have a function that saves the mesh from the scene?
Hi @arnabdeypolimi,
The function get_mesh_for_art that I linked above lets you extract the mesh and save it as an .obj file after performing a random articulation. As far as I know the meshes are untextured, though, so you may need to figure out how to retain that information from the original mesh.