ExampleOfiOSLiDAR
How to scan all geometry with texture?
The texture only seems to be applied from one frame, and to one part of the mesh, not all of it.
I also ran into this problem. I scanned a large area, and when the texture was finally displayed, only the texture for the current frame appeared correctly; the mesh beyond the current frame showed a trailing-edge texture smear. I suspect a problem in the texture coordinate calculation. I checked what I could find online, but there was no solution to this problem.
I tried displaying the realistic texture map by default, which looks okay while running, but the exported .usdz file also shows the trailing-edge texture effect described above.
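For context, the `calcTextureCoordinates(camera:modelMatrix:)` extension used in the code below is not shown here; presumably it projects every mesh vertex through the given camera and normalizes the result to [0, 1] texture space, roughly like the following sketch. `projectedTextureCoordinates` and `vertex(at:)` are illustrative names, not the repo's API:

```swift
import ARKit
import simd

extension ARMeshGeometry {
    // Reads one packed .float3 vertex from the geometry buffer.
    func vertex(at index: Int) -> simd_float3 {
        let pointer = vertices.buffer.contents()
            .advanced(by: vertices.offset + vertices.stride * index)
        let v = pointer.assumingMemoryBound(to: (Float, Float, Float).self).pointee
        return simd_float3(v.0, v.1, v.2)
    }

    // Projects every vertex into the given camera image and normalizes to
    // [0, 1]. The captured image is landscape, so width/height are swapped
    // for a portrait projection. Vertices the frame never saw still get
    // coordinates from this projection -- which is exactly what smears the
    // frame's edge pixels across out-of-view geometry.
    func projectedTextureCoordinates(camera: ARCamera,
                                     modelMatrix: simd_float4x4) -> [SIMD2<Float>] {
        let size = camera.imageResolution
        return (0..<vertices.count).map { index in
            let local = vertex(at: index)
            let world4 = modelMatrix * simd_float4(local.x, local.y, local.z, 1)
            let world = simd_float3(world4.x, world4.y, world4.z)
            let point = camera.projectPoint(world,
                                            orientation: .portrait,
                                            viewportSize: CGSize(width: size.height,
                                                                 height: size.width))
            return SIMD2<Float>(Float(point.x) / Float(size.height),
                                Float(point.y) / Float(size.width))
        }
    }
}
```

If that is how the coordinates are computed, the trailing effect follows directly: every update recomputes UVs from a single frame, and any vertex outside that frame's view gets a clamped, meaningless coordinate.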
The code is shown below:
```swift
func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    guard scanMode == .noneed else {
        return nil
    }
    guard let anchor = anchor as? ARMeshAnchor else { return nil }
    let node = SCNNode()
    let geometry = scanGeometory(anchor: anchor, node: node)
    node.geometry = geometry
    return node
}

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard scanMode == .noneed else {
        return
    }
    guard let anchor = anchor as? ARMeshAnchor else { return }
    let geometry = self.scanGeometory(anchor: anchor, node: node)
    node.geometry = geometry
}

func scanGeometory(anchor: ARMeshAnchor, node: SCNNode) -> SCNGeometry {
    // Texture coordinates are recomputed from whatever frame is current at
    // call time, so only geometry visible in that frame gets valid UVs.
    let frame = sceneView.session.currentFrame!
    let geometry = SCNGeometry(geometry: anchor.geometry,
                               modelMatrix: anchor.transform,
                               textureCoordinates: anchor.geometry.calcTextureCoordinates(camera: frame.camera,
                                                                                          modelMatrix: anchor.transform)!)
    if let image = captureCamera() {
        geometry.firstMaterial?.diffuse.contents = image
    }
    node.geometry = geometry
    return geometry
}

func exportUSDZ() {
    let fileName = "Mesh" + UUID().uuidString
    let documentDirURL = try! FileManager.default.url(for: .documentDirectory,
                                                      in: .userDomainMask,
                                                      appropriateFor: nil,
                                                      create: true)
    let fileURL = documentDirURL.appendingPathComponent(fileName).appendingPathExtension("usdz")
    self.sceneView.scene.write(to: fileURL, options: nil, delegate: nil, progressHandler: nil)
    let activityVc = UIActivityViewController(activityItems: [fileURL], applicationActivities: nil)
    DispatchQueue.main.async {
        activityVc.popoverPresentationController?.sourceView = UIView(frame: CGRect(x: 100, y: 100, width: 66, height: 88))
        self.present(activityVc, animated: true)
    }
}
```
The effect is as follows:
The issue occurs when updating the nodes (from the didUpdate)
Hello, I still don’t know how to solve this problem. Are there any examples?
Did anyone solve this? I see what you mean, bronzy, about it not updating the view when there is a didUpdate, but I am not sure how to refresh the object texture. I can't see the mesh texture being saved at all; it seems to be displayed live only at render time. Can anyone confirm this? Is there a way to save the texture from the camera so it can be recalled?
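One idea for the "save and recall" part (a sketch, not from the repo; `textureCache` and `cachedTexture(for:)` are hypothetical names): capture the camera image the first time each anchor is textured and key it by the anchor's identifier, so later `didUpdate` calls reuse the stored image instead of the live feed:

```swift
// Hypothetical per-anchor cache: the camera image captured the first time
// an anchor was textured, keyed by the anchor's stable identifier.
var textureCache = [UUID: UIImage]()

func cachedTexture(for anchor: ARMeshAnchor) -> UIImage? {
    if let image = textureCache[anchor.identifier] {
        return image // recall the stored texture instead of the live frame
    }
    guard let image = captureCamera() else { return nil }
    textureCache[anchor.identifier] = image
    return image
}
```

Note the cached image is only correct together with texture coordinates computed from that same frame, so the UVs would have to be frozen at the same moment rather than recomputed on every update.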
I went through the code, and this isn't an issue of texture mapping; the texture mapping works as expected. It looks like didUpdate also fires for nodes that are not in the frame, which is what produces this trailing-edge effect. One way to solve this is to get only the nodes that are within the frame and apply the texture to those (I don't know how to do that).
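For what it's worth, SceneKit has a built-in test for that: `SCNSceneRenderer.isNode(_:insideFrustumOf:)` checks a node's bounding box against a camera's frustum. A minimal sketch of the `didUpdate` callback using it, reusing the `scanGeometory(anchor:node:)` helper from the first post (the test is coarse: a partially visible anchor still passes, so its off-screen part can still smear):

```swift
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let anchor = anchor as? ARMeshAnchor,
          let pointOfView = sceneView.pointOfView,
          // Only retexture anchors the camera can currently see, so the
          // live image is never applied to off-screen geometry.
          renderer.isNode(node, insideFrustumOf: pointOfView)
    else { return }

    node.geometry = scanGeometory(anchor: anchor, node: node)
}
```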
Let's open a discussion and get this done. It's been too long that everyone has been trying to find an open-source solution for this!
I went ahead and ran some experiments, and this code did a little better. It's still not usable, though. Please have a look:
```swift
//
//  ScanViewController.swift
//  ExampleOfiOSLiDAR
//
//  Created by TokyoYoshida on 2021/02/10.
//

import RealityKit
import ARKit
import SpriteKit

class LabelScene: SKScene {
    let label = SKLabelNode()
    var onTapped: (() -> Void)? = nil

    override public init(size: CGSize) {
        super.init(size: size)
        self.scaleMode = SKSceneScaleMode.resizeFill
        label.fontSize = 65
        label.fontColor = .blue
        label.position = CGPoint(x: frame.midX, y: label.frame.size.height + 50)
        self.addChild(label)
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("Not been implemented")
    }

    convenience init(size: CGSize, onTapped: @escaping () -> Void) {
        self.init(size: size)
        self.onTapped = onTapped
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        if let onTapped = self.onTapped {
            onTapped()
        }
    }

    func setText(text: String) {
        label.text = text
    }
}

class ScanViewController: UIViewController, ARSCNViewDelegate, ARSessionDelegate {
    enum ScanMode {
        case noneed
        case doing
        case done
    }

    @IBOutlet weak var sceneView: ARSCNView!

    var scanMode: ScanMode = .noneed
    var originalSource: Any? = nil
    // Collects every node that was textured while visible, so the whole
    // set can be attached to the scene at the end of the scan.
    let globalTestNode = SCNNode()

    lazy var label = LabelScene(size: sceneView.bounds.size) { [weak self] in
        self?.rotateMode()
    }

    override func viewDidLoad() {
        func setARViewOptions() {
            sceneView.scene = SCNScene()
        }
        func buildConfigure() -> ARWorldTrackingConfiguration {
            let configuration = ARWorldTrackingConfiguration()
            configuration.environmentTexturing = .automatic
            configuration.sceneReconstruction = .mesh
            if type(of: configuration).supportsFrameSemantics(.sceneDepth) {
                configuration.frameSemantics = .smoothedSceneDepth
            }
            return configuration
        }
        func setControls() {
            label.setText(text: "Scan")
            sceneView.overlaySKScene = label
        }
        super.viewDidLoad()
        sceneView.delegate = self
        sceneView.session.delegate = self
        setARViewOptions()
        let configuration = buildConfigure()
        sceneView.session.run(configuration)
        setControls()
    }

    func rotateMode() {
        switch self.scanMode {
        case .noneed:
            self.scanMode = .doing
            label.setText(text: "Reset")
            originalSource = sceneView.scene.background.contents
            sceneView.scene.background.contents = UIColor.black
        case .doing:
            break
        case .done:
            scanAllGeometry(needTexture: false)
            self.scanMode = .noneed
            label.setText(text: "Scan")
            sceneView.scene.background.contents = originalSource
        }
    }

    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        guard scanMode == .noneed else {
            return nil
        }
        guard let anchor = anchor as? ARMeshAnchor,
              let frame = sceneView.session.currentFrame else { return nil }
        let node = SCNNode()
        // Only texture the node if it is currently on screen.
        if self.nodeIsWithinFrame(node: node) {
            guard let cameraImage = captureCamera() else { return node }
            let geometry = self.scanGeometory(frame: frame, anchor: anchor, node: node, needTexture: true, cameraImage: cameraImage)
            node.geometry = geometry
            globalTestNode.addChildNode(node)
        }
        return node
    }

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard scanMode == .noneed else {
            return
        }
        guard let frame = self.sceneView.session.currentFrame else { return }
        guard let anchor = anchor as? ARMeshAnchor else { return }
        // Same visibility gate as above: skip anchors that are off screen.
        if self.nodeIsWithinFrame(node: node) {
            guard let cameraImage = captureCamera() else { return }
            let geometry = self.scanGeometory(frame: frame, anchor: anchor, node: node, needTexture: true, cameraImage: cameraImage)
            node.geometry = geometry
            globalTestNode.addChildNode(node)
        }
    }

    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        if self.scanMode == .doing {
            self.scanAllGeometry(needTexture: true)
            self.scanMode = .done
        }
    }

    // Rough visibility test: projects the node's origin into screen space
    // and checks it against a rectangle. Note this only tests a single
    // point, so a large anchor straddling the screen edge can still be
    // misclassified.
    func nodeIsWithinFrame(node: SCNNode) -> Bool {
        if sceneView.pointOfView != nil {
            let position = sceneView.projectPoint(node.worldPosition)
            let bounds = UIScreen.main.bounds
            let mBounds = CGRect(x: bounds.midX, y: bounds.midY, width: bounds.width, height: bounds.height)
            if mBounds.contains(CGPoint(x: CGFloat(position.x), y: bounds.size.height - CGFloat(position.y))) {
                return true
            }
        }
        return false
    }

    func scanGeometory(frame: ARFrame, anchor: ARMeshAnchor, node: SCNNode, needTexture: Bool = false, cameraImage: UIImage? = nil) -> SCNGeometry {
        let camera = frame.camera
        let geometry = SCNGeometry(geometry: anchor.geometry, camera: camera, modelMatrix: anchor.transform, needTexture: needTexture)
        if let image = cameraImage, needTexture {
            geometry.firstMaterial?.diffuse.contents = image
        } else {
            geometry.firstMaterial?.diffuse.contents = UIColor(red: 0.5, green: 1.0, blue: 0.0, alpha: 0.7)
        }
        node.geometry = geometry
        return geometry
    }

    func scanAllGeometry(needTexture: Bool) {
        self.sceneView.scene.rootNode.addChildNode(globalTestNode)
        // The early return below makes the per-anchor retexturing that
        // follows unreachable; only the nodes already collected in
        // globalTestNode are shown.
        return
        guard let frame = sceneView.session.currentFrame else { return }
        guard let cameraImage = captureCamera() else { return }
        guard let anchors = sceneView.session.currentFrame?.anchors else { return }
        let meshAnchors = anchors.compactMap { $0 as? ARMeshAnchor }
        for anchor in meshAnchors {
            guard let node = sceneView.node(for: anchor) else { continue }
            let geometry = scanGeometory(frame: frame, anchor: anchor, node: node, needTexture: needTexture, cameraImage: cameraImage)
            node.geometry = geometry
        }
    }

    func captureCamera() -> UIImage? {
        guard let frame = sceneView.session.currentFrame else { return nil }
        let pixelBuffer = frame.capturedImage
        let image = CIImage(cvPixelBuffer: pixelBuffer)
        let context = CIContext(options: nil)
        guard let cameraImage = context.createCGImage(image, from: image.extent) else { return nil }
        return UIImage(cgImage: cameraImage)
    }
}

extension UIColor {
    static var random: UIColor {
        return UIColor(
            red: .random(in: 0...1),
            green: .random(in: 0...1),
            blue: .random(in: 0...1),
            alpha: 1.0
        )
    }
}
```
Also, on Apple's developer forums I found this:
> If you want to color the mesh based on the camera feed, you could do so manually, for example by unprojecting the pixels of the camera image into 3D space and coloring the corresponding mesh face with the pixel's color. However, keep in mind that ARMeshAnchors are constantly updated. So you might want to first scan the entire area you're interested in, then stop scene reconstruction, and do the coloring in a subsequent step.
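A sketch of that two-phase flow, reusing the `sceneView` and `scanAllGeometry(needTexture:)` names from the experiment above (when to trigger it, e.g. from a button, is up to you):

```swift
func finishScanningAndTexture() {
    guard let configuration = sceneView.session.configuration
            as? ARWorldTrackingConfiguration else { return }

    // Phase 1 is done: turn scene reconstruction off so ARKit stops
    // updating the ARMeshAnchors. Re-running the session without reset
    // options keeps the anchors collected so far.
    configuration.sceneReconstruction = []
    sceneView.session.run(configuration)

    // Phase 2: the mesh is now stable, so a single texturing pass is not
    // overwritten by later didUpdate calls.
    scanAllGeometry(needTexture: true)
}
```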
Hi, did you solve this problem? @jaswant-iotric
Hi, has anyone found a solution for this issue?