ARKit and Reality Composer - How to Anchor a Scene Using Image Coordinates

Date: 2019-10-23 15:31:50

Tags: ios swift xcode realitykit reality-composer

I have written code that initializes one of three Reality Composer scenes when a button is pressed, depending on the day of the month.

Everything works fine.

The Reality Composer scenes use image detection to place objects in the environment, but currently the objects disappear as soon as the image leaves the camera view.

I would like to anchor the scene to the root node where the image was first detected, so that the user can look around the scene and the objects persist even when the trigger image is out of the camera view.

I tried adding the renderer function below, but I get an error saying that the view controller class has no member `.planeNode`:

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor else { return }
        let referenceImage = imageAnchor.referenceImage

        // Create a plane to visualize the initial position of the detected image.
        let plane = SCNPlane(width: referenceImage.physicalSize.width,
                             height: referenceImage.physicalSize.height)
        plane.materials.first?.diffuse.contents = UIColor.blue.withAlphaComponent(0.20)
        self.planeNode = SCNNode(geometry: plane)

        self.planeNode?.opacity = 1

        /*
         `SCNPlane` is vertically oriented in its local coordinate space, but
         `ARImageAnchor` assumes the image is horizontal in its local space, so
         rotate the plane to match.
         */
        self.planeNode?.eulerAngles.x = -.pi / 2

        // Add the plane visualization to the scene.
        if let planeNode = self.planeNode {
            node.addChildNode(planeNode)
        }

        if let imageName = referenceImage.name {
            plane.materials = [SCNMaterial()]
            plane.materials[0].diffuse.contents = UIImage(named: imageName)
        }
    }

Here is my code:

import UIKit
import RealityKit
import ARKit
import SceneKit

class ViewController: UIViewController {

    @IBOutlet var move: ARView!
    @IBOutlet var arView: ARView!

    var ARBorealAnchor3: ARboreal.ArBoreal3!
    var ARBorealAnchor2: ARboreal.ArBoreal2!
    var ARBorealAnchor: ARboreal.ArBoreal!

    var Date1 = 1

    override func viewDidLoad() {
        super.viewDidLoad()

        func getSingle() {
            let date = Date()
            let calendar = Calendar.current
            let day = calendar.component(.day, from: date)
            Date1 = day
        }

        getSingle()

        ARBorealAnchor = try! ARboreal.loadArBoreal()
        ARBorealAnchor2 = try! ARboreal.loadArBoreal2()
        ARBorealAnchor3 = try! ARboreal.loadArBoreal3()

        if Date1 == 24 {
            arView.scene.anchors.append(ARBorealAnchor)
        }
        if Date1 == 25 {
            arView.scene.anchors.append(ARBorealAnchor2)
        }
        if Date1 == 26 {
            arView.scene.anchors.append(ARBorealAnchor3)
        }
    }
}

Any help would be greatly appreciated.

Cheers, Daniel Savage

1 Answer:

Answer 0 (score: 0):

What is happening is that when the image anchor goes out of view, the AnchorEntity is no longer pinned, and RealityKit stops rendering it and all of its descendants.

One way to solve this is to separate the image anchor from the content you want to render: add the image anchor manually in code, and when it is first detected, attach your content to a separate world anchor in the scene. Whenever the image anchor's transform updates, update your world anchor to match.

That way you use the image anchor to get the latest transform while it is visible, but when it disappears, rendering of the content is not tied to it. Something like the code below (you will need to create an AR Resource Group named ARTest and add an image named "test" to it for the anchor to work):

import ARKit
import SwiftUI
import RealityKit
import Combine

struct ContentView : View {
    var body: some View {
        return ARViewContainer().edgesIgnoringSafeArea(.all)
    }
}

let arDelegate = SessionDelegate()

struct ARViewContainer: UIViewRepresentable {

  func makeUIView(context: Context) -> ARView {

    let arView = ARView(frame: .zero)

    arDelegate.set(arView: arView)
    arView.session.delegate = arDelegate

    // Create an image anchor, add it to the scene. We won't add any
    // rendering content to the anchor, it will be used only for detection
    let imageAnchor = AnchorEntity(.image(group: "ARTest", name: "test"))
    arView.scene.anchors.append(imageAnchor)

    return arView
  }

  func updateUIView(_ uiView: ARView, context: Context) {}
}

final class SessionDelegate: NSObject, ARSessionDelegate {
  var arView: ARView!
  var rootAnchor: AnchorEntity?

  func set(arView: ARView) {
    self.arView = arView
  }

  func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {

    // If we already added the content to render, ignore
    if rootAnchor != nil {
       return
    }

    // Make sure we are adding to an image anchor. Assuming only
    // one image anchor in the scene for brevity.
    guard anchors[0] is ARImageAnchor else {
      return
    }

    // Create the entity to render, could load from your experience file here
    // this will render at the center of the matched image
    rootAnchor = AnchorEntity(world: [0,0,0])
    let ball = ModelEntity(
      mesh: MeshResource.generateBox(size: 0.01),
      materials: [SimpleMaterial(color: .red, isMetallic: false)]
    )
    rootAnchor!.addChild(ball)

    // Just add another model to show how it remains in the scene even
    // when the tracking image is out of view.
    let ball2 = ModelEntity(
      mesh: MeshResource.generateBox(size: 0.10),
      materials: [SimpleMaterial(color: .orange, isMetallic: false)]
    )
    ball.addChild(ball2)
    ball2.position = [0, 0, 1]

    arView.scene.addAnchor(rootAnchor!)
  }

  func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    guard let rootAnchor = rootAnchor else {
      return
    }

    // Code is assuming you only have one image anchor for brevity
    guard let imageAnchor = anchors[0] as? ARImageAnchor else {
      return
    }

    if !imageAnchor.isTracked {
      return
    }

    // Update our fixed anchor to image transform
    rootAnchor.transform = Transform(matrix: imageAnchor.transform)
  }

}

#if DEBUG
struct ContentView_Previews : PreviewProvider {
  static var previews: some View {
    ContentView()
  }
}
#endif

Note: The ARImageAnchor's transform seems to update frequently while ARKit tries to compute an accurate image plane (e.g. the content appears to be in the right place, but the z value is off), so make sure the image in your AR Resource Group has the correct physical size so that the image gets better tracking.
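The same physical-size requirement applies if reference images are created in code rather than in an asset catalog. A hedged sketch (the image name "test" and the 15 cm width are assumptions for illustration):

```swift
import ARKit

// Sketch: build an ARReferenceImage programmatically with an accurate
// physical width (in meters). The UIImage asset name is hypothetical.
func makeDetectionImages() -> Set<ARReferenceImage> {
    guard let cgImage = UIImage(named: "test")?.cgImage else { return [] }
    // physicalWidth must match the printed image's real-world width;
    // an inaccurate value is a common cause of bad depth (z) estimates.
    let reference = ARReferenceImage(cgImage,
                                     orientation: .up,
                                     physicalWidth: 0.15) // 15 cm, assumed
    reference.name = "test"
    return [reference]
}

// Usage: assign to the configuration before running the session.
// let config = ARWorldTrackingConfiguration()
// config.detectionImages = makeDetectionImages()
// arView.session.run(config)
```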