I have scanned and trained multiple real-world objects. I do have the ARReferenceObject files, and the app detects them fine.
The issue I am facing is that when an object does not have distinct, vibrant features, it takes a few seconds to return a detection result, as far as I can tell. Now, I would like the app to show a bounding box and an activity indicator on top of the object while it is trying to detect it.
I cannot find any information on this. Also, is there any way to get the time at which detection starts, or a confidence percentage for the detected object?
Any help is appreciated.
Answer 0 (Score: 7)
It is possible to display a boundingBox before the ARReferenceObject is detected, although I am not sure why you would want to do so ahead of time.
For example, assuming your referenceObject is on a horizontal surface, you would first need to place the estimated bounding box on the plane (or use some other method to position it in advance), and in the time it takes to detect the ARPlaneAnchor and place the boundingBox, your model would most likely already have been detected.
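For the plane-based placement idea, the arithmetic is simply seating the box's centre half its height above the plane's Y position. A minimal sketch of that calculation (the function name is mine, not from ARKit):

```swift
// Sketch: given a horizontal plane's world-space Y coordinate and an
// estimated box height, the box centre sits half the height above the plane.
func boxCenterY(planeY: Float, boxHeight: Float) -> Float {
    return planeY + boxHeight / 2
}
```

You would feed this the `ARPlaneAnchor`'s transform Y and the estimated extent of the object before assigning the node's position.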
A possible approach:
As you no doubt know, an ARReferenceObject has center, extent and scale properties, as well as a set of rawFeaturePoints associated with the object. Based on this, and on some of Apple's sample code from Scanning & Detecting 3D Objects, we can create our own SCNNode which will display a bounding box of approximately the same size as a locally stored ARReferenceObject, before it has been detected.
Note that you will need to locate the 'wireframe_shader' from the Apple sample code for the boundingBox to render transparently:
import Foundation
import ARKit
import SceneKit

class BlackMirrorzBoundingBox: SCNNode {

    //-----------------------
    // MARK: - Initialization
    //-----------------------

    /// Creates A WireFrame Bounding Box From The Data Retrieved From The ARReferenceObject
    ///
    /// - Parameters:
    ///   - points: [float3]
    ///   - scale: CGFloat
    ///   - color: UIColor
    init(points: [float3], scale: CGFloat, color: UIColor = .cyan) {
        super.init()

        var localMin = float3(Float.greatestFiniteMagnitude)
        var localMax = float3(-Float.greatestFiniteMagnitude)

        for point in points {
            localMin = min(localMin, point)
            localMax = max(localMax, point)
        }

        self.simdPosition += (localMax + localMin) / 2
        let extent = localMax - localMin

        let wireFrame = SCNNode()
        let box = SCNBox(width: CGFloat(extent.x), height: CGFloat(extent.y), length: CGFloat(extent.z), chamferRadius: 0)
        box.firstMaterial?.diffuse.contents = color
        box.firstMaterial?.isDoubleSided = true
        wireFrame.geometry = box
        setupShaderOnGeometry(box)
        self.addChildNode(wireFrame)
    }

    required init?(coder aDecoder: NSCoder) { fatalError("init(coder:) Has Not Been Implemented") }

    //----------------
    // MARK: - Shaders
    //----------------

    /// Sets A Shader To Render The Cube As A Wireframe
    ///
    /// - Parameter geometry: SCNBox
    func setupShaderOnGeometry(_ geometry: SCNBox) {
        guard let path = Bundle.main.path(forResource: "wireframe_shader", ofType: "metal", inDirectory: "art.scnassets"),
              let shader = try? String(contentsOfFile: path, encoding: .utf8) else {
            return
        }
        geometry.firstMaterial?.shaderModifiers = [.surface: shader]
    }
}
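The min/max fold at the heart of the initializer can be sanity-checked in isolation. A minimal sketch using a plain struct in place of simd's float3 (so it runs anywhere; `Point` and `bounds(of:)` are my own stand-in names):

```swift
// Plain-Swift sketch of the axis-aligned bounds fold used in the initializer.
// Point is a stand-in for simd's float3 so the logic runs on any platform.
struct Point { var x: Float; var y: Float; var z: Float }

func bounds(of points: [Point]) -> (center: Point, extent: Point) {
    var lo = Point(x: .greatestFiniteMagnitude, y: .greatestFiniteMagnitude, z: .greatestFiniteMagnitude)
    var hi = Point(x: -.greatestFiniteMagnitude, y: -.greatestFiniteMagnitude, z: -.greatestFiniteMagnitude)
    for p in points {
        // Component-wise min/max, mirroring simd's min(_:_:) / max(_:_:).
        lo = Point(x: Swift.min(lo.x, p.x), y: Swift.min(lo.y, p.y), z: Swift.min(lo.z, p.z))
        hi = Point(x: Swift.max(hi.x, p.x), y: Swift.max(hi.y, p.y), z: Swift.max(hi.z, p.z))
    }
    // Centre is the midpoint of the bounds; extent is their difference.
    let center = Point(x: (lo.x + hi.x) / 2, y: (lo.y + hi.y) / 2, z: (lo.z + hi.z) / 2)
    let extent = Point(x: hi.x - lo.x, y: hi.y - lo.y, z: hi.z - lo.z)
    return (center, extent)
}
```

The resulting extent is what sizes the SCNBox, and the centre is what the node's simdPosition is offset by.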
To display the bounding box you would do something like the following; note that in my example I have these variables:
@IBOutlet var augmentedRealityView: ARSCNView!
let configuration = ARWorldTrackingConfiguration()
let augmentedRealitySession = ARSession()
To display the boundingBox before the actual object itself has been detected, you would call a function such as loadBoundingBox in viewDidLoad, e.g.:
/// Creates A Bounding Box From The Data Available From The ARObject In The Local Bundle
func loadBoundingBox() {

    //1. Run Our Session
    augmentedRealityView.session = augmentedRealitySession
    augmentedRealityView.delegate = self

    //2. Load A Single ARReferenceObject From The Main Bundle
    if let objectURL = Bundle.main.url(forResource: "fox", withExtension: ".arobject") {
        do {
            var referenceObjects = [ARReferenceObject]()
            let object = try ARReferenceObject(archiveURL: objectURL)

            //3. Log Its Properties
            print("""
            Object Center = \(object.center)
            Object Extent = \(object.extent)
            Object Scale = \(object.scale)
            """)

            //4. Get Its Scale
            let scale = CGFloat(object.scale.x)

            //5. Create A Bounding Box
            let boundingBoxNode = BlackMirrorzBoundingBox(points: object.rawFeaturePoints.points, scale: scale)

            //6. Add It To The ARSCNView
            self.augmentedRealityView.scene.rootNode.addChildNode(boundingBoxNode)

            //7. Position It 0.5m Below & 0.5m Away From The Camera
            boundingBoxNode.position = SCNVector3(0, -0.5, -0.5)

            //8. Add It To The Configuration
            referenceObjects.append(object)
            configuration.detectionObjects = Set(referenceObjects)

        } catch {
            print(error)
        }
    }

    //9. Run The Session
    augmentedRealitySession.run(configuration, options: [.resetTracking, .removeExistingAnchors])
    augmentedRealityView.automaticallyUpdatesLighting = true
}
The example above simply creates a boundingBox from the as-yet-undetected ARReferenceObject and positions it 0.5m below and 0.5m away from the Camera, which looks something like this:
You would of course need to handle the positioning of the boundingBox first, as well as handle the removal of the boundingBox 'indicator'. The method below simply shows a boundingBox when the actual object is detected:
//--------------------------
// MARK: - ARSCNViewDelegate
//--------------------------

extension ViewController: ARSCNViewDelegate {

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {

        //1. Check We Have A Valid ARObject Anchor
        guard let objectAnchor = anchor as? ARObjectAnchor else { return }

        //2. Create A Bounding Box Around Our Object
        let scale = CGFloat(objectAnchor.referenceObject.scale.x)
        let boundingBoxNode = BlackMirrorzBoundingBox(points: objectAnchor.referenceObject.rawFeaturePoints.points, scale: scale)
        node.addChildNode(boundingBoxNode)
    }
}
Which yields something like this:
Regarding the detection timer, there is an example in the Apple sample code which displays how long it takes to detect the model. In its crudest form (not accounting for milliseconds) you can do something like this:
First create a Timer and a var to store the detection time, e.g.:
var detectionTimer = Timer()
var detectionTime: Int = 0
Then, when you run your ARSessionConfiguration, initialise the timer, e.g.:
/// Starts The Detection Timer
func startDetectionTimer() {
    detectionTimer = Timer.scheduledTimer(timeInterval: 1.0, target: self, selector: #selector(logDetectionTime), userInfo: nil, repeats: true)
}

/// Increments The Total Detection Time Before The ARReference Object Is Detected
@objc func logDetectionTime() {
    detectionTime += 1
}
Then, when the ARReferenceObject has been detected, invalidate the timer and log the time, e.g.:
//--------------------------
// MARK: - ARSCNViewDelegate
//--------------------------

extension ViewController: ARSCNViewDelegate {

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {

        //1. Check We Have A Valid ARObject Anchor
        guard let _ = anchor as? ARObjectAnchor else { return }

        //2. Stop The Timer
        detectionTimer.invalidate()

        //3. Log The Detection Time
        print("Total Detection Time = \(detectionTime) Seconds")

        //4. Reset The Detection Time
        detectionTime = 0
    }
}
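If you do want sub-second precision rather than the one-second Timer above, an alternative (not from the Apple sample code; `DetectionStopwatch` is my own name) is to record a start timestamp when the session runs and compute the interval when the anchor arrives:

```swift
import Foundation

// Sketch: millisecond-precision detection timing using a start timestamp
// instead of a repeating one-second Timer.
struct DetectionStopwatch {
    private var start: Date?

    // Call when the session starts running.
    mutating func begin() { start = Date() }

    // Returns the elapsed seconds since begin(), or nil if never started.
    // Clears the start time so the stopwatch can be reused.
    mutating func stop() -> TimeInterval? {
        guard let start = start else { return nil }
        self.start = nil
        return Date().timeIntervalSince(start)
    }
}
```

You would call begin() just before augmentedRealitySession.run(...) and stop() inside renderer(_:didAdd:for:) when the ARObjectAnchor appears.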
This should be more than enough to get you started...
Please note that this example does not provide a boundingBox while scanning an object (look at the Apple sample code for that); it provides one based on an existing ARReferenceObject, as implied in your question (assuming my interpretation is correct).