I want to do some processing on each detected feature point found in a single frame during an ARKit session. How can I iterate over each detected feature point and get its world coordinates?
I'm using Swift, but an Objective-C answer would be fine too.
Answer 0 (score: 7)
Edit: In Xcode 9.0 GM (and later), points is a Swift Array of float3 vectors, so you can iterate it like any other Swift array:
for point in frame.rawFeaturePoints?.points ?? [] {
    ...
}
Or:
frame.rawFeaturePoints?.points.map { point in
    ...
}
Or use your favorite array/collection algorithms.
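For example (not from the original answer), you could collect the world-space positions or compute their centroid with standard collection algorithms. This is a rough sketch, assuming it runs inside a delegate method that receives an ARFrame named frame:

if let points = frame.rawFeaturePoints?.points, !points.isEmpty {
    // Each point is already a world-space position
    let worldPositions = points.map { SCNVector3(x: $0.x, y: $0.y, z: $0.z) }
    // Average all points to get a rough centroid of the detected cloud
    let centroid = points.reduce(simd_float3(0, 0, 0), +) / Float(points.count)
    print("Found \(worldPositions.count) feature points, centroid at \(centroid)")
}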
In some of the Xcode 9.x betas, the Swift Array version of this property was unavailable, so you instead had to work with the underlying ObjC property, which imports into Swift as an UnsafePointer that you can't easily iterate over. (Hence the OP's original question.)
If that bug ever reappears (or you run into a similar problem elsewhere), you can do something like this:
if let cloud = frame.rawFeaturePoints {
    for index in 0..<cloud.count {
        let point = cloud.points[index]
        // do something with point
    }
}
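Another option (not from the original answer, and only applicable while points imports as an UnsafePointer<vector_float3> as described above) is to wrap the raw pointer in an UnsafeBufferPointer so ordinary Swift iteration works again:

if let cloud = frame.rawFeaturePoints {
    // Wrap the raw pointer and its count so for-in and other collection APIs work
    let buffer = UnsafeBufferPointer(start: cloud.points, count: cloud.count)
    for point in buffer {
        // point is a vector_float3 in world space
    }
}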
Answer 1 (score: 2)
Since it took me a while to get a working sample of how to render the feature points, here is a complete working example:
SCNNode *ARPointCloudNode = nil;

- (void)session:(ARSession *)session didUpdateFrame:(ARFrame *)frame API_AVAILABLE(ios(11.0))
{
    NSInteger pointCount = frame.rawFeaturePoints.count;

    if (pointCount) {
        // We want a root node to work in; it's going to hold all of the spheres
        // that come together to make up the point cloud
        if (!ARPointCloudNode) {
            ARPointCloudNode = [SCNNode node];
            [self.sceneView.scene.rootNode addChildNode:ARPointCloudNode];
        }

        // It's going to need some colour
        SCNMaterial *whiteMaterial = [SCNMaterial material];
        whiteMaterial.diffuse.contents = [UIColor whiteColor];
        whiteMaterial.locksAmbientWithDiffuse = YES;

        // Remove the old point cloud spheres (this happens per frame)
        for (SCNNode *child in ARPointCloudNode.childNodes) {
            [child removeFromParentNode];
        }

        // Use the frame's point cloud to create a set of SCNSpheres
        // which live at the feature points in the AR world
        for (NSInteger i = 0; i < pointCount; i++) {
            vector_float3 point = frame.rawFeaturePoints.points[i];
            SCNVector3 vector = SCNVector3Make(point[0], point[1], point[2]);

            SCNSphere *pointSphere = [SCNSphere sphereWithRadius:0.001];
            pointSphere.materials = @[whiteMaterial];

            SCNNode *pointNode = [SCNNode nodeWithGeometry:pointSphere];
            pointNode.position = vector;

            [ARPointCloudNode addChildNode:pointNode];
        }
    }
}
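Since the question asked for Swift, here is a rough Swift sketch of the same approach (not part of the original answer). It assumes an ARSCNView stored in a sceneView property and that an instance of this class is used as the session delegate:

import ARKit
import SceneKit
import UIKit

class FeaturePointRenderer: NSObject, ARSessionDelegate {
    let sceneView: ARSCNView

    // Root node that holds one small sphere per feature point
    private let pointCloudNode = SCNNode()

    // Shared white material for all spheres
    private let whiteMaterial: SCNMaterial = {
        let material = SCNMaterial()
        material.diffuse.contents = UIColor.white
        material.locksAmbientWithDiffuse = true
        return material
    }()

    init(sceneView: ARSCNView) {
        self.sceneView = sceneView
        super.init()
        sceneView.scene.rootNode.addChildNode(pointCloudNode)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let points = frame.rawFeaturePoints?.points, !points.isEmpty else { return }

        // Remove last frame's spheres before adding the new ones
        pointCloudNode.childNodes.forEach { $0.removeFromParentNode() }

        // One tiny sphere per feature point, positioned in world space
        for point in points {
            let sphere = SCNSphere(radius: 0.001)
            sphere.materials = [whiteMaterial]
            let node = SCNNode(geometry: sphere)
            node.position = SCNVector3(x: point.x, y: point.y, z: point.z)
            pointCloudNode.addChildNode(node)
        }
    }
}

You would keep a strong reference to an instance of this class and assign it to sceneView.session.delegate before running the session.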