iPad Pro LiDAR – Export Geometry and Textures

Posted: 2020-05-01 08:00:21

Tags: arkit, lidar

I would like to be able to export the mesh and textures from the iPad Pro LiDAR.

There are some examples here of how to export the mesh, but I would also like to be able to export the environment texture:

ARKit 3.5 – How to export OBJ from new iPad Pro with LiDAR?

ARMeshGeometry stores the vertices of the mesh. Does that mean the textures would have to be "recorded" while scanning the environment and then applied manually?

This post seems to show a way to get texture coordinates, but I can't see how to do that with ARMeshGeometry: Save ARFaceGeometry to OBJ file

Anything that points me in the right direction, or anything worth looking at, would be greatly appreciated!

Chris

1 answer:

Answer 0 (score: 2):

You need to compute texture coordinates for each vertex, apply them to the mesh, and supply the texture as a material for the mesh.
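The snippet below relies on a meshAnchor and arFrame that the answer does not define. As a minimal sketch of where they could come from (assuming an ARSCNView named sceneView and a world-tracking session with scene reconstruction enabled, both my assumptions, not part of the original answer):

import ARKit
import SceneKit

// Sketch only: obtaining the meshAnchor and arFrame used in the snippet below.
// Assumes `sceneView` is an ARSCNView whose session runs an
// ARWorldTrackingConfiguration with sceneReconstruction = .mesh (LiDAR devices).
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let meshAnchor = anchor as? ARMeshAnchor,
          let arFrame = sceneView.session.currentFrame else { return }
    // ...compute texture coordinates and build the textured geometry as shown below...
}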

let geom = meshAnchor.geometry
// geom.vertices is an ARGeometrySource backed by an MTLBuffer, so read each
// vertex out of the buffer as a SIMD3<Float> before mapping over it.
let vertexSource = geom.vertices
let vertices: [SIMD3<Float>] = (0..<vertexSource.count).map { index in
    let pointer = vertexSource.buffer.contents()
        .advanced(by: vertexSource.offset + vertexSource.stride * index)
    let v = pointer.assumingMemoryBound(to: (Float, Float, Float).self).pointee
    return SIMD3<Float>(v.0, v.1, v.2)
}
let size = arFrame.camera.imageResolution
let camera = arFrame.camera

let modelMatrix = meshAnchor.transform

let textureCoordinates = vertices.map { vertex -> vector_float2 in
    let vertex4 = vector_float4(vertex.x, vertex.y, vertex.z, 1)
    // meshAnchor.transform is not optional, so no force unwrap is needed
    let world_vertex4 = simd_mul(modelMatrix, vertex4)
    let world_vector3 = simd_float3(x: world_vertex4.x, y: world_vertex4.y, z: world_vertex4.z)
    let pt = camera.projectPoint(world_vector3,
        orientation: .portrait,
        viewportSize: CGSize(
            width: CGFloat(size.height),
            height: CGFloat(size.width)))
    let v = 1.0 - Float(pt.x) / Float(size.height)
    let u = Float(pt.y) / Float(size.width)
    return vector_float2(u, v)
}

// Construct verticesSource, normalsSource and facesSource from the source geometry
// directly (see the sketch further below), wrap the computed texture coordinates in
// an SCNGeometrySource, create the new geometry, and then apply the texture.

let uvSource = SCNGeometrySource(textureCoordinates: textureCoordinates.map {
    CGPoint(x: CGFloat($0.x), y: CGFloat($0.y))
})
let scnGeometry = SCNGeometry(sources: [verticesSource, uvSource, normalsSource], elements: [facesSource])

// UIImage has no built-in pixelBuffer initializer; this relies on a small helper
// extension (see the sketch at the end of the answer).
let texture = UIImage(pixelBuffer: arFrame.capturedImage)
let imageMaterial = SCNMaterial()
imageMaterial.isDoubleSided = false
imageMaterial.diffuse.contents = texture
scnGeometry.materials = [imageMaterial]
let pcNode = SCNNode(geometry: scnGeometry)

pcNode, if added to the scene, will contain the mesh with the texture applied.
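For reference, here is one way the placeholder verticesSource, normalsSource and facesSource from the snippet above could be built from the ARMeshGeometry buffers. This is a sketch of my own, not part of the original answer; it assumes the vertex and normal sources use the float3 layout that ARKit reports and that the faces are triangles.

// Sketch: building SceneKit geometry sources/elements from ARMeshGeometry.
let verticesSource = SCNGeometrySource(
    buffer: geom.vertices.buffer,
    vertexFormat: geom.vertices.format,
    semantic: .vertex,
    vertexCount: geom.vertices.count,
    dataOffset: geom.vertices.offset,
    dataStride: geom.vertices.stride)

let normalsSource = SCNGeometrySource(
    buffer: geom.normals.buffer,
    vertexFormat: geom.normals.format,
    semantic: .normal,
    vertexCount: geom.normals.count,
    dataOffset: geom.normals.offset,
    dataStride: geom.normals.stride)

// geom.faces is an ARGeometryElement; copy its index buffer into Data for SceneKit.
let faces = geom.faces
let facesData = Data(
    bytes: faces.buffer.contents(),
    count: faces.count * faces.indexCountPerPrimitive * faces.bytesPerIndex)
let facesSource = SCNGeometryElement(
    data: facesData,
    primitiveType: .triangles,
    primitiveCount: faces.count,
    bytesPerIndex: faces.bytesPerIndex)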

The texture coordinates are calculated as described here.
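The UIImage(pixelBuffer:) call used above is not a UIKit initializer. A minimal helper along these lines (my sketch, going through Core Image, not part of the original answer) is one way to provide it:

import UIKit
import CoreImage
import CoreVideo

// Sketch: convenience initializer so UIImage(pixelBuffer:) works with the
// arFrame.capturedImage CVPixelBuffer.
extension UIImage {
    convenience init?(pixelBuffer: CVPixelBuffer) {
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        let context = CIContext()
        guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
            return nil
        }
        self.init(cgImage: cgImage)
    }
}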