I'm using Swift 4 and have an app where people can open their phone's camera. I have a view controller named CameraController with a default UIView, and inside it a view named CameraView that displays the user's camera feed along with some buttons.

When I tap one of the buttons, a segue takes me to another view controller (PlacesController). When I dismiss PlacesController and return to CameraController, the subviews now take about 8 to 10 seconds to appear again.

Is there a way to visit the other controller while keeping my current subviews alive? Put another way: after I segue to PlacesController and come back to CameraController, it takes roughly 8 to 10 seconds before the camera and its sublayers become visible. In particular, I'd like to keep the sublayer in the line below running, because waiting 10 seconds for it to show is far too long:

    self.CameraView.layer.insertSublayer(previewLayer!, at: 0)

Here is my code:
import UIKit
import AVFoundation

class CameraController: UIViewController {

    @IBOutlet weak var CameraView: UIView!

    var previewLayer: AVCaptureVideoPreviewLayer?
    let captureSession = AVCaptureSession()
    let stillImageOutput = AVCaptureStillImageOutput()

    override func viewDidLoad() {
        super.viewDidLoad()
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        DispatchQueue.main.async {
            self.beginSession()
        }
    }

    func beginSession() {
        // Gets the camera showing and displays buttons on top of it
        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        self.CameraView.layer.insertSublayer(previewLayer!, at: 0)
        previewLayer?.frame = self.CameraView.layer.bounds
        previewLayer?.videoGravity = AVLayerVideoGravity.resizeAspectFill
        captureSession.startRunning()

        stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecType.jpeg]
        if captureSession.canAddOutput(stillImageOutput) {
            captureSession.addOutput(stillImageOutput)
        }
    }

    override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
        if segue.identifier == "PlacesController" {
            let placeAreaSearchC = segue.destination as! PlacesController
            placeAreaSearchC.delegate = self
        }
    }
}

// PlacesController
class PlacesController: UIViewController {

    @IBAction func backAction(_ sender: Any) {
        // This is how I go back to my CameraController
        dismiss(animated: true, completion: nil)
    }
}
Answer 0 (score: 16)
AVCaptureSession's startRunning call is blocking your main thread, which causes the delay. As Apple's documentation for startRunning() states:

    The startRunning() method is a blocking call which can take some time, therefore you should perform session setup on a serial queue so that the main queue isn't blocked (which keeps the UI responsive).
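The fix, then, is to perform session setup and startRunning() on a private serial queue, hopping back to the main queue only for the UI work. A minimal sketch of that pattern (the sessionQueue name, the CameraSessionController type, and the onReady callback are illustrative assumptions, not the asker's code):

```swift
import AVFoundation

final class CameraSessionController {
    let captureSession = AVCaptureSession()
    // A private serial queue so startRunning() never blocks the main thread.
    private let sessionQueue = DispatchQueue(label: "camera.session.queue")

    func start(onReady: @escaping () -> Void) {
        sessionQueue.async {
            // Configure inputs/outputs here, then make the blocking call
            // off the main thread.
            self.captureSession.startRunning()
            DispatchQueue.main.async {
                onReady() // safe place to insert the preview sublayer
            }
        }
    }
}
```

With this structure, returning from PlacesController no longer freezes the UI while the session spins up; the preview sublayer is attached in the onReady callback once the session is actually running.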
Answer 1 (score: 6)
In addition to Lyndsey Scott's answer:
DispatchQueue.global(qos: .userInitiated).async { [weak self] in
    let backCamera: AVCaptureDevice? = AVCaptureDevice.default(AVCaptureDevice.DeviceType.builtInWideAngleCamera, for: AVMediaType.video, position: AVCaptureDevice.Position.back)
    guard let camera = backCamera, let deviceInput = try? AVCaptureDeviceInput(device: camera) else {
        self?.didReceive(captureSession: nil)
        return
    }
    let deviceOutput = AVCapturePhotoOutput()
    let cameraSession = AVCaptureSession()
    cameraSession.beginConfiguration()
    cameraSession.sessionPreset = .low
    if cameraSession.canAddInput(deviceInput) {
        cameraSession.addInput(deviceInput)
    }
    if cameraSession.canAddOutput(deviceOutput) {
        cameraSession.addOutput(deviceOutput)
    }
    cameraSession.commitConfiguration()
    cameraSession.startRunning()
    DispatchQueue.main.async {
        self?.didReceive(captureSession: cameraSession)
    }
}
Xcode 10.0,Swift 4.2
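On the view-controller side, the didReceive(captureSession:) callback used above would attach the preview layer on the main thread once the session is ready. The method name comes from the answer's code; its body below is a hedged sketch of one possible implementation, not part of the original answer:

```swift
import AVFoundation
import UIKit

extension CameraController {
    // Hypothetical completion handler: invoked on the main queue after
    // the session was configured and started on the background queue.
    func didReceive(captureSession: AVCaptureSession?) {
        guard let session = captureSession else {
            return // camera unavailable or input setup failed
        }
        let layer = AVCaptureVideoPreviewLayer(session: session)
        layer.frame = CameraView.layer.bounds
        layer.videoGravity = .resizeAspectFill
        CameraView.layer.insertSublayer(layer, at: 0)
        previewLayer = layer
    }
}
```

Because the session is already running by the time this is called, the preview appears as soon as the sublayer is inserted, instead of stalling the UI for several seconds.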