I'm trying to take an AVCaptureSession and encode it to mp4. This seems like it should be straightforward: I'm trying to encode a single 960x540 video stream (I'm not worried about audio for this question).
When I run the code below and pull out2.mp4 out of the document container with Xcode, I get a black screen in QuickTime with a duration of 46 hours. At least the resolution looks right. Here's the output of ffmpeg -i out2.mp4:
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'out2.mp4':
      Metadata:
        major_brand     : mp42
        minor_version   : 1
        compatible_brands: mp41mp42isom
        creation_time   : 2015-11-18 01:25:55
      Duration: 46:43:04.21, start: 168178.671667, bitrate: 0 kb/s
        Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, smpte170m/bt709/bt709), 960x540, 1860 kb/s, 27.65 fps, 29.97 tbr, 600 tbn, 1200 tbc (default)
        Metadata:
          creation_time   : 2015-11-18 01:25:55
          handler_name    : Core Media Video
Why can't I append sample buffers to my AVAssetWriterInput in this setup?
    var videoInput: AVAssetWriterInput?
    var assetWriter: AVAssetWriter?

    override func viewDidLoad() {
        super.viewDidLoad()
        self.startStream()
        NSTimer.scheduledTimerWithTimeInterval(5, target: self, selector: "swapSegment", userInfo: nil, repeats: false)
    }

    func swapSegment() {
        assetWriter?.finishWritingWithCompletionHandler() {
            print("File written")
        }
        videoInput = nil
    }

    func pathForOutput() -> String {
        let urls = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)
        if let documentDirectory: NSURL = urls.first {
            let fileUrl = documentDirectory.URLByAppendingPathComponent("out1.mp4")
            return fileUrl.path!
        }
        return ""
    }

    func startStream() {
        assetWriter = try! AVAssetWriter(URL: NSURL(fileURLWithPath: self.pathForOutput()), fileType: AVFileTypeMPEG4)
        let videoSettings: [String: AnyObject] = [AVVideoCodecKey: AVVideoCodecH264, AVVideoWidthKey: 960, AVVideoHeightKey: 540]
        videoInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: videoSettings)
        videoInput!.expectsMediaDataInRealTime = true
        assetWriter?.addInput(videoInput!)
        assetWriter!.startWriting()
        assetWriter!.startSessionAtSourceTime(kCMTimeZero)

        let videoHelper = VideoHelper()
        videoHelper.delegate = self
        videoHelper.startSession()
    }

    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBufferRef, fromConnection connection: AVCaptureConnection!) {
        if let videoOutput = captureOutput as? AVCaptureVideoDataOutput {
            videoInput?.appendSampleBuffer(sampleBuffer)
        }
    }
Answer 0 (score: 5)
Perhaps your presentation timestamps are unrelated to your sourceTime (kCMTimeZero). You could use the presentation timestamp of the first buffer as your source time.
P.S. Maybe 46 hours is roughly your device's uptime.
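A minimal sketch of that fix, using the same Swift 2 era delegate callback as in the question (the `sessionStarted` flag is an addition of mine, and `startSessionAtSourceTime(kCMTimeZero)` would be removed from `startStream()`): defer starting the writer session until the first buffer arrives, and use that buffer's presentation timestamp as the source time.

```swift
// Assumption: assetWriter and videoInput are the same properties as in the
// question, and startWriting() has already been called in startStream().
var sessionStarted = false

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBufferRef, fromConnection connection: AVCaptureConnection!) {
    // Capture timestamps come from the host clock, not from zero, so the
    // session must start at the first buffer's PTS to avoid a huge leading gap.
    let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    if !sessionStarted {
        assetWriter?.startSessionAtSourceTime(pts)
        sessionStarted = true
    }
    // Only append when the input can accept data, as recommended for
    // real-time sources.
    if videoInput?.readyForMoreMediaData == true {
        videoInput?.appendSampleBuffer(sampleBuffer)
    }
}
```

This also explains the 46-hour duration: since the buffers' timestamps are host-clock based, starting the session at kCMTimeZero makes the file's timeline begin at zero but the first frame land at roughly the device's uptime (note the `start: 168178.671667` seconds in the ffmpeg output, which is about 46.7 hours).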