I am using the code below to encode a video from a set of local images. The problem is that I have 30 images but only get a 1-second video. Is there a way to get a 30-second video at a 24 fps frame rate?
- (BOOL)encodeReadySamplesFromOutput:(AVAssetReaderOutput *)output toInput:(AVAssetWriterInput *)input
{
    NSLog(@"Frame init m == %d", m);
    while (input.isReadyForMoreMediaData)
    {
        CMSampleBufferRef sampleBuffer = [output copyNextSampleBuffer];
        if (sampleBuffer)
        {
            BOOL handled = NO;
            BOOL error = NO;

            CMItemCount count;
            CMSampleBufferGetSampleTimingInfoArray(sampleBuffer, 0, nil, &count);
            CMSampleTimingInfo *timingInfo = malloc(sizeof(CMSampleTimingInfo) * count);
            CMSampleBufferGetSampleTimingInfoArray(sampleBuffer, count, timingInfo, &count);
            for (CMItemCount i = 0; i < count; i++)
            {
                timingInfo[i].decodeTimeStamp = kCMTimeInvalid;
                timingInfo[i].presentationTimeStamp = CMTimeMake(m, 24);
                // timingInfo[i].duration = CMTimeMake(1, 12);
            }

            CMSampleBufferRef completedSampleBuffer;
            CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, sampleBuffer, count, timingInfo, &completedSampleBuffer);
            free(timingInfo);

            if (self.reader.status != AVAssetReaderStatusReading || self.writer.status != AVAssetWriterStatusWriting)
            {
                handled = YES;
                error = YES;
            }
            if (!handled && self.videoOutput == output)
            {
                // update the video progress
                ++m;
                NSLog(@"Frame m == %d", m);
                lastSamplePresentationTime = CMSampleBufferGetPresentationTimeStamp(completedSampleBuffer);
                CMTimeValue value = lastSamplePresentationTime.value;
                CMTimeScale scale = lastSamplePresentationTime.timescale;
                NSLog(@"Frame value == %lld", value);
                NSLog(@"Frame scale == %d", scale);
                self.progress = duration == 0 ? 1 : CMTimeGetSeconds(lastSamplePresentationTime) / duration;
                if ([self.delegate respondsToSelector:@selector(exportSession:renderFrame:withPresentationTime:toBuffer:)])
                {
                    CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)CMSampleBufferGetImageBuffer(completedSampleBuffer);
                    CVPixelBufferRef renderBuffer = NULL;
                    CVPixelBufferPoolCreatePixelBuffer(NULL, self.videoPixelBufferAdaptor.pixelBufferPool, &renderBuffer);
                    [self.delegate exportSession:self renderFrame:pixelBuffer withPresentationTime:lastSamplePresentationTime toBuffer:renderBuffer];
                    if (![self.videoPixelBufferAdaptor appendPixelBuffer:renderBuffer withPresentationTime:lastSamplePresentationTime])
                    {
                        error = YES;
                    }
                    CVPixelBufferRelease(renderBuffer);
                    handled = YES;
                }
            }
            if (!handled && ![input appendSampleBuffer:completedSampleBuffer])
            {
                error = YES;
            }
            CFRelease(sampleBuffer);
            CFRelease(completedSampleBuffer);
            if (error)
            {
                return NO;
            }
        }
        else
        {
            [input markAsFinished];
            return NO;
        }
    }
    return YES;
}
Answer 0 (score: 1)
No, not unless you either get more images or repeat the ones you have.

In either case, you have to calculate the presentation times yourself, e.g. with CMTimeMake(m, 24), like so:

[self.videoPixelBufferAdaptor appendPixelBuffer:renderBuffer withPresentationTime:CMTimeMake(m, 24)];
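For the "repeat your pictures" option, here is a minimal sketch of a write loop, assuming the adaptor from the question (self.videoPixelBufferAdaptor) is attached to a started AVAssetWriter, and assuming a hypothetical helper pixelBufferForImageAtIndex: that converts each of the 30 source images into a CVPixelBufferRef. Every image is held for 24 frames, so 30 images fill 30 seconds at 24 fps:

// Hold each of the 30 source images for 24 frames -> 30 s at 24 fps.
const int32_t fps = 24;
const int64_t framesPerImage = 24;   // one second per image
int64_t frameIndex = 0;
for (NSUInteger imageIndex = 0; imageIndex < 30; imageIndex++)
{
    // Hypothetical helper: returns a retained CVPixelBufferRef for image #imageIndex.
    CVPixelBufferRef pixelBuffer = [self pixelBufferForImageAtIndex:imageIndex];
    for (int64_t repeat = 0; repeat < framesPerImage; repeat++)
    {
        // Wait until the writer input can accept another frame.
        while (!self.videoPixelBufferAdaptor.assetWriterInput.isReadyForMoreMediaData)
        {
            [NSThread sleepForTimeInterval:0.01];
        }
        CMTime presentationTime = CMTimeMake(frameIndex, fps);
        [self.videoPixelBufferAdaptor appendPixelBuffer:pixelBuffer
                                   withPresentationTime:presentationTime];
        frameIndex++;
    }
    CVPixelBufferRelease(pixelBuffer);
}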
If you drop the 24 fps requirement (why do you need it anyway?), you can pass CMTimeMake(m, 1) to appendPixelBuffer:withPresentationTime: and get a 30-second video from your 30 images at 1 fps.
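The 1 fps variant is the same loop with a timescale of 1 and no repeated frames (same assumptions as the sketch above, including the hypothetical pixelBufferForImageAtIndex: helper):

// 30 images, one frame per image, 1 fps -> a 30-second video.
for (int64_t imageIndex = 0; imageIndex < 30; imageIndex++)
{
    CVPixelBufferRef pixelBuffer = [self pixelBufferForImageAtIndex:(NSUInteger)imageIndex];
    while (!self.videoPixelBufferAdaptor.assetWriterInput.isReadyForMoreMediaData)
    {
        [NSThread sleepForTimeInterval:0.01];
    }
    // Presentation time is simply imageIndex seconds.
    [self.videoPixelBufferAdaptor appendPixelBuffer:pixelBuffer
                               withPresentationTime:CMTimeMake(imageIndex, 1)];
    CVPixelBufferRelease(pixelBuffer);
}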