Filtering video with GPUImage

Time: 2014-03-24 21:32:35

Tags: ios objective-c gpuimage avassetwriter

I'm using GPUImage in my app and trying to filter video. Live video filtering works fine. The trouble arises when I try to read a video from the file system into memory and apply a filter using the code posted on the sunsetlakessoftware tutorial page and in the SimpleVideoFileFilter demo.

Edit: I realized my original post may not have asked a specific enough question. What I'm asking is: how can I read a video from disk into memory, apply a GPUImageFilter, and then overwrite the original file with the filtered version?

The app crashes with the following error:

-[AVAssetWriter startWriting] Cannot call method when status is 2

Status 2 is AVAssetWriterStatusCompleted. I've seen the same failure occur with each of the three other AVAssetWriterStatus values as well.
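For context on those status values: AVAssetWriter only accepts startWriting while it is still in AVAssetWriterStatusUnknown; once it has completed, failed, or been cancelled it can never be restarted. A minimal guard, assuming a hypothetical writer variable:

```objc
// startWriting may be called exactly once, while the writer's status is
// still AVAssetWriterStatusUnknown; any other status raises the exception above.
if (writer.status == AVAssetWriterStatusUnknown) {
    [writer startWriting];
} else {
    // A completed/failed/cancelled writer cannot be reused -- create a new one.
    NSLog(@"Writer already used (status %ld)", (long)writer.status);
}
```

GPUImageMovieWriter wraps an AVAssetWriter internally, so hitting this exception usually means a writer instance is being started a second time rather than freshly allocated.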

I've posted the relevant code below.

GPUImageFilter *selectedFilter = [self.allFilters objectAtIndex:indexPath.item];

// get the file url I stored when the video was initially captured
NSURL *url = [self.videoURLsByIndexPath objectForKey:self.indexPathForDisplayedImage];

GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:url];
movieFile.runBenchmark = YES;
movieFile.playAtActualSpeed = NO;
[movieFile addTarget:selectedFilter]; // apply the user-selected filter to the file

unlink([url.absoluteString UTF8String]); // delete the file that was at that file URL so it's writeable

// A different movie writer than the one I was using for live video capture.
GPUImageMovieWriter *editingMovieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:url size:CGSizeMake(640.0, 640.0)];

[selectedFilter addTarget:editingMovieWriter];

editingMovieWriter.shouldPassthroughAudio = YES;
movieFile.audioEncodingTarget = editingMovieWriter;
[movieFile enableSynchronizedEncodingUsingMovieWriter:editingMovieWriter];

[editingMovieWriter startRecording];
[movieFile startProcessing]; // Commenting out this line prevents crash

// weak variables to prevent retain cycle
__weak GPUImageMovieWriter *weakWriter = editingMovieWriter;
__weak id weakSelf = self;
[editingMovieWriter setCompletionBlock:^{
    [selectedFilter removeTarget:weakWriter];
    [weakWriter finishRecording];
    [weakSelf savePhotosToLibrary]; // use ALAssetsLibrary to write to camera roll
}];

Maybe my problem is the scope of editingMovieWriter. Or maybe it's that I'm initializing the GPUImageMovie instance with the same URL I'm trying to write to. I've read several posts on the GPUImage GitHub issues page, several related posts on SO, the README, and the tutorial linked above.

Any insight into this issue would be greatly appreciated. Thanks.

2 answers:

Answer 0 (score: 1):

There may be at least one thing going wrong here. In the code above, you aren't holding a strong reference to the movieFile source object.

If this is an ARC-enabled project, that object will be deallocated the instant you finish your setup method (and if it isn't, you're leaking that object). That will stop the movie playback, deallocate the movie itself, and lead to black frames being sent down the filter pipeline (among other potential instabilities).

You need to make movieFile a strongly-referenced instance variable to make sure it hangs around past this setup method, because all of the movie processing is asynchronous.
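Applied to the code in the question, that fix is a one-property change; a sketch (the property placement and the cleanup in the completion block are illustrative):

```objc
// In the class extension, so ARC keeps the movie alive during async processing:
@property (nonatomic, strong) GPUImageMovie *movieFile;

// In the setup method, replace the local variable with the property:
self.movieFile = [[GPUImageMovie alloc] initWithURL:url];
self.movieFile.runBenchmark = YES;
self.movieFile.playAtActualSpeed = NO;
[self.movieFile addTarget:selectedFilter];

// In the completion block, release it once processing is done:
__weak typeof(self) weakSelf = self;
[editingMovieWriter setCompletionBlock:^{
    [weakSelf.movieFile removeAllTargets];
    weakSelf.movieFile = nil;
}];
```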

Answer 1 (score: 0):

Here is a solution:

Declare the movie source, filter, and writer as strongly-referenced properties so they survive the asynchronous processing (the unused picture/sepia-filter declarations from the original have been dropped):

    var movieFile: GPUImageMovie!
    var movieWriter: GPUImageMovieWriter!
    var filter: GPUImageInput!
    var paths: NSURL!

// Filter the movie file

    func startWriting()
    {
        // Show a progress HUD while the movie is processed
        let loadingNotification = MBProgressHUD.showHUDAddedTo(self.view, animated: true)
        loadingNotification.mode = MBProgressHUDMode.Indeterminate
        loadingNotification.labelText = "Loading"

        // Step 1: load the source movie from the Documents directory
        let documentsURL = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)[0] as! NSURL
        let pathToMovie = documentsURL.URLByAppendingPathComponent("temp.mov")
        self.movieFile = GPUImageMovie(URL: pathToMovie)
        self.movieFile.runBenchmark = true
        self.movieFile.playAtActualSpeed = false

        // Step 2: attach the filter to the movie source
        self.filter = GPUImageGrayscaleFilter()
        self.movieFile.addTarget(self.filter)

        // Step 3: write to a *different* URL than the source, removing any stale output first
        self.paths = documentsURL.URLByAppendingPathComponent("temp1.mov")
        NSFileManager.defaultManager().removeItemAtURL(self.paths, error: nil)

        // Step 4: size the writer to the source video track's natural size
        let anAsset = AVAsset.assetWithURL(pathToMovie) as! AVAsset
        let videoAssetTrack = anAsset.tracksWithMediaType(AVMediaTypeVideo)[0] as! AVAssetTrack
        let naturalSize = videoAssetTrack.naturalSize
        self.movieWriter = GPUImageMovieWriter(movieURL: self.paths, size: naturalSize)
        (self.filter as! GPUImageOutput).addTarget(self.movieWriter)

        // Step 5: only pass audio through if the asset actually has an audio track
        self.movieWriter.shouldPassthroughAudio = true
        if anAsset.tracksWithMediaType(AVMediaTypeAudio).count > 0 {
            self.movieFile.audioEncodingTarget = self.movieWriter
        } else {
            self.movieFile.audioEncodingTarget = nil
        }

        // Step 6: start processing
        self.movieFile.enableSynchronizedEncodingUsingMovieWriter(self.movieWriter)
        self.movieWriter.startRecording()
        self.movieFile.startProcessing()

        self.movieWriter.completionBlock = { () -> Void in
            self.movieWriter.finishRecording()
            self.obj.performWithAsset(self.paths)
        }

        // Hide the HUD after the processing has had time to finish
        let delayTime = dispatch_time(DISPATCH_TIME_NOW, Int64(15 * Double(NSEC_PER_SEC)))
        dispatch_after(delayTime, dispatch_get_main_queue()) {
            MBProgressHUD.hideAllHUDsForView(self.view, animated: true)
        }
        hasoutput = true
    }