AVAssetExportSession - unable to compose video

Asked: 2014-11-22 21:22:34

Tags: ios xamarin.ios avassetexportsession

I'm trying to do some basic video composition in Xamarin/MonoTouch and have had some success, but I'm now stuck on what seems like a fairly simple task.

I record video from the camera in portrait, so I use AVAssetExportSession to rotate the video. I create a layer instruction to rotate the video, and that works fine: I can successfully export the video in the correct orientation.

The problem:

When I add the audio track to the export, I always get a failure response:

    Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" UserInfo=0x1912c320 {NSLocalizedDescription=Operation Stopped, NSLocalizedFailureReason=The video could not be composed.}

If I don't set the videoComposition property on the exportSession, the audio and video export perfectly, just in the wrong orientation. If anyone could give me some advice it would be much appreciated. Below is my code:

            var composition = new AVMutableComposition();
            var compositionTrackAudio = composition.AddMutableTrack(AVMediaType.Audio, 0);
            var compositionTrackVideo = composition.AddMutableTrack(AVMediaType.Video, 0);
            var videoCompositionInstructions = new AVVideoCompositionInstruction[files.Count];
            var index = 0;
            var renderSize = new SizeF(480, 480);
            var _startTime = CMTime.Zero;

            var asset = new AVUrlAsset(new NSUrl(file, false), new AVUrlAssetOptions());


            //create an avassetrack with our asset
            var videoTrack = asset.TracksWithMediaType(AVMediaType.Video)[0];
            var audioTrack = asset.TracksWithMediaType(AVMediaType.Audio)[0];

            //create a video composition and preset some settings

            NSError error;

            var assetTimeRange = new CMTimeRange { Start = CMTime.Zero, Duration = asset.Duration };

            compositionTrackAudio.InsertTimeRange(new CMTimeRange
            {
                Start = CMTime.Zero,
                Duration = asset.Duration,
            }, audioTrack, _startTime, out error);

            if (error != null) {
                Debug.WriteLine (error.Description);
            }

            compositionTrackVideo.InsertTimeRange(assetTimeRange, videoTrack, _startTime, out error);

            //create a video instruction

            var transformer = new AVMutableVideoCompositionLayerInstruction
            {
                TrackID = videoTrack.TrackID,
            };

            var audioMix = new AVMutableAudioMix ();
            var mixParameters = new AVMutableAudioMixInputParameters{ 
                TrackID = audioTrack.TrackID
            };

            mixParameters.SetVolumeRamp (1.0f, 1.0f, new CMTimeRange {
                Start = CMTime.Zero,
                Duration = asset.Duration
            });


            audioMix.InputParameters = new [] { mixParameters };
            var t1 = CGAffineTransform.MakeTranslation(videoTrack.NaturalSize.Height, 0);
            //Make sure the square is portrait
            var t2 = CGAffineTransform.Rotate(t1, (float)(Math.PI / 2f));
            var finalTransform = t2;

            transformer.SetTransform(finalTransform, CMTime.Zero);
            //add the transformer layer instructions, then add to video composition

            var instruction = new AVMutableVideoCompositionInstruction
            {
                TimeRange = assetTimeRange,
                LayerInstructions = new []{ transformer }
            };
            videoCompositionInstructions[index] = instruction;
            index++;
            _startTime = CMTime.Add(_startTime, asset.Duration);

            var videoComposition = new AVMutableVideoComposition();
            videoComposition.FrameDuration = new CMTime(1 , (int)videoTrack.NominalFrameRate);
            videoComposition.RenderScale = 1;
            videoComposition.Instructions = videoCompositionInstructions;
            videoComposition.RenderSize = renderSize;

            var exportSession = new AVAssetExportSession(composition, AVAssetExportSession.PresetHighestQuality);

            var filePath = _fileSystemManager.TempDirectory + DateTime.UtcNow.Ticks + ".mp4";

            var outputLocation = new NSUrl(filePath, false);

            exportSession.OutputUrl = outputLocation;
            exportSession.OutputFileType = AVFileType.Mpeg4;
            exportSession.VideoComposition = videoComposition;
            exportSession.AudioMix = audioMix;
            exportSession.ShouldOptimizeForNetworkUse = true;
            exportSession.ExportAsynchronously(() =>
            {
                Debug.WriteLine(exportSession.Status);

                switch (exportSession.Status)
                {

                    case AVAssetExportSessionStatus.Failed:
                        {
                            Debug.WriteLine(exportSession.Error.Description);
                            Debug.WriteLine(exportSession.Error.DebugDescription);
                            break;
                        }
                    case AVAssetExportSessionStatus.Completed:
                        {
                            if (File.Exists(filePath))
                            {
                                _uploadService.AddVideoToVideoByteList(File.ReadAllBytes(filePath), ".mp4");
                                Task.Run(async () =>
                                {
                                    await _uploadService.UploadVideo(_videoData);
                                });
                            }
                            break;
                        }
                    case AVAssetExportSessionStatus.Unknown:
                        {
                            break;
                        }
                    case AVAssetExportSessionStatus.Exporting:
                        {
                            break;
                        }
                    case AVAssetExportSessionStatus.Cancelled:
                        {
                            break;
                        }

                }
            });

2 Answers:

Answer 0 (score: 3)

So this turned out to be a really silly mistake: because the audio track was added to the composition before the video track, the layer instruction was trying to apply the transform to the audio track instead of the video track.
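A minimal sketch of the fix, reusing the variable names from the question's code (this is an illustration, not the answerer's exact code): add the video track to the composition first, and, more robustly, point the layer instruction at the composition's own video track rather than the source asset's track, so the transform targets the right track regardless of insertion order.

    // Add the video track before the audio track...
    var composition = new AVMutableComposition();
    var compositionTrackVideo = composition.AddMutableTrack(AVMediaType.Video, 0);
    var compositionTrackAudio = composition.AddMutableTrack(AVMediaType.Audio, 0);

    // ... insert the source time ranges as in the question ...

    // ...and build the layer instruction against the composition's
    // video track ID, not the source asset's videoTrack.TrackID.
    var transformer = new AVMutableVideoCompositionLayerInstruction
    {
        TrackID = compositionTrackVideo.TrackID,
    };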

Answer 1 (score: 0)

My problem was that I forgot to set the timeRange; it should look like this:

let instruction = AVMutableVideoCompositionInstruction()
instruction.layerInstructions = [layer]
instruction.timeRange = CMTimeRange(start: kCMTimeZero, duration: videoDuration)

Note that the end time of AVMutableVideoCompositionInstruction.timeRange must be valid. It is not the same thing as AVAssetExportSession.timeRange, whose documentation reads:

    The time range to be exported from the source. The default time range of an export session is kCMTimeZero to kCMTimePositiveInfinity, meaning that (modulo a possibly enforced limit on file length) the full duration of the asset will be exported. You can observe this property using key-value observing.
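For contrast, a hedged C# sketch (mirroring the question's Xamarin code, with `instruction` and `exportSession` assumed to be set up as above) showing the two distinct ranges: the composition instruction's time range is required and must cover the composed video, while the export session's TimeRange is an optional trim that defaults to the full duration.

    // Required: the instruction must have a valid time range covering the video.
    instruction.TimeRange = new CMTimeRange
    {
        Start = CMTime.Zero,
        Duration = asset.Duration,
    };

    // Optional: trims the exported output; if unset, the whole asset is exported.
    exportSession.TimeRange = new CMTimeRange
    {
        Start = CMTime.Zero,
        Duration = asset.Duration,
    };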