Video capture in iOS using MonoTouch

Date: 2011-05-10 16:42:47

Tags: ios delegates xamarin.ios queue video-capture

I have code that creates, configures, and starts a video capture session in Objective-C without any problems. I ported the sample to C# and MonoTouch 4.0.3 and ran into some problems; here is the code:

    void Initialize ()
    {   
        // Create notifier delegate class 
        captureVideoDelegate = new CaptureVideoDelegate(this);

        // Create capture session
        captureSession = new AVCaptureSession();
        captureSession.SessionPreset = AVCaptureSession.Preset640x480;

        // Create capture device
        captureDevice = AVCaptureDevice.DefaultDeviceWithMediaType(AVMediaType.Video);

        // Create capture device input
        NSError error;
        captureDeviceInput = new AVCaptureDeviceInput(captureDevice, out error);
        captureSession.AddInput(captureDeviceInput);

        // Create capture device output
        captureVideoOutput = new AVCaptureVideoDataOutput();
        captureSession.AddOutput(captureVideoOutput);
        captureVideoOutput.VideoSettings.PixelFormat = CVPixelFormatType.CV32BGRA;
        captureVideoOutput.MinFrameDuration = new CMTime(1, 30);
        //
        // ISSUE 1
        // In the original Objective-C code I created a dispatch_queue_t object and passed it to the
        // setSampleBufferDelegate:queue: message, which worked; here I could not find an equivalent
        // to the queue mechanism. (Also not sure if the delegate should be used like this.)
        //
        captureVideoOutput.SetSampleBufferDelegatequeue(captureVideoDelegate, ???????);

        // Create preview layer
        previewLayer = AVCaptureVideoPreviewLayer.FromSession(captureSession);
        previewLayer.Orientation = AVCaptureVideoOrientation.LandscapeRight;
        //
        // ISSUE 2:
        // Didn't find any VideoGravity related enumeration in MonoTouch (not sure if string will work)
        //
        previewLayer.VideoGravity = "AVLayerVideoGravityResizeAspectFill";
        previewLayer.Frame = new RectangleF(0, 0, 1024, 768);
        this.View.Layer.AddSublayer(previewLayer);

        // Start capture session
        captureSession.StartRunning();

    }

    #endregion

    public class CaptureVideoDelegate : AVCaptureVideoDataOutputSampleBufferDelegate
    {
        private VirtualDeckViewController mainViewController;

        public CaptureVideoDelegate(VirtualDeckViewController viewController)
        {
            mainViewController = viewController;
        }

        public override void DidOutputSampleBuffer (AVCaptureOutput captureOutput, CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
        {
            // TODO: Implement - see: http://go-mono.com/docs/index.aspx?link=T%3aMonoTouch.Foundation.ModelAttribute

        }
    }

Issue 1: I am not sure how to use the delegate correctly in the SetSampleBufferDelegatequeue method. I also could not find a mechanism equivalent to the dispatch_queue_t object, which works fine in Objective-C, to pass in as the second parameter.
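
(As the answers below show, MonoTouch exposes Grand Central Dispatch queues as MonoTouch.CoreFoundation.DispatchQueue; a minimal sketch of the call, assuming that binding and the SetSampleBufferDelegateAndQueue method used in the answers:)

    // Sketch only: create a GCD queue and hand it to the video output together
    // with the delegate (binding names as used in the answers below)
    var queue = new DispatchQueue("videoCaptureQueue"); // queue name is arbitrary
    captureVideoOutput.SetSampleBufferDelegateAndQueue(captureVideoDelegate, queue);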

Issue 2: I did not find any VideoGravity-related enumeration in the MonoTouch library, and I am not sure whether passing a string with the constant's value will work.

I have searched for any clue to solve this, but found no clear samples. Any sample or information on how to do this in MonoTouch would be highly appreciated.

Thanks a lot.

2 answers:

Answer 0 (score: 1)

Here is my code; make good use of it. I trimmed it down to just the important things: all of the initialization is there, as well as the reading of the sample output buffer.

I then have code that processes the CVImageBuffer in a linked custom Objective-C library. If you need to process it in MonoTouch, you need to go the extra mile and convert it to a CGImage or UIImage. There is no function for that in MonoTouch (AFAIK), so you have to bind it yourself from plain Objective-C. A sample in Objective-C is here: how to convert a CVImageBufferRef to UIImage (a rough MonoTouch-side sketch of the same idea appears after the code below).

    public void InitCapture ()
    {
        try
        {
            // Setup the input
            NSError error = new NSError ();
            captureInput = new AVCaptureDeviceInput (AVCaptureDevice.DefaultDeviceWithMediaType (AVMediaType.Video), out error);

            // Setup the output
            captureOutput = new AVCaptureVideoDataOutput ();
            captureOutput.AlwaysDiscardsLateVideoFrames = true;
            captureOutput.SetSampleBufferDelegateAndQueue (avBufferDelegate, dispatchQueue);
            captureOutput.MinFrameDuration = new CMTime (1, 10);

            // Set the video output to store frame in BGRA (compatible across devices)
            captureOutput.VideoSettings = new AVVideoSettings (CVPixelFormatType.CV32BGRA);

            // Create a capture session
            captureSession = new AVCaptureSession ();
            captureSession.SessionPreset = AVCaptureSession.PresetMedium;
            captureSession.AddInput (captureInput);
            captureSession.AddOutput (captureOutput);

            // Setup the preview layer
            prevLayer = new AVCaptureVideoPreviewLayer (captureSession);
            prevLayer.Frame = liveView.Bounds;
            prevLayer.VideoGravity = "AVLayerVideoGravityResize"; // image may be slightly distorted, but red bar position will be accurate

            liveView.Layer.AddSublayer (prevLayer);

            StartLiveDecoding ();
        }
        catch (Exception ex)
        {
            Console.WriteLine (ex.ToString ());
        }
    }
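
Note that the snippet references avBufferDelegate and dispatchQueue without showing where they come from; a plausible wiring (field names taken from the snippet, the delegate class is a hypothetical subclass like the one in the question):

    // Assumed declarations for the fields used in InitCapture above
    AVCaptureVideoDataOutputSampleBufferDelegate avBufferDelegate;
    DispatchQueue dispatchQueue; // MonoTouch.CoreFoundation.DispatchQueue

    void CreateDelegateAndQueue ()
    {
        avBufferDelegate = new CaptureVideoDelegate ();    // hypothetical delegate subclass
        dispatchQueue = new DispatchQueue ("cameraQueue"); // queue name is arbitrary
    }

The sample-buffer callback follows: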

    public void DidOutputSampleBuffer (AVCaptureOutput captureOutput, MonoTouch.CoreMedia.CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
    {
        Console.WriteLine ("DidOutputSampleBuffer: enter");

        if (isScanning)
        {
            CVImageBuffer imageBuffer = sampleBuffer.GetImageBuffer ();

            Console.WriteLine ("DidOutputSampleBuffer: calling decode");

            // NSLog(@"got image w=%d h=%d bpr=%d", CVPixelBufferGetWidth(imageBuffer), CVPixelBufferGetHeight(imageBuffer), CVPixelBufferGetBytesPerRow(imageBuffer));
            // call the decoder
            DecodeImage (imageBuffer);
        }
        else
        {
            Console.WriteLine ("DidOutputSampleBuffer: not scanning");
        }

        Console.WriteLine ("DidOutputSampleBuffer: quit");
    }
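
As for the conversion mentioned above, roughly the same thing can be sketched directly against the MonoTouch CoreVideo/CoreGraphics/UIKit bindings. This is only a sketch, assuming a 32BGRA pixel buffer; the exact lock-flag enum may differ between MonoTouch versions:

    // Sketch: convert a 32BGRA CVImageBuffer into a UIImage via a CGBitmapContext.
    // Assumes MonoTouch-era bindings; Lock/Unlock flag names may vary by version.
    static UIImage ImageBufferToUIImage (CVImageBuffer imageBuffer)
    {
        var pixelBuffer = (CVPixelBuffer) imageBuffer;
        pixelBuffer.Lock (CVOptionFlags.None);
        try
        {
            using (var colorSpace = CGColorSpace.CreateDeviceRGB ())
            using (var context = new CGBitmapContext (
                       pixelBuffer.BaseAddress,
                       pixelBuffer.Width,
                       pixelBuffer.Height,
                       8,                         // bits per component
                       pixelBuffer.BytesPerRow,
                       colorSpace,
                       CGBitmapFlags.PremultipliedFirst | CGBitmapFlags.ByteOrder32Little))
            using (var cgImage = context.ToImage ())
            {
                return UIImage.FromImage (cgImage);
            }
        }
        finally
        {
            pixelBuffer.Unlock (CVOptionFlags.None);
        }
    }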

Answer 1 (score: 1)

All issues solved, and it finally works fine. The freezing was happening because, in my tests, I was not yet disposing of the sampleBuffer in the DidOutputSampleBuffer method. The final code for my view is here:

UPDATE 1: Changed the VideoSettings CVPixelFormat assignment, which was wrong and caused a wrong BytesPerPixel in the sampleBuffer.

    public partial class VirtualDeckViewController : UIViewController
    {
        public CaptureVideoDelegate captureVideoDelegate;

        public AVCaptureVideoPreviewLayer previewLayer;
        public AVCaptureSession captureSession;
        public AVCaptureDevice captureDevice;
        public AVCaptureDeviceInput captureDeviceInput;
        public AVCaptureVideoDataOutput captureVideoOutput;

        ...

        public override void ViewDidLoad ()
        {
            base.ViewDidLoad ();

            SetupVideoCaptureSession();
        }

        public void SetupVideoCaptureSession()
        {
            // Create notifier delegate class
            captureVideoDelegate = new CaptureVideoDelegate();

            // Create capture session
            captureSession = new AVCaptureSession();
            captureSession.BeginConfiguration();
            captureSession.SessionPreset = AVCaptureSession.Preset640x480;

            // Create capture device
            captureDevice = AVCaptureDevice.DefaultDeviceWithMediaType(AVMediaType.Video);

            // Create capture device input
            NSError error;
            captureDeviceInput = new AVCaptureDeviceInput(captureDevice, out error);
            captureSession.AddInput(captureDeviceInput);

            // Create capture device output
            captureVideoOutput = new AVCaptureVideoDataOutput();
            captureVideoOutput.AlwaysDiscardsLateVideoFrames = true;
            // UPDATE: wrong VideoSettings assignment
            //captureVideoOutput.VideoSettings.PixelFormat = CVPixelFormatType.CV32BGRA;
            // UPDATE: correct VideoSettings assignment
            captureVideoOutput.VideoSettings = new AVVideoSettings(CVPixelFormatType.CV32BGRA);
            captureVideoOutput.MinFrameDuration = new CMTime(1, 30);
            DispatchQueue dispatchQueue = new DispatchQueue("VideoCaptureQueue");
            captureVideoOutput.SetSampleBufferDelegateAndQueue(captureVideoDelegate, dispatchQueue);
            captureSession.AddOutput(captureVideoOutput);

            // Create preview layer
            previewLayer = AVCaptureVideoPreviewLayer.FromSession(captureSession);
            previewLayer.Orientation = AVCaptureVideoOrientation.LandscapeLeft;
            previewLayer.VideoGravity = "AVLayerVideoGravityResizeAspectFill";
            previewLayer.Frame = new RectangleF(0, 0, 1024, 768);
            this.View.Layer.AddSublayer(previewLayer);

            // Start capture session
            captureSession.CommitConfiguration();
            captureSession.StartRunning();
        }

        public class CaptureVideoDelegate : AVCaptureVideoDataOutputSampleBufferDelegate
        {
            public CaptureVideoDelegate() : base()
            {
            }

            public override void DidOutputSampleBuffer (AVCaptureOutput captureOutput, CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
            {
                // TODO: Implement buffer processing

                // Very important (buffer needs to be disposed or it will freeze)
                sampleBuffer.Dispose();
            }
        }
    }
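
As an illustration of the TODO above, a hypothetical version of the callback that just logs the frame size before disposing the buffer (the processing itself is a placeholder; only the Dispose call is essential):

    // Hypothetical example of filling in the TODO: inspect the frame,
    // then dispose the sample buffer so the session does not stall.
    public override void DidOutputSampleBuffer (AVCaptureOutput captureOutput, CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
    {
        using (var pixelBuffer = sampleBuffer.GetImageBuffer () as CVPixelBuffer)
        {
            // Placeholder processing: just log the frame dimensions
            Console.WriteLine ("Got frame: {0}x{1}", pixelBuffer.Width, pixelBuffer.Height);
        }

        // Very important: without this the capture session freezes
        sampleBuffer.Dispose ();
    }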

The Miguel de Icaza sample I finally found here answered the last piece of the puzzle: link

Thanks to Miguel and Pavel.