iOS Xamarin Light Detection

Asked: 2017-04-11 19:42:19

Tags: c# ios xamarin.ios

I am building an app in Xamarin (iOS) with C#, trying to detect a phone's flashlight being turned on and off from roughly 10 feet away. I have looked through Xamarin's developer site, but there does not seem to be anything on how to check whether the camera's live video feed sees a light turning on and off.

I am very new to image processing, and my current thinking is as follows:

  • Start recording video, then go back and try to access the frame buffer, checking each frame to see whether some region of the frame/image is brighter ("higher light temperature") than the rest of it. If one concentrated region is brighter, the flashlight is on; otherwise it is not.

  • I am not sure whether this can be done without actually recording video. It would be ideal if I could check the light level from the live video feed alone.

Any guidance on which classes/APIs to use would be a big help. Thanks!
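On the first bullet: what the camera actually gives you per pixel is color channels, not a temperature, so the practical per-pixel quantity to compare is brightness (luma), a weighted sum of the channels. A minimal plain-C# sketch of that conversion (the weights are the standard Rec. 601 convention; the class and method names here are illustrative, not from any Xamarin API):

```csharp
using System;

// Converts one camera pixel to a single brightness value.
// iOS capture buffers are commonly 32BGRA, so the channels
// arrive in B, G, R order.
static class Luma
{
    // b, g, r are 0-255 channel values; returns luma in 0-255.
    // Rec. 601 weights: green contributes most, blue least.
    public static double FromBgr(byte b, byte g, byte r)
        => 0.114 * b + 0.587 * g + 0.299 * r;
}
```

Comparing these luma values between a candidate region and the rest of the frame is then an ordinary numeric comparison, independent of any recording.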

Update: here is our code so far. We have the live camera stream set up, but we honestly just do not know which classes to use, or what direction to take, to start accessing the individual frames of the video and their brightness values. Please help.

using Foundation;
using System;
using UIKit;

using AVFoundation;
using CoreVideo;
using CoreGraphics;
using System.Threading.Tasks;
using CoreMedia;
using ImageIO;

namespace VisibleLightData
{
    public partial class ReceiveViewController : UIViewController
    {
        AVCaptureSession captureSession;
        AVCaptureDeviceInput captureDeviceInput;
        AVCaptureVideoPreviewLayer videoPreviewLayer;

        public ReceiveViewController(IntPtr handle) : base(handle)
        {
        }

        public override void ViewDidLoad()
        {
            base.ViewDidLoad();
            SetupLiveCameraStream();
        }

        partial void ShowMain(UIButton sender)
        {
            //Get a reference to the current storyboard
            UIStoryboard storyboard = this.Storyboard;

            //Create an instance of ViewController
            ViewController viewController = (ViewController)storyboard.InstantiateViewController("ViewController");

            //Display the ViewController
            PresentViewController(viewController, true, null);
        }

        public void SetupLiveCameraStream()
        {
            captureSession = new AVCaptureSession();

            videoPreviewLayer = new AVCaptureVideoPreviewLayer(captureSession) { Frame = this.View.Frame };

            recLiveCameraStream.Layer.AddSublayer(videoPreviewLayer);

            var captureDevice = AVCaptureDevice.GetDefaultDevice(AVMediaTypes.Video);
            ConfigureCameraForDevice(captureDevice);
            captureDeviceInput = AVCaptureDeviceInput.FromDevice(captureDevice);

            captureSession.AddInput(captureDeviceInput);

            captureSession.StartRunning();
        }

        void ConfigureCameraForDevice(AVCaptureDevice device)
        {
            //Lock the device once, apply every supported mode, then unlock.
            //Independent checks (not else-if) so focus, exposure and white
            //balance can all be configured on the same device.
            if (device.LockForConfiguration(out NSError error))
            {
                if (device.IsFocusModeSupported(AVCaptureFocusMode.ContinuousAutoFocus))
                    device.FocusMode = AVCaptureFocusMode.ContinuousAutoFocus;

                if (device.IsExposureModeSupported(AVCaptureExposureMode.ContinuousAutoExposure))
                    device.ExposureMode = AVCaptureExposureMode.ContinuousAutoExposure;

                if (device.IsWhiteBalanceModeSupported(AVCaptureWhiteBalanceMode.ContinuousAutoWhiteBalance))
                    device.WhiteBalanceMode = AVCaptureWhiteBalanceMode.ContinuousAutoWhiteBalance;

                device.UnlockForConfiguration();
            }
        }
    }
}
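Regarding which class to use for the update: in AVFoundation, frames come off a live session through an AVCaptureVideoDataOutput added to the captureSession. You give it a delegate (a class implementing IAVCaptureVideoDataOutputSampleBufferDelegate via SetSampleBufferDelegateQueue), and in its DidOutputSampleBuffer callback you take the CVPixelBuffer from the CMSampleBuffer (GetImageBuffer), lock its base address, and read the raw BGRA bytes. The per-frame analysis itself is plain C#; the sketch below assumes a tightly packed 32BGRA buffer, and the class name, grid size, and brightness margin are illustrative choices, not a definitive implementation:

```csharp
using System;

// Sketch of per-frame analysis: decide whether a frame contains a small,
// concentrated bright region (the flashlight) by comparing the brightest
// grid cell's mean brightness against the whole frame's mean.
static class FlashlightDetector
{
    // bgra: tightly packed 32-bit BGRA pixels, row-major, width*height*4 bytes.
    // Returns true when one grid cell is much brighter than the frame overall.
    public static bool HasBrightSpot(byte[] bgra, int width, int height,
                                     int grid = 8, double margin = 60.0)
    {
        var cellSum = new double[grid * grid];
        var cellCount = new int[grid * grid];
        double totalSum = 0;

        for (int y = 0; y < height; y++)
        {
            for (int x = 0; x < width; x++)
            {
                int i = (y * width + x) * 4;
                // Rec. 601 luma from the B, G, R channels (alpha ignored).
                double luma = 0.114 * bgra[i] + 0.587 * bgra[i + 1] + 0.299 * bgra[i + 2];
                int cell = (y * grid / height) * grid + (x * grid / width);
                cellSum[cell] += luma;
                cellCount[cell]++;
                totalSum += luma;
            }
        }

        double globalMean = totalSum / (width * height);
        double maxCellMean = 0;
        for (int c = 0; c < cellSum.Length; c++)
            if (cellCount[c] > 0)
                maxCellMean = Math.Max(maxCellMean, cellSum[c] / cellCount[c]);

        // "Flashlight on" when the brightest cell stands well above the rest.
        return maxCellMean - globalMean > margin;
    }
}
```

Called once per DidOutputSampleBuffer, a transition of this boolean from false to true (and back) over successive frames corresponds to the flashlight being switched on and off. Remember to dispose each CMSampleBuffer in the callback, or the capture pipeline will stall after a few frames.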

0 Answers:

There are no answers yet.