Is there a way to fake a DirectShow source filter in code?

Asked: 2011-08-10 20:37:23

Tags: c# directshow directshow.net

I have an IP camera that delivers a character buffer containing an image over the network. I can't access the stream until my program has set up the connection to it. I've been trying to dissect the Windows source-filter sample code, but I'm not making fast progress, so I'm wondering whether I can just take a buffer like that and turn it into something that can connect a pin to an AVI Splitter or similar in DirectShow/DirectShow.NET:

(video buffer from IP Cam) -> (???) -> (AVI Splitter) -> (Profit)

Update

My program captures video in one namespace, and the code I took from the GSSF lives in its own namespace. I pass a pointer to an image from the camera namespace into the GSSF namespace. This only happens once, yet the graph streams from that single image while the camera keeps streaming from the network. Is there a way to continuously hand the buffer from the camera to the GSSF, or should I somehow combine the namespaces? I tried sending the camera's main pointer to the GSSF, but it crashes because the pointer is accessed while it is being written to. Maybe I should grab an image, pass the pointer, and wait before grabbing a new one?

Update 2

I've shrunk my code down, and looking at it now I'm not convinced I'm even handling the namespaces correctly:

namespace Cam_Controller
{
    static byte[] mainbyte = new byte[1280*720*2];
    static IntPtr main_ptr = new IntPtr();

    //(this function is threaded)
    static void Trial(NPvBuffer mBuffer, NPvDisplayWnd mDisplayWnd, VideoCompression compressor)
    {
        Functions function = new Functions();
        Defines define = new Defines();
        NPvResult operationalResult = new NPvResult();
        VideoCompression mcompressor = new VideoCompression();

        int framecount = 0;
        while (!Stopping && AcquiringImages)
        {
            Mutex lock_video = new Mutex();
            NPvResult result = mDevice.RetrieveNextBuffer(mBuffer, operationalResult);

            if (result.isOK())
            {
                framecount++;
                wer = (int)mDisplayWnd.Display(mBuffer, wer);

                main_ptr = (IntPtr)mBuffer.GetMarshalledBuffer();
                Marshal.Copy(main_ptr, mainbyte, 0, 720 * 2560);
            }
        }
    }
    }
    private void button7_Click(object sender, EventArgs e)
    {
        IntPtr dd = (IntPtr)mBuffer.GetMarshalledBuffer();
        Marshal.Copy(dd, main_byte1, 0, 720 * 2560);
        play = new VisiCam_Controller.DxPlay.DxPlay("", panel9, main_byte1);
        play.Start();


    }


    namespace DxPlay
    {
        public class DxPlay
        {
            public DxPlay(string sPath, Control hWin, byte[] color)
            {
                try
                {
                    // pick one of our image providers
                    //m_ImageHandler = new ImageFromFiles(sPath, 24);
                    m_ImageHandler = new ImageFromPixels(20, color);
                    //m_ImageHandler = new ImageFromMpg(@"c:\c1.mpg");
                    //m_ImageHandler = new ImageFromMpg(sPath);
                    //m_ImageHandler = new ImageFromMP3(@"c:\vss\media\track3.mp3");

                // Set up the graph
                    SetupGraph(hWin);
                }
                catch
                {
                    Dispose();
                    throw;
                }
            }
        }
        abstract internal class ImageHandler { /* ... */ }
        internal class ImageFromPixels : ImageHandler
        {
            private int[] mainint = new int[720 * 1280];
            unsafe public ImageFromPixels(long FPS, byte[] x)
            {
                long fff = 720 * 1280 * 3;
                mainptr = new IntPtr(fff);
                for (int p = 0; p < 720 * 640; p++)
                {
                    U  = x[p * 4 + 0];
                    Y  = x[p * 4 + 1];
                    V  = x[p * 4 + 2];
                    Y2 = x[p * 4 + 3];

                    int one = V << 16 | Y << 8 | U;
                    int two = V << 16 | Y2 << 8 | U;
                    mainint[p * 2 + 0] = one;
                    mainint[p * 2 + 1] = two;
                }

                m_FPS = UNIT / FPS;
                m_b = 211;
                m_g = 197;
            }
        }
    }
}

There's also a GetImage method, but it's much the same: it copies the buffer into a pointer. I grab the image's buffer and send it to the DxPlay class. It handles it and puts it onto the DirectShow graph without a problem; but it never updates, because it's only a single buffer. If I instead send DxPlay an IntPtr holding the address of the image buffer, the code crashes on a memory access, because I assume the ImageFromPixels code (now changed so that

(x[p * 4 + #])

becomes

(IntPtr)((x passed as an IntPtr).ToInt64() + p * 4 + #)

) is accessing the pointer's memory while the Cam_Controller class is editing it. I made and passed copies of the IntPtrs, and new IntPtrs, but they fail during the conversion.
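One way to avoid the crash described above is to never share the live pointer between threads: the capture thread copies each frame into a managed buffer under a lock, and the graph side copies it back out under the same lock. This is only a minimal sketch, assuming the 1280*720*2-byte frame size and the `GetMarshalledBuffer()` call from the question's code:

```csharp
using System;
using System.Runtime.InteropServices;

static class FrameHandoff
{
    static readonly object frameLock = new object();
    static readonly byte[] latestFrame = new byte[1280 * 720 * 2];

    // Capture thread: copy the camera's unmanaged buffer into latestFrame.
    public static void OnCameraFrame(IntPtr camPtr)
    {
        lock (frameLock)
        {
            Marshal.Copy(camPtr, latestFrame, 0, latestFrame.Length);
        }
    }

    // Graph side (e.g. inside GetImage): copy the latest frame out again.
    public static void FillSample(IntPtr samplePtr)
    {
        lock (frameLock)
        {
            Marshal.Copy(latestFrame, 0, samplePtr, latestFrame.Length);
        }
    }
}
```

Because each side only ever touches `latestFrame` inside the lock, the graph never reads memory the camera is writing to; the cost is one extra copy per frame.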

1 Answer:

Answer 0 (score: 5)

If you want to do this in .NET, you need the following steps:

  1. Use the DirectShow.NET Generic Sample Source Filter (GSSF.AX) from the Misc/GSSF directory inside the sample package. A source filter is always a COM module, so you also need to register it with "RegSvr32 GSSF.ax".

  2. Implement a bitmap provider in .NET.

  3. Set up the graph, and connect the pin of the GSSF to the implementation of the bitmap provider.

  4. Pray.

I use the following within a project, and made it reusable for future use.

    The code (not the best, and not finished, but a working start) (this requires an IVideoSource, which follows below):

    public class VideoSourceToVideo : IDisposable
    {
        object locker = new object();
    
        public event EventHandler<EventArgs> Starting;
        public event EventHandler<EventArgs> Stopping;
        public event EventHandler<EventArgs> Completed;
    
        /// <summary> graph builder interface. </summary>
        private DirectShowLib.ICaptureGraphBuilder2 captureGraphBuilder = null;
        DirectShowLib.IMediaControl mediaCtrl = null;
        IMediaEvent mediaEvent = null;
        bool stopMediaEventLoop = false;
        Thread mediaEventThread;
    
        /// <summary> Dimensions of the image, calculated once in constructor. </summary>
        private readonly VideoInfoHeader videoInfoHeader;
    
        IVideoSource source;
    
        public VideoSourceToVideo(IVideoSource source, string destFilename, string encoderName)
        {
            try
            {
                this.source = source;
    
                // Set up the capture graph
                SetupGraph(destFilename, encoderName);
            }
            catch
            {
                Dispose();
                throw;
            }
        }
    
    
        /// <summary> release everything. </summary>
        public void Dispose()
        {
            StopMediaEventLoop();
            CloseInterfaces();
        }
    
        /// <summary> build the capture graph for grabber. </summary>
        private void SetupGraph(string destFilename, string encoderName)
        {
            int hr;
    
            // Get the graphbuilder object
            captureGraphBuilder = new DirectShowLib.CaptureGraphBuilder2() as DirectShowLib.ICaptureGraphBuilder2;
    
            IFilterGraph2 filterGraph = new DirectShowLib.FilterGraph() as DirectShowLib.IFilterGraph2;
    
            mediaCtrl = filterGraph as DirectShowLib.IMediaControl;
            IMediaFilter mediaFilt = filterGraph as IMediaFilter;
            mediaEvent = filterGraph as IMediaEvent;
    
    
    
            captureGraphBuilder.SetFiltergraph(filterGraph);
    
            IBaseFilter aviMux;
            IFileSinkFilter fileSink = null;
            hr = captureGraphBuilder.SetOutputFileName(MediaSubType.Avi, destFilename, out aviMux, out fileSink);
            DsError.ThrowExceptionForHR(hr);
    
            DirectShowLib.IBaseFilter compressor = DirectShowUtils.GetVideoCompressor(encoderName);
    
            if (compressor == null)
            {
                throw new InvalidCodecException(encoderName);
            }
    
    
            hr = filterGraph.AddFilter(compressor, "compressor");
            DsError.ThrowExceptionForHR(hr);
    
    
            // Our data source
            IBaseFilter source = (IBaseFilter)new GenericSampleSourceFilter();
    
            // Get the pin from the filter so we can configure it
            IPin ipin = DsFindPin.ByDirection(source, PinDirection.Output, 0);
    
            try
            {
                // Configure the pin using the provided BitmapInfo
                ConfigurePusher((IGenericSampleConfig)ipin);
            }
            finally
            {
                Marshal.ReleaseComObject(ipin);
            }
    
            // Add the source filter to the graph
            hr = filterGraph.AddFilter(source, "GenericSampleSourceFilter");
            DsError.ThrowExceptionForHR(hr);
    
            hr = captureGraphBuilder.RenderStream(null, null, source, compressor, aviMux);
            DsError.ThrowExceptionForHR(hr);
    
            IMediaPosition mediaPos = filterGraph as IMediaPosition;
    
            hr = mediaCtrl.Run();
            DsError.ThrowExceptionForHR(hr);
        }
    
        private void ConfigurePusher(IGenericSampleConfig ips)
        {
            int hr;
    
            source.SetMediaType(ips);
    
            // Specify the callback routine to call with each sample
            hr = ips.SetBitmapCB(source);
            DsError.ThrowExceptionForHR(hr);
        }
    
    
        private void StartMediaEventLoop()
        {
            mediaEventThread = new Thread(MediaEventLoop)
            {
                Name = "Offscreen Vid Player Medialoop",
                IsBackground = false
            };
    
            mediaEventThread.Start();
        }
    
        private void StopMediaEventLoop()
        {
            stopMediaEventLoop = true;
    
            if (mediaEventThread != null)
            {
                mediaEventThread.Join();
            }
        }
    
        public void MediaEventLoop()
        {
            MediaEventLoop(x => PercentageCompleted = x);
        }
    
        public double PercentageCompleted
        {
            get;
            private set;
        }
    
        // FIXME this needs some work, to be completely in-tune with needs.
        public void MediaEventLoop(Action<double> UpdateProgress)
        {
            mediaEvent.CancelDefaultHandling(EventCode.StateChange);
            //mediaEvent.CancelDefaultHandling(EventCode.Starvation);
    
            while (stopMediaEventLoop == false)
            {
                try
                {
                    EventCode ev;
    
                    IntPtr p1, p2;
                    if (mediaEvent.GetEvent(out ev, out p1, out p2, 0) == 0)
                    {
                        switch (ev)
                        {
                            case EventCode.Complete:
                                Stopping.Fire(this, null);
                                if (UpdateProgress != null)
                                {
                                    UpdateProgress(source.PercentageCompleted);
                                }
                                return;
    
    
                            case EventCode.StateChange:
                                FilterState state = (FilterState)p1.ToInt32();
    
                                if (state == FilterState.Stopped || state == FilterState.Paused)
                                {
                                    Stopping.Fire(this, null);
                                }
                                else if (state == FilterState.Running)
                                {
                                    Starting.Fire(this, null);
                                }
    
                                break;
    
                            // FIXME add abort and stuff, and propagate this.
                        }
    
                        //                        Trace.WriteLine(ev.ToString() + " " + p1.ToInt32());
    
                        mediaEvent.FreeEventParams(ev, p1, p2);
                    }
                    else
                    {
                        if (UpdateProgress != null)
                        {
                            UpdateProgress(source.PercentageCompleted);
                        }
                        // FiXME use AutoResetEvent
                        Thread.Sleep(100);
                    }
                }
                catch (Exception e)
                {
                    Trace.WriteLine("MediaEventLoop: " + e);
                }
            }
        }
    
        /// <summary> Shut down capture </summary>
        private void CloseInterfaces()
        {
            int hr;
    
            try
            {
                if (mediaCtrl != null)
                {
                    // Stop the graph
                    hr = mediaCtrl.Stop();
                    mediaCtrl = null;
                }
            }
            catch (Exception ex)
            {
                Debug.WriteLine(ex);
            }
    
            if (captureGraphBuilder != null)
            {
                Marshal.ReleaseComObject(captureGraphBuilder);
                captureGraphBuilder = null;
            }
    
            GC.Collect();
        }
    
        public void Start()
        {
            StartMediaEventLoop();
        }
    }
    

    IVideoSource:

    public interface IVideoSource : IGenericSampleCB
    {
        double PercentageCompleted { get; }
        int GetImage(int iFrameNumber, IntPtr ip, int iSize, out int iRead);
        void SetMediaType(global::IPerform.Video.Conversion.Interops.IGenericSampleConfig psc);
        int SetTimeStamps(global::DirectShowLib.IMediaSample pSample, int iFrameNumber);
    }
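For the live-camera case in the question, an IVideoSource doesn't have to read from a file at all; it can serve whatever the capture thread last wrote. The following is a hypothetical sketch built on the ImageVideoSource base class shown below; `LiveVideoSource`, `PushFrame`, and the lock/buffer names are my own, and it assumes the same DirectShowLib types as the rest of this answer:

```csharp
// Hypothetical: a live source that serves the most recent camera frame forever.
internal class LiveVideoSource : ImageVideoSource
{
    private readonly object frameLock = new object();
    private readonly byte[] latestFrame;
    private readonly BitmapInfoHeader bmi;

    public LiveVideoSource(int width, int height, int bitCount)
    {
        bmi = new BitmapInfoHeader
        {
            Size = Marshal.SizeOf(typeof(BitmapInfoHeader)),
            Width = width,
            Height = height,
            Planes = 1,
            BitCount = (short)bitCount,
            ImageSize = width * height * (bitCount / 8)
        };
        latestFrame = new byte[bmi.ImageSize];
    }

    public override double PercentageCompleted { get; protected set; }

    // Called from the capture thread with each new camera frame.
    public void PushFrame(byte[] frame)
    {
        lock (frameLock)
        {
            Buffer.BlockCopy(frame, 0, latestFrame, 0, latestFrame.Length);
        }
    }

    public override void SetMediaType(IGenericSampleConfig psc)
    {
        int hr = psc.SetMediaTypeFromBitmap(bmi, 0);
        DsError.ThrowExceptionForHR(hr);
    }

    public override int GetImage(int iFrameNumber, IntPtr ip, int iSize, out int iRead)
    {
        // Copy the latest frame under the lock; never read the camera's pointer directly.
        lock (frameLock)
        {
            Marshal.Copy(latestFrame, 0, ip, latestFrame.Length);
        }
        iRead = latestFrame.Length;
        return 0; // a live source never signals end-of-stream
    }
}
```

The key difference from the file-based RawVideoSource further below is that GetImage ignores `iFrameNumber` and never returns 1, so the graph keeps pulling samples for as long as the camera pushes frames.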
    

    ImageVideoSource (mostly from the DirectShow.NET samples):

    // A generic class to support easily changing between my different sources of data.
    
    // Note: You DON'T have to use this class, or anything like it.  The key is the SampleCallback
    // routine.  How/where you get your bitmaps is ENTIRELY up to you.  Having SampleCallback call
    // members of this class was just the approach I used to isolate the data handling.
    public abstract class ImageVideoSource : IDisposable, IVideoSource
    {
        #region Definitions
    
        /// <summary>
        /// 100 ns - used by a number of DS methods
        /// </summary>
        private const long UNIT = 10000000;
    
        #endregion
    
        /// <summary>
        /// Number of callbacks that returned a positive result
        /// </summary>
        private int m_iFrameNumber = 0;
    
        virtual public void Dispose()
        {
        }
    
        public abstract double PercentageCompleted { get; protected set; }
    
        abstract public void SetMediaType(IGenericSampleConfig psc);
        abstract public int GetImage(int iFrameNumber, IntPtr ip, int iSize, out int iRead);
        virtual public int SetTimeStamps(IMediaSample pSample, int iFrameNumber)
        {
            return 0;
        }
    
        /// <summary>
        /// Called by the GenericSampleSourceFilter.  This routine populates the MediaSample.
        /// </summary>
        /// <param name="pSample">Pointer to a sample</param>
        /// <returns>0 = success, 1 = end of stream, negative values for errors</returns>
        virtual public int SampleCallback(IMediaSample pSample)
        {
            int hr;
            IntPtr pData;
    
            try
            {
                // Get the buffer into which we will copy the data
                hr = pSample.GetPointer(out pData);
                if (hr >= 0)
                {
                    // Set TRUE on every sample for uncompressed frames
                    hr = pSample.SetSyncPoint(true);
                    if (hr >= 0)
                    {
                        // Find out the amount of space in the buffer
                        int cbData = pSample.GetSize();
    
                        hr = SetTimeStamps(pSample, m_iFrameNumber);
                        if (hr >= 0)
                        {
                            int iRead;
    
                            // Copy the data into the sample
                            hr = GetImage(m_iFrameNumber, pData, cbData, out iRead);
                            if (hr == 0) // 1 == End of stream
                            {
                                pSample.SetActualDataLength(iRead);
    
                                // increment the frame number for next time
                                m_iFrameNumber++;
                            }
                        }
                    }
                }
            }
            finally
            {
            // Release our pointer to the media sample.  THIS IS ESSENTIAL!  If
            // you don't do this, the graph will stop after about 2 samples.
                Marshal.ReleaseComObject(pSample);
            }
    
            return hr;
        }
    }
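The base class above leaves SetTimeStamps returning 0; for a constant-rate source you would typically stamp each sample yourself so the muxer spaces frames correctly. A minimal sketch of such an override, where the 20 fps figure is an assumption and UNIT is the 100 ns constant already defined in the class:

```csharp
// Hypothetical constant-frame-rate timestamps, using the DirectShowLib
// IMediaSample.SetTime call seen elsewhere in this answer.
public override int SetTimeStamps(IMediaSample pSample, int iFrameNumber)
{
    const long UNIT = 10000000;      // 100 ns units per second
    long perFrame = UNIT / 20;       // assumed 20 fps

    DsLong rtStart = new DsLong(iFrameNumber * perFrame);
    DsLong rtStop  = new DsLong((iFrameNumber + 1) * perFrame);

    return pSample.SetTime(rtStart, rtStop);
}
```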
    

    RawVideoSource (an example of a concrete managed source generator for the DirectShow pipeline):

    internal class RawVideoSource : ImageVideoSource
    {
        private byte[] buffer;
        private byte[] demosaicBuffer;
        private RawVideoReader reader;
    
        public override double PercentageCompleted
        {
            get;
            protected set;
        }
    
        public RawVideoSource(string sourceFile)
        {
            reader = new RawVideoReader(sourceFile);
        }
    
        override public void SetMediaType(IGenericSampleConfig psc)
        {
            BitmapInfoHeader bmi = new BitmapInfoHeader();
    
            bmi.Size = Marshal.SizeOf(typeof(BitmapInfoHeader));
            bmi.Width = reader.Header.VideoSize.Width;
            bmi.Height = reader.Header.VideoSize.Height;
            bmi.Planes = 1;
            bmi.BitCount = 24;
            bmi.Compression = 0;
            bmi.ImageSize = (bmi.BitCount / 8) * bmi.Width * bmi.Height;
            bmi.XPelsPerMeter = 0;
            bmi.YPelsPerMeter = 0;
            bmi.ClrUsed = 0;
            bmi.ClrImportant = 0;
    
            int hr = psc.SetMediaTypeFromBitmap(bmi, 0);
    
            buffer = new byte[reader.Header.FrameSize];
            demosaicBuffer = new byte[reader.Header.FrameSize * 3];
    
            DsError.ThrowExceptionForHR(hr);
        }
    
        long startFrameTime;
        long endFrameTime;
        unsafe override public int GetImage(int iFrameNumber, IntPtr ip, int iSize, out int iRead)
        {
            int hr = 0;
    
            if (iFrameNumber < reader.Header.NumberOfFrames)
            {
                reader.ReadFrame(buffer, iFrameNumber, out startFrameTime, out endFrameTime);
    
                Demosaic.DemosaicGBGR24Bilinear(buffer, demosaicBuffer, reader.Header.VideoSize);
    
                Marshal.Copy(demosaicBuffer, 0, ip, reader.Header.FrameSize * 3);
    
                PercentageCompleted = ((double)iFrameNumber / reader.Header.NumberOfFrames) * 100.0;
            }
            else
            {
                PercentageCompleted = 100;
    
                hr = 1; // End of stream
            }
    
            iRead = iSize;
    
            return hr;
        }
    
        override public int SetTimeStamps(IMediaSample pSample, int iFrameNumber)
        {
            reader.ReadTimeStamps(iFrameNumber, out startFrameTime, out endFrameTime);
    
            DsLong rtStart = new DsLong(startFrameTime);
            DsLong rtStop = new DsLong(endFrameTime);
    
            int hr = pSample.SetTime(rtStart, rtStop);
    
            return hr;
        }
    }
    

    Interop with the GSSF.AX COM object:

    namespace IPerform.Video.Conversion.Interops
    {
        [ComImport, Guid("6F7BCF72-D0C2-4449-BE0E-B12F580D056D")]
        public class GenericSampleSourceFilter
        {
        }
    
        [InterfaceType(ComInterfaceType.InterfaceIsIUnknown),
        Guid("33B9EE57-1067-45fa-B12D-C37517F09FC0")]
        public interface IGenericSampleCB
        {
            [PreserveSig]
            int SampleCallback(IMediaSample pSample);
        }
    
        [Guid("CE50FFF9-1BA8-4788-8131-BDE7D4FFC27F"),
        InterfaceType(ComInterfaceType.InterfaceIsIUnknown)]
        public interface IGenericSampleConfig
        {
            [PreserveSig]
            int SetMediaTypeFromBitmap(BitmapInfoHeader bmi, long lFPS);
    
            [PreserveSig]
            int SetMediaType([MarshalAs(UnmanagedType.LPStruct)] AMMediaType amt);
    
            [PreserveSig]
            int SetMediaTypeEx([MarshalAs(UnmanagedType.LPStruct)] AMMediaType amt, int lBufferSize);
    
            [PreserveSig]
            int SetBitmapCB(IGenericSampleCB pfn);
        }
    }
    
    Good luck getting it up and running. Or comment with further questions so we can iron out any other issues.
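Putting the pieces together, usage looks roughly like this; the source file path and encoder name are illustrative only, and the exact encoder string depends on what GetVideoCompressor can find on your machine:

```csharp
// Assumed wiring: an IVideoSource pushed through the GSSF into an AVI file.
IVideoSource source = new RawVideoSource(@"C:\capture.raw");

using (var converter = new VideoSourceToVideo(source, @"C:\out.avi", "ffdshow video encoder"))
{
    converter.Start();
    // ... wait for the Stopping/Completed events before letting Dispose run ...
}
```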