Objective-C: passing parameters to an IBAction

Asked: 2014-04-04 07:13:34

Tags: ios objective-c

I'm working on real-time audio processing with the iPhone SDK. I use EZAudio to get data from the microphone, filter it with a high-pass filter, and then plot it, also with EZAudio's help.

I set up two buttons to plot the raw waveform and the filtered waveform, but I don't know how to pass the data from the microphone into an -(IBAction) method.

Here is my code:

Data from the microphone delegate:
-(void)    microphone:(EZMicrophone *)microphone
     hasAudioReceived:(float **)buffer
       withBufferSize:(UInt32)bufferSize
 withNumberOfChannels:(UInt32)numberOfChannels {
    dispatch_async(dispatch_get_main_queue(), ^{
        // Update the time-domain plot on the main thread
        [self.audioPlotTime updateBuffer:buffer[0]
                          withBufferSize:bufferSize];
    });
}

The high-pass filter I wrote:

-(void)processSampleForHighOrLowPassFilter:(float)bufferSize
                             withAudioData:(float*)inBuffer
                            withFilterType:(UInt32)filterType {

    float *outBuffer = (float *)malloc(sizeof(float) * bufferSize);

    for (int i = 0; i < bufferSize; i++) {
        // Direct form I biquad: a* weight the inputs, b* the fed-back outputs
        outBuffer[i] = (a0 * inBuffer[i])
                     + (a1 * tmpBufferInIndex[0])
                     + (a2 * tmpBufferInIndex[1])
                     - (b1 * tmpBufferOutIndex[0])
                     - (b2 * tmpBufferOutIndex[1]);

        tmpBufferInIndex[1]  = tmpBufferInIndex[0];
        tmpBufferInIndex[0]  = inBuffer[i];
        tmpBufferOutIndex[1] = tmpBufferOutIndex[0];
        tmpBufferOutIndex[0] = outBuffer[i];
    }

    [self.audioPlotHighPassFilter8kHz updateBuffer:outBuffer
                                    withBufferSize:bufferSize];
    free(outBuffer); // the plot copies the samples, so release the scratch buffer
}

I want to trigger the original and the processed waveform each with a button:

-(void)plotOriginalWaveForm:(id)sender {
    //[self.audioPlot updateBuffer:outBuffer
    //              withBufferSize:bufferSize];
}

-(void)plot8kHzWaveForm:(id)sender {
    //[self.audioPlotHighPassFilter8kHz updateBuffer:outBuffer
    //                                withBufferSize:bufferSize];
}

But I don't understand how to pass buffer and bufferSize into the IBAction, since that is where I would call the plot functions, and the plot functions need both parameters.

Thanks for any suggestions, or for another way to use the microphone data inside an IBAction!

1 Answer:

Answer 0 (score: 2):

Store the buffer and the buffer size as properties on the class: set them when the microphone delegate is called, and read them back from the properties when plotting the waveforms, instead of passing them as parameters.

Edit, to provide a possible example (the details depend on the class surrounding the original code):

In the .h file, inside the @interface block (declared as properties so the `self.` accessors below work):

@property (nonatomic, assign) float **myBuffer;
@property (nonatomic, assign) UInt32 myBufferSize;

In your methods:

-(void)    microphone:(EZMicrophone *)microphone
     hasAudioReceived:(float **)buffer
       withBufferSize:(UInt32)bufferSize
 withNumberOfChannels:(UInt32)numberOfChannels {

    // Caution: buffer is owned by the audio engine and may be reused
    // after this callback returns
    self.myBuffer = buffer;
    self.myBufferSize = bufferSize;

    dispatch_async(dispatch_get_main_queue(), ^{
        // Update time domain plot
        [self.audioPlotTime updateBuffer:buffer[0]
                          withBufferSize:bufferSize];
    });
}

-(void)plotOriginalWaveForm:(id)sender {
    [self.audioPlot updateBuffer:self.myBuffer[0]
                  withBufferSize:self.myBufferSize];
}

-(void)plot8kHzWaveForm:(id)sender {
    [self.audioPlotHighPassFilter8kHz updateBuffer:self.myBuffer[0]
                                    withBufferSize:self.myBufferSize];
}