AVPlayer playback of a single audio channel (stereo -> mono)

Posted: 2013-03-02 12:29:49

Tags: ios cocoa-touch core-audio avplayer

In my iPad/iPhone app I am playing video with AVPlayer. The video file has a stereo audio track, but I need to play back only one channel of that track, in mono. The deployment target is iOS 6. How can I achieve this? Thank you very much for your help.

1 Answer:

Answer 0 (score: 1)

I have finally found the answer to this myself - at least for deployment on iOS 6. You can easily add an MTAudioProcessingTap to your existing AVPlayer item and copy the samples of the selected channel over the other channel inside your process callback function. Here is a good tutorial that explains the basics: http://chritto.wordpress.com/2013/01/07/processing-avplayers-audio-with-mtaudioprocessingtap/

Here is my code so far, mostly copied from the link above.

During AVPlayer setup I assign the callback functions for the audio processing:

MTAudioProcessingTapCallbacks callbacks;
callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
callbacks.clientInfo = (void *)self;   // under ARC this needs (__bridge void *)self
callbacks.init = init;
callbacks.prepare = prepare;
callbacks.process = process;
callbacks.unprepare = unprepare;
callbacks.finalize = finalize;

MTAudioProcessingTapRef tap;
// The create function makes a copy of our callbacks struct
OSStatus err = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                                          kMTAudioProcessingTapCreationFlag_PostEffects, &tap);
if (err || !tap) {
    NSLog(@"Unable to create the Audio Processing Tap");
    return;
}
assert(tap);

// Assign the tap to the input parameters (the audioTapProcessor property
// retains it, so the local reference could also be CFReleased afterwards)
audioInputParam.audioTapProcessor = tap;

// Create a new AVAudioMix and assign it to our AVPlayerItem
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = @[audioInputParam];
playerItem.audioMix = audioMix;
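
The snippet above uses an audioInputParam and a playerItem that are not shown being created. A minimal sketch of how they might be set up is below; the variable names, the videoURL placeholder, and picking the first audio track are my assumptions rather than part of the original answer:

// Hypothetical setup preceding the snippet above (assumes ARC and
// #import <AVFoundation/AVFoundation.h>); videoURL is a placeholder.
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];

NSArray *audioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];
AVAssetTrack *audioTrack = [audioTracks count] > 0 ? [audioTracks objectAtIndex:0] : nil;

// The tap is attached to the input parameters for this audio track.
AVMutableAudioMixInputParameters *audioInputParam =
    [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack];

AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];

In production code the audio tracks would normally be loaded asynchronously (loadValuesAsynchronouslyForKeys:completionHandler:) before building the mix, rather than queried synchronously as in this sketch.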

These are the audio processing functions (in fact only process has to do any actual work):

#pragma mark Audio Processing

void init(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut) {
    NSLog(@"Initialising the Audio Tap Processor");
    *tapStorageOut = clientInfo;
}

void finalize(MTAudioProcessingTapRef tap) {
    NSLog(@"Finalizing the Audio Tap Processor");
}

void prepare(MTAudioProcessingTapRef tap, CMItemCount maxFrames, const AudioStreamBasicDescription *processingFormat) {
    NSLog(@"Preparing the Audio Tap Processor");
}

void unprepare(MTAudioProcessingTapRef tap) {
    NSLog(@"Unpreparing the Audio Tap Processor");
}

void process(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
         MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut,
         CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut) {
    OSStatus err = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, NULL, numberFramesOut);
    if (err) NSLog(@"Error from GetSourceAudio: %ld", (long)err);

    SIVSViewController *self = (SIVSViewController *)MTAudioProcessingTapGetStorage(tap); // under ARC: (__bridge SIVSViewController *)

    if (self.selectedChannel) {

        // NOTE: when selectedChannel is 0 this guard skips the copy, so the
        // channel == 0 branch below never runs; use a separate "no selection"
        // sentinel if channel 0 must be selectable.
        int channel = self.selectedChannel;

        // Duplicate the selected channel by pointing the other buffer at the
        // same sample data (assumes a non-interleaved stereo buffer list).
        if (channel == 0) {
            bufferListInOut->mBuffers[1].mData = bufferListInOut->mBuffers[0].mData;
        } else {
            bufferListInOut->mBuffers[0].mData = bufferListInOut->mBuffers[1].mData;
        }
    }
}
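
Note that MTAudioProcessingTap lives in MediaToolbox, so the file needs #import <MediaToolbox/MediaToolbox.h> and the MediaToolbox framework linked. Also, the pointer reassignment above makes both output buffers alias the same sample data; an alternative (a sketch, not part of the original answer) is to copy the samples explicitly with memcpy. This assumes the tap delivers non-interleaved stereo, one AudioBuffer per channel, and the assumed convention 0 = left, 1 = right; the function name is hypothetical:

// Sketch of a process callback that copies the selected channel instead of
// aliasing the buffer pointers (memcpy comes from <string.h> via Foundation).
void processCopyChannel(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
                        MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut,
                        CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut) {
    OSStatus err = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                                      flagsOut, NULL, numberFramesOut);
    if (err) {
        NSLog(@"Error from GetSourceAudio: %ld", (long)err);
        return;
    }

    if (bufferListInOut->mNumberBuffers < 2) return; // mono or interleaved: nothing to do

    UInt32 source = 0; // assumed: the channel to keep
    UInt32 dest   = 1; // the channel to overwrite

    AudioBuffer *src = &bufferListInOut->mBuffers[source];
    AudioBuffer *dst = &bufferListInOut->mBuffers[dest];

    // Overwrite the other channel with the selected channel's samples.
    memcpy(dst->mData, src->mData, MIN(src->mDataByteSize, dst->mDataByteSize));
}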