Audio Queue does not work as expected on iOS 10

Date: 2017-06-13 09:59:30

Tags: ios objective-c audioqueueservices

UPDATE

I have solved the recording problem on iOS 10: after adding the audio session configuration before starting to record, recording works normally. Playback, however, is still not solved.

Here is the solution:

NSError *error = nil;
// the category parameter depends on what you need
BOOL ret = [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
if (!ret) {
    NSLog(@"Audio session category setup failed: %@", error);
    return;
}
// don't forget to call setActive:NO when recording finishes
ret = [[AVAudioSession sharedInstance] setActive:YES error:&error];
if (!ret) {
    NSLog(@"Audio session activation failed: %@", error);
    return;
}
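
As the comment notes, the session should be deactivated once recording finishes. A minimal sketch of that teardown (the method name is only an illustration, not part of the original code):

- (void)deactivateAudioSession {
    NSError *error = nil;
    // NotifyOthersOnDeactivation lets other apps' audio resume once we give up the session
    BOOL ret = [[AVAudioSession sharedInstance] setActive:NO
                                             withOptions:AVAudioSessionSetActiveOptionNotifyOthersOnDeactivation
                                                   error:&error];
    if (!ret) {
        NSLog(@"Audio session deactivation failed: %@", error);
    }
}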

ORIGINAL

I am using Audio Queue Services on iOS to record audio. I followed Apple's official guide to implement both the recording part and the playback part. It was tested successfully in the simulator on iOS 9.3, but it fails on a real device (an iPad) running iOS 10.3.1.

For the recording part, the callback function calls AudioFileWritePackets to save the audio into a file (see the code below). On iOS 9, ioNumPackets always has a non-zero value, but on iOS 10 it is always 0 during the first recording and only becomes normal from the second recording on. In other words, recording only works from the second attempt onward.

Here is some of the recording code:

The callback function:

static void AudioInputCallback(void * inUserData, AudioQueueRef inAQ, AudioQueueBufferRef inBuffer, const AudioTimeStamp * inStartTime, UInt32 inNumPackets, const AudioStreamPacketDescription * inPacketDescs) {
    NSLog(@"Input callback called");
    RecordState * aqData = (RecordState *)inUserData;
    if (aqData->isRecording == 0) return;
    // for CBR formats the queue may pass 0, so derive the packet count from the byte size
    if (inNumPackets == 0 && aqData->dataFormat.mBytesPerPacket != 0)
        inNumPackets = inBuffer->mAudioDataByteSize / aqData->dataFormat.mBytesPerPacket;
    NSLog(@"inNumPackets = %u", (unsigned int)inNumPackets);
    // handle the data
    if (outputToMobile){
        OSStatus res = AudioFileWritePackets(aqData->audioFile, false, inBuffer->mAudioDataByteSize, inPacketDescs, aqData->currentPacket, &inNumPackets, inBuffer->mAudioData);
        if (res == noErr)
            aqData->currentPacket += inNumPackets;
    }else{
    }
    // after handling, re-enqueue the buffer into the queue
    AudioQueueEnqueueBuffer(aqData->queue, inBuffer, 0, NULL);
}
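
To narrow down the zero-packet behavior on iOS 10, it may help to log the OSStatus values instead of silently ignoring them. This is just a diagnostic variant of the two calls already made inside the callback above, using the same variables:

// diagnostic variant of the write/enqueue calls above, logging their status codes
OSStatus writeStatus = AudioFileWritePackets(aqData->audioFile, false, inBuffer->mAudioDataByteSize, inPacketDescs, aqData->currentPacket, &inNumPackets, inBuffer->mAudioData);
if (writeStatus != noErr) {
    NSLog(@"AudioFileWritePackets failed with status %d", (int)writeStatus);
}
OSStatus enqueueStatus = AudioQueueEnqueueBuffer(aqData->queue, inBuffer, 0, NULL);
if (enqueueStatus != noErr) {
    NSLog(@"AudioQueueEnqueueBuffer failed with status %d", (int)enqueueStatus);
}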

The start-recording function:

-(void)startRecording{
    [self setupAudioFormat:&recordState.dataFormat];
    recordState.currentPacket = 0;
    OSStatus status;
    status = AudioQueueNewInput(&recordState.dataFormat, AudioInputCallback, &recordState, CFRunLoopGetCurrent(), kCFRunLoopCommonModes, 0, &recordState.queue);
    if (status == 0) {
        UInt32 dataFormatSize = sizeof(recordState.dataFormat);
        AudioQueueGetProperty(recordState.queue, kAudioQueueProperty_StreamDescription, &recordState.dataFormat, &dataFormatSize);
        if (outputToMobile) {
            [self createFile];
            SetMagicCookieForFile(recordState.queue, recordState.audioFile);
        }
        DeriveBufferSize(recordState.queue, &recordState.dataFormat, 0.5, &recordState.bufferByteSize);
        for (int i = 0; i < NUM_BUFFERS; i++) {
            AudioQueueAllocateBuffer(recordState.queue, recordState.bufferByteSize, &recordState.buffers[i]);
            AudioQueueEnqueueBuffer(recordState.queue, recordState.buffers[i], 0, NULL);
        }
        recordState.isRecording = true;
        AudioQueueStart(recordState.queue, NULL);
    }
}
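
Tying the UPDATE back to this method: the session configuration that made recording work is performed before any of the queue setup above. A sketch of the placement (everything after the session setup stays exactly as in the method above):

- (void)startRecording {
    // configure and activate the AVAudioSession before creating the input queue (see UPDATE);
    // without this, the input callback delivered 0 packets on the first recording on iOS 10
    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    if (![session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error] ||
        ![session setActive:YES error:&error]) {
        NSLog(@"Audio session setup failed: %@", error);
        return;
    }
    [self setupAudioFormat:&recordState.dataFormat];
    // ... queue creation, buffer allocation, and AudioQueueStart follow as above ...
}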

For the playback part, the callback function calls AudioFileReadPacketData to read the audio file (see the code below). Again, on iOS 9 ioNumPackets is always non-zero, but on iOS 10 ioNumPackets is always 0, so nothing is played back on iOS 10.

Here is some of the playback code:

The callback function:

static void AudioOutputCallback(void *inUserData, AudioQueueRef inAQ, AudioQueueBufferRef inBuffer){
    NSLog(@"Output callback called");
    PlayState *aqData = (PlayState *)inUserData;
    if (aqData->isPlaying == 0) return;
    UInt32 numBytesReadFromFile;
    UInt32 numPackets = aqData->numPacketsToRead;
    AudioFileReadPacketData(aqData->audioFile, false, &numBytesReadFromFile, aqData->packetDesc, aqData->currentPacket, &numPackets, inBuffer->mAudioData);
    NSLog(@"outNumPackets = %u", (unsigned int)numPackets);
    if (numPackets > 0) {
        // tell the queue how many bytes were actually read before enqueuing (as in Apple's guide)
        inBuffer->mAudioDataByteSize = numBytesReadFromFile;
        AudioQueueEnqueueBuffer(aqData->queue, inBuffer, aqData->packetDesc ? numPackets : 0, aqData->packetDesc);
        aqData->currentPacket += numPackets;
    } else {
        AudioQueueStop(aqData->queue, false);
        aqData->isPlaying = false;
    }
}
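
Because numPackets comes back as 0 here on iOS 10, it is also worth checking the OSStatus returned by AudioFileReadPacketData, to see whether the read fails outright or simply returns no data. A diagnostic variant of the read call inside the callback above:

OSStatus readStatus = AudioFileReadPacketData(aqData->audioFile, false, &numBytesReadFromFile, aqData->packetDesc, aqData->currentPacket, &numPackets, inBuffer->mAudioData);
if (readStatus != noErr) {
    NSLog(@"AudioFileReadPacketData failed with status %d", (int)readStatus);
}
NSLog(@"Read %u packets, %u bytes", (unsigned int)numPackets, (unsigned int)numBytesReadFromFile);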

The start-playing function:

- (void)startPlaying{
    playState.currentPacket = 0;
    [self openFile];
    UInt32 dataFormatSize = sizeof(playState.dataFormat);

    AudioFileGetProperty(playState.audioFile, kAudioFilePropertyDataFormat, &dataFormatSize, &playState.dataFormat);

    OSStatus status;
    status = AudioQueueNewOutput(&playState.dataFormat, AudioOutputCallback, &playState, CFRunLoopGetCurrent(), kCFRunLoopCommonModes, 0, &playState.queue);
    if (status == 0) {
        playState.isPlaying = true;
        UInt32 maxPacketSize;
        UInt32 propertySize = sizeof(maxPacketSize);
        AudioFileGetProperty(playState.audioFile, kAudioFilePropertyPacketSizeUpperBound, &propertySize, &maxPacketSize);

        DeriveBufferSize(playState.dataFormat, maxPacketSize, 0.5, &playState.bufferByteSize, &playState.numPacketsToRead);

        bool isFormatVBR = (playState.dataFormat.mBytesPerPacket == 0 || playState.dataFormat.mFramesPerPacket == 0);
        if (isFormatVBR) {
            // VBR formats need packet descriptions; CBR formats do not
            playState.packetDesc = (AudioStreamPacketDescription *)malloc(playState.numPacketsToRead * sizeof(AudioStreamPacketDescription));
        } else {
            playState.packetDesc = NULL;
        }
        // set a magic cookie for the playback audio queue
        MyCopyEncoderCookieToQueue(playState.audioFile, playState.queue);

        for (int i = 0; i < NUM_BUFFERS; i++) {
            AudioQueueAllocateBuffer(playState.queue, playState.bufferByteSize, &playState.buffers[i]);
            playState.buffers[i]->mAudioDataByteSize = playState.bufferByteSize;
            // prime the queue by calling the output callback directly to fill each buffer
            AudioOutputCallback(&playState, playState.queue, playState.buffers[i]);
        }
        // kAudioQueueParam_Volume expects a value in the 0.0 to 1.0 range
        Float32 gain = 1.0;
        AudioQueueSetParameter(playState.queue, kAudioQueueParam_Volume, gain);
        AudioQueueStart(playState.queue, NULL);
    }
}
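
Since configuring the audio session fixed the recording path, one thing still worth trying for playback is activating the session before AudioQueueNewOutput as well. This is speculation rather than a verified fix, but it mirrors the recording setup that works:

- (void)startPlaying {
    // speculative: mirror the recording fix by configuring the session before creating the queue
    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    if (![session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error] ||
        ![session setActive:YES error:&error]) {
        NSLog(@"Audio session setup failed: %@", error);
        return;
    }
    playState.currentPacket = 0;
    [self openFile];
    // ... the rest of the method is unchanged ...
}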

This incompatibility has been troubling me for days. If you need more details, please feel free to ask. I hope someone can help me. Thanks a lot.

0 Answers:

No answers yet