Why does my multichannel mixer no longer play in iOS 8?

Asked: 2015-01-05 02:30:43

Tags: ios ios8 midi coremidi

I have written code to play multi-instrument General MIDI files on iOS. It works fine in iOS 7, but stopped working on iOS 8.

I've stripped it down to its essence here. Instead of creating 16 channels on my multichannel mixer, I create just one sampler node and map all the tracks onto that channel. It still exhibits the same problem as the multi-sampler version. None of the Audio Toolbox calls return an error code (they all return 0), on either iOS 7 or iOS 8. The sequence plays through the speakers on iOS 7, both in the Simulator and on iPhone/iPad devices. Running exactly the same code on the iOS 8 Simulator or on iPhone/iPad devices produces no sound.

If you comment out the call to [self initGraphFromMIDISequence], the default sine-wave sound plays on iOS 8.

@implementation MyMusicPlayer {
    MusicPlayer _musicPlayer;
    MusicSequence _musicSequence;
    AUGraph _processingGraph;
}

- (void)playMidi:(NSURL*)midiFileURL {
    NewMusicSequence(&_musicSequence);
    // __bridge cast avoids leaking the URL (CFBridgingRetain here would never be released).
    MusicSequenceFileLoad(_musicSequence, (__bridge CFURLRef)midiFileURL, 0, 0);

    NewMusicPlayer(&_musicPlayer);
    MusicPlayerSetSequence(_musicPlayer, _musicSequence);

    [self initGraphFromMIDISequence];

    MusicPlayerPreroll(_musicPlayer);
    MusicPlayerStart(_musicPlayer);
}

// Sets up an AUGraph with one channel whose instrument is loaded from a sound bank.
// Maps all the tracks of the MIDI sequence onto that channel.  Basically this is a
// way to replace the default sine-wave sound with another (single) instrument.
- (void)initGraphFromMIDISequence {
    NewAUGraph(&_processingGraph);

    // Add one sampler unit to the graph.
    AUNode samplerNode;
    AudioComponentDescription cd = {};
    cd.componentManufacturer = kAudioUnitManufacturer_Apple;
    cd.componentType = kAudioUnitType_MusicDevice;
    cd.componentSubType = kAudioUnitSubType_Sampler;
    AUGraphAddNode(_processingGraph, &cd, &samplerNode);

    // Add a Mixer unit node to the graph
    cd.componentType = kAudioUnitType_Mixer;
    cd.componentSubType = kAudioUnitSubType_MultiChannelMixer;
    AUNode mixerNode;
    AUGraphAddNode(_processingGraph, &cd, &mixerNode);

    // Add the Output unit node to the graph
    cd.componentType = kAudioUnitType_Output;
    cd.componentSubType = kAudioUnitSubType_RemoteIO; // Output to speakers.
    AUNode ioNode;
    AUGraphAddNode(_processingGraph, &cd, &ioNode);

    AUGraphOpen(_processingGraph);

    // Obtain the mixer unit instance from its corresponding node, and set the bus count to 1.
    AudioUnit mixerUnit;
    AUGraphNodeInfo(_processingGraph, mixerNode, NULL, &mixerUnit);
    UInt32 const numChannels = 1;
    AudioUnitSetProperty(mixerUnit,
                         kAudioUnitProperty_ElementCount,
                         kAudioUnitScope_Input,
                         0,
                         &numChannels,
                         sizeof(numChannels));

    // Connect the sampler node's output 0 to the mixer node's input 0.
    AUGraphConnectNodeInput(_processingGraph, samplerNode, 0, mixerNode, 0);

    // Connect the mixer unit to the output unit.
    AUGraphConnectNodeInput(_processingGraph, mixerNode, 0, ioNode, 0);

    // Obtain reference to the audio unit from its node.
    AudioUnit samplerUnit;
    AUGraphNodeInfo(_processingGraph, samplerNode, 0, &samplerUnit);
    MusicSequenceSetAUGraph(_musicSequence, _processingGraph);

    // Set the destination for each track to our single sampler node.
    UInt32 trackCount;
    MusicSequenceGetTrackCount(_musicSequence, &trackCount);
    MusicTrack track;
    for (int i = 0; i < trackCount; i++) {
      MusicSequenceGetIndTrack(_musicSequence, i, &track);
      MusicTrackSetDestNode(track, samplerNode);
    }

    // You can use either a DLS or an SF2 file bundled with your app; both work in iOS 7.
    //NSString *soundBankPath = [[NSBundle mainBundle] pathForResource:@"GeneralUserv1.44" ofType:@"sf2"];
    NSString *soundBankPath = [[NSBundle mainBundle] pathForResource:@"gs_instruments" ofType:@"dls"];
    NSURL *bankURL = [NSURL fileURLWithPath:soundBankPath];
    AUSamplerBankPresetData bpdata;
    bpdata.bankURL  = (__bridge CFURLRef) bankURL;
    bpdata.bankMSB  = kAUSampler_DefaultMelodicBankMSB;
    bpdata.bankLSB  = kAUSampler_DefaultBankLSB;
    UInt8 instrumentNumber = 46;  // pick any GM instrument, 0-127
    bpdata.presetID = instrumentNumber;
    AudioUnitSetProperty(samplerUnit,
                         kAUSamplerProperty_LoadPresetFromBank,
                         kAudioUnitScope_Global,
                         0,
                         &bpdata,
                         sizeof(bpdata));
}
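
For context, the player is driven by something like the following (the resource name here is just an illustrative placeholder, not the actual file):

MyMusicPlayer *player = [[MyMusicPlayer alloc] init];
NSURL *midiURL = [[NSBundle mainBundle] URLForResource:@"song" withExtension:@"mid"]; // placeholder name
[player playMidi:midiURL];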

I have some code, not included here, that polls to see whether the sequence is still playing by calling MusicPlayerGetTime on the MusicPlayer instance. On iOS 7, each call returns the number of seconds elapsed since playback started. On iOS 8, the call always returns 0, which presumably means the MusicPlayer never starts playing the sequence when MusicPlayerStart is called.
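
For reference, the polling looks roughly like this sketch (the NSTimer setup and logging are illustrative assumptions; only the MusicPlayerGetTime call reflects the actual code):

// Illustrative sketch of the polling described above; the timer and logging
// details are assumptions, not the original code.
- (void)startPollingPlaybackTime {
    [NSTimer scheduledTimerWithTimeInterval:1.0
                                     target:self
                                   selector:@selector(logPlaybackTime:)
                                   userInfo:nil
                                    repeats:YES];
}

- (void)logPlaybackTime:(NSTimer *)timer {
    MusicTimeStamp now = 0;
    MusicPlayerGetTime(_musicPlayer, &now);
    NSLog(@"sequence time: %f", now);  // advances on iOS 7, stays at 0 on iOS 8
}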

The code above is highly order-dependent: you have to make certain calls before others; for example, opening the graph before calling getInfo on a node, and not loading the instruments until after the tracks have been assigned to channels. I have followed all the advice in other StackOverflow threads, and I have verified that getting the order right makes the error codes go away. A small helper I use to surface those return codes is sketched below.
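
(A hypothetical example of that checking pattern, not the original code:)

// Hypothetical helper illustrating how the OSStatus results were checked;
// every call listed above returns noErr (0) on both iOS 7 and iOS 8.
static void CheckStatus(OSStatus status, NSString *operation) {
    if (status != noErr) {
        NSLog(@"%@ failed with OSStatus %d", operation, (int)status);
    }
}

// Example usage:
//   CheckStatus(AUGraphOpen(_processingGraph), @"AUGraphOpen");
//   CheckStatus(MusicPlayerStart(_musicPlayer), @"MusicPlayerStart");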

Do any iOS MIDI experts know what might have changed between iOS 7 and iOS 8 that would cause this code to stop working?

1 Answer:

Answer 0 (score: 1)

In iOS 8, Apple introduced AVAudioEngine, a sleek Objective-C abstraction over the Core Audio APIs. You should check it out: https://developer.apple.com/videos/wwdc/2014/#502
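As a minimal sketch of what that looks like, here is an AVAudioUnitSampler attached to an engine and loaded from the same gs_instruments.dls sound bank used in the question; the method name, program number, and test note are illustrative assumptions, and driving a full MusicSequence through the engine still needs extra plumbing on iOS 8, so this stops at the setup plus a single test note:

#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>

- (void)playWithAVAudioEngine {
    NSError *error = nil;

    // Build the engine and attach a sampler; the engine supplies the mixer
    // and output nodes that the AUGraph code above created by hand.
    AVAudioEngine *engine = [[AVAudioEngine alloc] init];
    AVAudioUnitSampler *sampler = [[AVAudioUnitSampler alloc] init];
    [engine attachNode:sampler];
    [engine connect:sampler to:engine.mainMixerNode format:nil];

    // Load an instrument from the same DLS sound bank used in the question.
    NSURL *bankURL = [[NSBundle mainBundle] URLForResource:@"gs_instruments"
                                             withExtension:@"dls"];
    [sampler loadSoundBankInstrumentAtURL:bankURL
                                  program:46
                                  bankMSB:kAUSampler_DefaultMelodicBankMSB
                                  bankLSB:kAUSampler_DefaultBankLSB
                                    error:&error];

    if ([engine startAndReturnError:&error]) {
        // Sanity check: trigger a note directly on the sampler.
        [sampler startNote:60 withVelocity:100 onChannel:0];  // middle C
    }
}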