Mixer AudioUnit to RemoteIO AudioUnit

Time: 2014-08-02 13:28:04

Tags: ios core-audio audiounit

I have a mixer AudioUnit whose render callback works fine, but now I need to route its output to a RemoteIO unit, because I am integrating a framework that requires a working RemoteIO AudioUnit.

So I need the same output, still produced by this mixer AudioUnit, but delivered through another AudioUnit of type kAudioUnitSubType_RemoteIO.

Please help!

Edit: here is the code I tried. Edit 2: the iOUnitDescription definition has been added.

AudioComponentDescription iOUnitDescription;
iOUnitDescription.componentType          = kAudioUnitType_Output;
iOUnitDescription.componentSubType       = kAudioUnitSubType_RemoteIO;
iOUnitDescription.componentManufacturer  = kAudioUnitManufacturer_Apple;
iOUnitDescription.componentFlags         = 0;
iOUnitDescription.componentFlagsMask     = 0;


AudioComponent foundIoUnitReference = AudioComponentFindNext (
                                                              NULL,
                                                              &iOUnitDescription
                                                              );
AudioComponentInstanceNew (
                           foundIoUnitReference,
                           &audioUnit
                           );

result = AudioUnitSetProperty (
                               audioUnit,
                               kAudioUnitProperty_StreamFormat,
                               kAudioUnitScope_Input,
                               guitarBus,
                               &stereoStreamFormat,
                               sizeof (stereoStreamFormat)
                               );

if (noErr != result) {
    [self printErrorMessage: @"AudioUnitSetProperty (set mixer unit guitar input bus stream format)" withStatus: result];
    return;
}

result = AudioUnitSetProperty (
                               audioUnit,
                               kAudioUnitProperty_SampleRate,
                               kAudioUnitScope_Output,
                               0,
                               &graphSampleRate,
                               sizeof (graphSampleRate)
                               );
if (noErr != result) {
    [self printErrorMessage: @"AudioUnitSetProperty (set AUDIOUNIT unit output stream format)" withStatus: result];
    return;
}


AudioUnitElement mixerUnitOutputBus  = 0;
AudioUnitElement ioUnitOutputElement = 0;

AudioUnitConnection mixerOutToIoUnitIn;
mixerOutToIoUnitIn.sourceAudioUnit    = mixerUnit;
mixerOutToIoUnitIn.sourceOutputNumber = mixerUnitOutputBus;
mixerOutToIoUnitIn.destInputNumber    = ioUnitOutputElement;

AudioUnitSetProperty (
                      audioUnit,                     // connection destination
                      kAudioUnitProperty_MakeConnection,  // property key
                      kAudioUnitScope_Input,              // destination scope
                      ioUnitOutputElement,                // destination element
                      &mixerOutToIoUnitIn,                // connection definition
                      sizeof (mixerOutToIoUnitIn)
                      );
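
I assume that once the connection is made I still have to initialize both units and start the RemoteIO unit before any audio flows. This is what I have for that part (not sure this is where my problem is):

// my assumption: the connected units must be initialized and the
// RemoteIO (output) unit started before the connection produces audio
result = AudioUnitInitialize (mixerUnit);
if (noErr != result) {
    [self printErrorMessage: @"AudioUnitInitialize (mixer unit)" withStatus: result];
    return;
}

result = AudioUnitInitialize (audioUnit);
if (noErr != result) {
    [self printErrorMessage: @"AudioUnitInitialize (RemoteIO unit)" withStatus: result];
    return;
}

result = AudioOutputUnitStart (audioUnit);
if (noErr != result) {
    [self printErrorMessage: @"AudioOutputUnitStart (RemoteIO unit)" withStatus: result];
    return;
}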

1 Answer:

Answer 0 (score: 2)

I really need more information. From what you posted I can see you have a mixer and a guitarBus, presumably your input (which appears to be a stream). What is the definition of iOUnitDescription? More importantly, where do you hook up your renderCallback, what do you do inside the callback, and what does the framework expect?
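
To clarify what I mean by hooking up a renderCallback, it usually looks something like this (just a sketch; the mixer input bus number and the name myRenderCallback are placeholders you would replace with your own):

    // placeholder callback with the standard AURenderCallback signature;
    // it must fill ioData with inNumberFrames frames of audio for this bus
    static OSStatus myRenderCallback(void                        *inRefCon,
                                     AudioUnitRenderActionFlags  *ioActionFlags,
                                     const AudioTimeStamp        *inTimeStamp,
                                     UInt32                       inBusNumber,
                                     UInt32                       inNumberFrames,
                                     AudioBufferList             *ioData)
    {
        // write your samples into ioData here
        return noErr;
    }

    // attach the callback to one input bus of the mixer
    AURenderCallbackStruct callbackStruct;
    callbackStruct.inputProc       = myRenderCallback;
    callbackStruct.inputProcRefCon = (__bridge void *)self;

    OSStatus status = AudioUnitSetProperty (mixerUnit,
                                            kAudioUnitProperty_SetRenderCallback,
                                            kAudioUnitScope_Input,
                                            0,                    // mixer input bus
                                            &callbackStruct,
                                            sizeof (callbackStruct));
    if (status != noErr) {
        // handle the error
    }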

Generally, when I need to process audio I build my own graph, and I implement it as its own class for better portability. This should give you a good starting point.

Here is how I would implement that kind of solution.

 // header file

  #import <Foundation/Foundation.h>
  #import <AudioToolbox/AudioToolbox.h>

  @interface MDMixerGraph : NSObject {
    AUGraph graph;
    AudioUnit mixerUnit;
    AudioUnit inputUnit;
    AudioUnit rioUnit;
  }
  // the RemoteIO unit is exposed so callers can hand it to a framework (used as self.mixer.rioUnit below)
  @property (nonatomic, readonly) AudioUnit rioUnit;
  -(void) setupAUGraph;
  @end

  // implementation

  @implementation MDMixerGraph

  @synthesize rioUnit;

  // exception helper
  void MDThrowOnError(OSStatus status){
    if (status != noErr) {
        @throw [NSException exceptionWithName:@"MDMixerException"
                                       reason:[NSString stringWithFormat:@"Status error %d.", (int)status]
                                     userInfo:nil];
    }
  }


   // helper method for setting up graph nodes
   OSStatus MDAdAUGraphdNode(OSType inComponentType, OSType inComponentSubType, AUGraph inGraph, AUNode *outNode)
   {
     AudioComponentDescription desc;
     desc.componentType = inComponentType;
     desc.componentSubType = inComponentSubType;
     desc.componentFlags = 0;
     desc.componentFlagsMask = 0;
     desc.componentManufacturer = kAudioUnitManufacturer_Apple;
     return AUGraphAddNode(inGraph, &desc, outNode);
   }

   // setup method to init and start the AUGraph
   -(void) setupAUGraph{

     // Create the graph
     MDThrowOnError(NewAUGraph(&graph));

     // Add the Audio Units (nodes) to the graph
     AUNode inputNode, rioNode, mixerNode;

     // Input node -- this may need to be a different type to accept your stream (not enough info above)
     MDThrowOnError(MDAdAUGraphdNode(kAudioUnitType_Output, kAudioUnitSubType_RemoteIO, graph, &inputNode));

     // RemoteIO node - your output node
     MDThrowOnError(MDAdAUGraphdNode(kAudioUnitType_Output, kAudioUnitSubType_RemoteIO, graph, &rioNode));

     // Mixer node - depending on your inputs and outputs, change the mixer sub-type here;
     // you can configure additional nodes depending on what you need
     MDThrowOnError(MDAdAUGraphdNode(kAudioUnitType_Mixer, kAudioUnitSubType_AU3DMixerEmbedded, graph, &mixerNode));

     // Open the graph
     MDThrowOnError(AUGraphOpen(graph));

     // We need a ref to the Audio Units, so grab all of them here
     MDThrowOnError(AUGraphNodeInfo(graph, inputNode, NULL, &inputUnit));
     MDThrowOnError(AUGraphNodeInfo(graph, rioNode, NULL, &rioUnit));
     MDThrowOnError(AUGraphNodeInfo(graph, mixerNode, NULL, &mixerUnit));

     // Set up the connections, input to output of the graph.
     // The graph looks like inputNode -> mixerNode -> rioNode
     MDThrowOnError(AUGraphConnectNodeInput(graph, inputNode, 0, mixerNode, 0));
     MDThrowOnError(AUGraphConnectNodeInput(graph, mixerNode, 0, rioNode, 0));

     // Initialize the graph
     MDThrowOnError(AUGraphInitialize(graph));

     // Do any other setup for your stream here

     // Finally, start the graph
     MDThrowOnError(AUGraphStart(graph));
   }

   @end
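
When you are done with the graph you will also want to tear it down. Something along these lines could be added to the class (just a sketch; the method name teardownAUGraph is my own suggestion):

   // balance setupAUGraph: stop, uninitialize and dispose of the graph
   -(void) teardownAUGraph{
     MDThrowOnError(AUGraphStop(graph));
     MDThrowOnError(AUGraphUninitialize(graph));
     MDThrowOnError(AUGraphClose(graph));
     MDThrowOnError(DisposeAUGraph(graph));
     graph = NULL;
   }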

In your View Controller's class extension, you simply need:

  // import MDMixerGraph.h and declare a property for the mixer class
  @property (nonatomic, strong) MDMixerGraph *mixer;

and in the implementation:

  self.mixer = [[MDMixerGraph alloc] init];
  [self.mixer setupAUGraph];

You now have a reference to the rioUnit (self.mixer.rioUnit) to pass to your framework. Without knowing more about the connections/processing your framework requires, this is the best I can do for you.
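
For example, if the framework only needs to observe the final output, you could also tap the RemoteIO unit yourself with a render notify (again just a sketch; myTapCallback is a placeholder):

  // placeholder callback with the standard AURenderCallback signature;
  // a render notify fires before and after the RemoteIO unit renders each buffer
  static OSStatus myTapCallback(void                        *inRefCon,
                                AudioUnitRenderActionFlags  *ioActionFlags,
                                const AudioTimeStamp        *inTimeStamp,
                                UInt32                       inBusNumber,
                                UInt32                       inNumberFrames,
                                AudioBufferList             *ioData)
  {
      if (*ioActionFlags & kAudioUnitRenderAction_PostRender) {
          // ioData now holds the rendered output for this buffer
      }
      return noErr;
  }

  // somewhere after [self.mixer setupAUGraph]:
  AudioUnitAddRenderNotify(self.mixer.rioUnit, myTapCallback, (__bridge void *)self);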

Cheers!