How can I record a MIDI file from a MIDI input callback on iOS?

Asked: 2014-05-14 19:35:49

Tags: ios objective-c midi coremidi

I am trying to record a MIDI file with my iPad, which is fed by the USB output of my digital piano.

I have read the Apple Core MIDI documentation, and I understand that to record a file I should create a MusicSequence. So I tried to do that, but it doesn't work :(

Here is my code.

First, I set up the MIDI connection:

-(void) setupMIDI {
    MIDIClientRef client = nil;
    MIDIClientCreate(CFSTR("Core MIDI to System Sounds Demo"), MyMIDINotifyProc, (__bridge void *)(self), &client);

    inputPort = nil;
    MIDIInputPortCreate(client, CFSTR("Input port"), MyMIDIReadProc, (__bridge void *)(self), &inputPort);

    sequence = nil;
    NewMusicSequence(&sequence);

    unsigned long sourceCount = MIDIGetNumberOfSources();
    [self appendToTextView:[NSString stringWithFormat:@"%ld sources\n", sourceCount]];
    for (int i = 0; i < sourceCount; ++i) {
        MIDIEndpointRef src = MIDIGetSource(i);
        CFStringRef endpointName = NULL;
        OSStatus nameErr = MIDIObjectGetStringProperty(src, kMIDIPropertyName, &endpointName);
        if (noErr == nameErr) {
            [self appendToTextView:[NSString stringWithFormat:@"  source %d: %@\n", i, endpointName]];
        }
        MIDIPortConnectSource(inputPort, src, NULL);
        MusicSequenceSetMIDIEndpoint(sequence, src);
    }
}

After that, I receive the MIDI events in MyMIDIReadProc, the callback function of my input port:

static void MyMIDIReadProc(const MIDIPacketList *pktlist, void *refCon, void *connRefCon)
{
    AppViewController *vc = (__bridge AppViewController *)refCon;

    MIDIPacket *packet = (MIDIPacket *)pktlist->packet;
    for (int i = 0; i < pktlist->numPackets; i++) {
        Byte midiStatus = packet->data[0];
        Byte midiCommand = midiStatus >> 4;
        // is it a note-on or note-off?
        if ((midiCommand == 0x09) || (midiCommand == 0x08)) {
            Byte note = packet->data[1] & 0x7F;
            Byte velocity = packet->data[2] & 0x7F;
            NSLog(@"midiCommand=%d. Note=%d, Velocity=%d\n", midiCommand, note, velocity);

            MIDINoteMessage noteMessage;
            noteMessage.channel = midiStatus & 0x0F;
            noteMessage.note = note;
            noteMessage.velocity = velocity;
            noteMessage.releaseVelocity = 0;
            noteMessage.duration = 0;

            MusicTrackNewMIDINoteEvent(vc->musicTrack, packet->timeStamp, &noteMessage);
        }
        // advance to the next packet outside the if, for every iteration
        packet = MIDIPacketNext(packet);
    }
}

I try to convert the MIDIPacketList into MIDINoteMessages and add them to my track.

When recording is done, I create the file with this function:

-(void) createMidiFile
{
    // init sequence
    NewMusicSequence(&sequence);
    CFURLRef pathUrl = (__bridge CFURLRef)[NSURL fileURLWithPath:self.path];

    // add a track to the sequence
    MusicSequenceNewTrack(sequence, &musicTrack);

    // write the sequence to a file
    MusicSequenceFileCreate(sequence,
                            pathUrl,
                            kMusicSequenceFile_MIDIType,
                            kMusicSequenceFileFlags_EraseFile,
                            0);
}

The file is created, but the data in it is wrong: it comes out the same size every time.

Thanks for any help debugging this! I don't understand what I have to do to fill the track and sequence objects so that a valid MIDI file is produced...

Sorry for my English... :)

1 Answer:

Answer 0 (score: 0)

I am trying to solve the same problem. From what I can see, a MIDINoteMessage needs a duration, which corresponds to the delta between the note-on and the subsequent note-off call. You have to keep track of this yourself.

The callback should be executed on the main thread, and you need to use CACurrentMediaTime to timestamp the MIDI events before writing out the MIDI file. Some code below.

Another alternative approach, from the Apple forums:

"Create a MusicSequence, add a MusicTrack to it, add some MIDI events to the track via MusicTrackNewMIDINoteEvent, set the MusicSequence on a newly created MusicPlayer, and start the player. Now that the player is playing, you can query the current time in beats via the MusicPlayerGetTime function. Use that time as the MusicTimeStamp for the MIDI messages you send to MusicTrackNewMIDINoteEvent.

**Important note: you must populate the MusicTrack of the MusicPlayer you want to query for timestamps, or it will not play! You will get an error (probably -50, depending on whether you set everything else up correctly). I did this with a loop, adding messages with timestamps from zero up to four. I suspect you don't even have to go that high, but the track must contain something for the player to play. Don't worry about it running off the end, because all we want is the MusicTimeStamp. The MusicPlayer will keep playing until you tell it to stop."

This code is the closest I have come to an answer; note that it is for iOS.

https://github.com/benweitzman/ReTune/blob/ee47009999298c2b03527302c3fb6d7be17b10e2/Return4/ViewController.m

@interface NoteObject : NSObject

@property (nonatomic) int time;
@property (nonatomic) int note;
@property (nonatomic) int velocity;
@property (nonatomic) bool noteOn;

@end

- (void) midiSource:(PGMidiSource*)midi midiReceived:(const MIDIPacketList *)packetList
{
    [self performSelectorOnMainThread:@selector(addString:)
                           withObject:@"MIDI received:"
                        waitUntilDone:NO];

    const MIDIPacket *packet = &packetList->packet[0];
    for (int i = 0; i < packetList->numPackets; ++i)
    {
        //[self performSelectorOnMainThread:@selector(addString:)
        //                      withObject:[self StringFromPacket:packet]
        //                    waitUntilDone:NO];
        if (packet->length == 3) {
            if ((packet->data[0]&0xF0) == 0x90) {
                if (packet->data[2] != 0) {
                    [self noteOn:packet->data[1] withVelocity:packet->data[2]];
                } else {
                    [self noteOff:packet->data[1]];
                }
            } else if ((packet->data[0]&0xF0) == 0x80) {
                [self noteOff:packet->data[1]];
            }
        }
        packet = MIDIPacketNext(packet);
    }
}


- (void) noteOff:(int)noteValue {
    //NSLog(@"off");
    if (noteValue>=0 && noteValue<127) {
        ALSource * source = [sources objectAtIndex:noteValue];
        ALSource *loopSource = [loopSources objectAtIndex:noteValue];
        if (source.playing || loopSource.playing) {
            [[fadingOut objectAtIndex:noteValue] release];
            [fadingOut replaceObjectAtIndex:noteValue withObject:[[NSNumber alloc] initWithBool:YES]];
            dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
                float timeDone = 0;
                float duration = 0.2;
                float timeStep = 0.01;
                float valStep = source.gain*timeStep/duration;
                float loopStep = loopSource.gain*timeStep/duration;
                while (timeDone < duration) {
                    if (![[fadingOut objectAtIndex:noteValue] boolValue]) break;
                    source.gain -= valStep;
                    loopSource.gain -= loopStep;
                    [NSThread sleepForTimeInterval:timeStep];
                    timeDone += timeStep;
                }
                if ([[fadingOut objectAtIndex:noteValue] boolValue]) {
                    [source stop];
                    [loopSource stop];
                }
                //source.gain = 1;
            });

            if (recording) {
                double currentTime = CACurrentMediaTime();
                int deltaTime = (int)(currentTime*1000-recordTimer*1000);
                NoteObject * recordedNote = [[NoteObject alloc] init];
                recordedNote.note = noteValue;
                recordedNote.time = deltaTime;
                recordedNote.noteOn = false;
                [recordedNotes addObject:recordedNote];
                recordTimer = currentTime;
            }
        }
    }
}

- (void) finishFadeIn:(ALSource*)source {

}

- (void) noteOn:(int)noteValue withVelocity:(int)velocity {
    if (noteValue>=0 && noteValue<127) {
        if (recording) {
            double currentTime = CACurrentMediaTime();
            int deltaTime = (int)(currentTime*1000-recordTimer*1000);
            NoteObject * recordedNote = [[NoteObject alloc] init];
            recordedNote.note = noteValue;
            recordedNote.time = deltaTime;
            recordedNote.noteOn = true;
            [recordedNotes addObject:recordedNote];
            recordTimer = currentTime;
        }
        while(loadingScale || changingPitch);
        float pitchToPlay = [[ratios objectAtIndex:noteValue] floatValue];
        [[fadingOut objectAtIndex:noteValue] release];
        [fadingOut replaceObjectAtIndex:noteValue withObject:[[NSNumber alloc] initWithBool:NO]];
        ALSource * source = [sources objectAtIndex:noteValue];
        [source stop];
        source.gain = velocity/127.0f;
        source.pitch = pitchToPlay;
        [source play:[buffers objectAtIndex:noteValue]];
        if ([loopBuffers objectAtIndex:noteValue] != (id)[NSNull null]) {
            ALSource *loopSource = [loopSources objectAtIndex:noteValue];
            [loopSource stop];
            loopSource.gain = 0;
            loopSource.pitch = source.pitch;
            [loopSource play:[loopBuffers objectAtIndex:noteValue] loop:YES];
            dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
                float timeDone = 0;
                float duration = [(ALBuffer*)[buffers objectAtIndex:noteValue] duration]-.4;
                float timeStep = 0.01;
                float valStep = source.gain*timeStep/duration;
                float loopStep = valStep;
                while (timeDone < duration) {
                    if ([[fadingOut objectAtIndex:noteValue] boolValue]) break;
                    source.gain -= valStep;
                    loopSource.gain += loopStep;
                    [NSThread sleepForTimeInterval:timeStep];
                    timeDone += timeStep;
                }
                /*if ([[fadingOut objectAtIndex:noteValue] boolValue]) {
                    [source stop];
                    [loopSource stop];
                }*/
                //source.gain = 1;
            });
        }
        /*
        [source play];*/
        //[[sources objectAtIndex:noteValue] play:toPlay gain:velocity/127.0f pitch:pitchToPlay pan:0.0f loop:FALSE];
    }
}