AVAudioPCMBuffer for music files

Asked: 2015-08-21 18:14:33

Tags: objective-c macos sprite-kit avaudiopcmbuffer

I've been trying to play music in my SpriteKit game, using an AVAudioPlayerNode to do so via AVAudioPCMBuffers. Every time I exported my OS X project, it would crash and give me an error about audio playback. After banging my head against the wall for the last 24 hours, I decided to re-watch WWDC session 501 (see 54:17). My solution to this problem was the presenter's approach of splitting the buffer's frames into smaller pieces to break up the audio file being read.

NSError *error = nil;
NSURL *someFileURL = ...
AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading: someFileURL commonFormat: AVAudioPCMFormatFloat32 interleaved: NO error:&error];
const AVAudioFrameCount kBufferFrameCapacity = 128 * 1024L;
AVAudioFramePosition fileLength = audioFile.length;

AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat: audioFile.processingFormat frameCapacity: kBufferFrameCapacity];
while (audioFile.framePosition < fileLength) {
    AVAudioFramePosition readPosition = audioFile.framePosition;
    if (![audioFile readIntoBuffer: readBuffer error: &error])
        return NO;
    if (readBuffer.frameLength == 0) //end of file reached
        break;
}

My current problem is that the player only plays the last frame that was read into the buffer. The piece of music I'm playing is only 2 minutes long. Apparently, this is too long to just read straight into the buffer. Is the buffer being overwritten each time the readIntoBuffer: method is called inside the loop? I'm such a noob at this stuff... how can I get the whole file to play?

If I can't get this to work, what would be a good way to play music (2 different files) across multiple SKScenes?

1 Answer:

Answer 0 (score: 2)

This is the solution I came up with. It's still not perfect, but hopefully it will help someone who is in the same predicament I was in. I created a singleton class to handle all of this work. One improvement that could be made in the future is to load only the sound effects and music files needed for a particular SKScene when they are needed. I had so many issues with this code that I don't want to mess with it now. Currently, I don't have too many sounds, so it isn't using an excessive amount of memory.

Overview
My strategy was as follows:

  1. Store the game's audio file names in a plist
  2. Read from that plist and create two dictionaries (one for music, one for short sound effects)
  3. The sound-effect dictionary holds an AVAudioPCMBuffer and an AVAudioPlayerNode for each sound
  4. The music dictionary holds an array of AVAudioPCMBuffers, an array of timestamps for when those buffers should be played in the queue, an AVAudioPlayerNode, and the sample rate of the original audio file

    • The sample rate is necessary for determining the time at which each buffer should be played (you will see the calculations done in the code)
  5. Create an AVAudioEngine, get the main mixer from the engine, and connect all the AVAudioPlayerNodes to the mixer (as usual)

  6. Play sound effects or music using the various methods
    • Sound effect playback is simple... call the method -(void) playSfxFile:(NSString*)file; and it plays the sound
    • For music, I just couldn't find a good solution without help from the scene that is trying to play it. The scene calls -(void) playMusicFile:(NSString*)file;, which schedules the buffers to play in the order they were created. I couldn't find a good way to get the music to repeat once finished in my AudioEngine class, so I decided to have the scene check in its update: method whether the music is playing for a particular file and, if not, play it again (not a very slick solution, but it works)
  7. AudioEngine.h

    #import <Foundation/Foundation.h>
    
    @interface AudioEngine : NSObject
    
    +(instancetype)sharedData;
    -(void) playSfxFile:(NSString*)file;
    -(void) playMusicFile:(NSString*)file;
    -(void) pauseMusic:(NSString*)file;
    -(void) unpauseMusic:(NSString*)file;
    -(void) stopMusicFile:(NSString*)file;
    -(void) setVolumePercentages;
    -(bool) isPlayingMusic:(NSString*)file;
    
    @end
    

    AudioEngine.m

    #import "AudioEngine.h"
    #import <AVFoundation/AVFoundation.h>
    #import "GameData.h" //this is a class that I use to store game data (in this case it is being used to get the user preference for volume amount)
    
    @interface AudioEngine()
    
    @property AVAudioEngine *engine;
    @property AVAudioMixerNode *mixer;
    
    @property NSMutableDictionary *musicDict;
    @property NSMutableDictionary *sfxDict;
    
    @property NSString *audioInfoPList;
    
    @property float musicVolumePercent;
    @property float sfxVolumePercent;
    @property float fadeVolume;
    @property float timerCount;
    
    @end
    
    @implementation AudioEngine
    
    int const FADE_ITERATIONS = 10;
    static NSString * const MUSIC_PLAYER = @"player";
    static NSString * const MUSIC_BUFFERS = @"buffers";
    static NSString * const MUSIC_FRAME_POSITIONS = @"framePositions";
    static NSString * const MUSIC_SAMPLE_RATE = @"sampleRate";
    
    static NSString * const SFX_BUFFER = @"buffer";
    static NSString * const SFX_PLAYER = @"player";
    
    +(instancetype) sharedData {
        static AudioEngine *sharedInstance = nil;
    
        static dispatch_once_t onceToken;
        dispatch_once(&onceToken, ^{
            sharedInstance = [[self alloc] init];
            [sharedInstance startEngine];
        });
    
        return sharedInstance;
    }
    
    -(instancetype) init {
        if (self = [super init]) {
            _engine = [[AVAudioEngine alloc] init];
            _mixer = [_engine mainMixerNode];
    
            _audioInfoPList = [[NSBundle mainBundle] pathForResource:@"AudioInfo" ofType:@"plist"]; //open a plist called AudioInfo.plist
    
            [self setVolumePercentages]; //this is created to set the user's preference in terms of how loud sound fx and music should be played
            [self initMusic];
            [self initSfx];
        }
        return self;
    }
    
    //opens all music files, creates multiple buffers depending on the length of the file and a player
    -(void) initMusic {
        _musicDict = [NSMutableDictionary dictionary];
    
        _audioInfoPList = [[NSBundle mainBundle] pathForResource: @"AudioInfo" ofType: @"plist"];
        NSDictionary *audioInfoData = [NSDictionary dictionaryWithContentsOfFile:_audioInfoPList];
    
        for (NSString *musicFileName in audioInfoData[@"music"]) {
            [self loadMusicIntoBuffer:musicFileName];
            AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
            [_engine attachNode:player];
    
            AVAudioPCMBuffer *buffer = [[_musicDict[musicFileName] objectForKey:MUSIC_BUFFERS] objectAtIndex:0];
            [_engine connect:player to:_mixer format:buffer.format];
        [_musicDict[musicFileName] setObject:player forKey:MUSIC_PLAYER];
        }
    }
    
    //opens a music file and creates an array of buffers
    -(void) loadMusicIntoBuffer:(NSString *)filename
    {
        NSURL *audioFileURL = [[NSBundle mainBundle] URLForResource:filename withExtension:@"aif"];
        //NSURL *audioFileURL = [NSURL URLWithString:[[NSBundle mainBundle] pathForResource:filename ofType:@"aif"]];
        NSAssert(audioFileURL, @"Error creating URL to audio file");
        NSError *error = nil;
        AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading:audioFileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:&error];
        NSAssert(audioFile != nil, @"Error creating audioFile, %@", error.localizedDescription);
    
        AVAudioFramePosition fileLength = audioFile.length; //frame length of the audio file
        float sampleRate = audioFile.fileFormat.sampleRate; //sample rate (in Hz) of the audio file
        [_musicDict setObject:[NSMutableDictionary dictionary] forKey:filename];
        [_musicDict[filename] setObject:[NSNumber numberWithDouble:sampleRate] forKey:MUSIC_SAMPLE_RATE];
    
        NSMutableArray *buffers = [NSMutableArray array];
        NSMutableArray *framePositions = [NSMutableArray array];
    
        const AVAudioFrameCount kBufferFrameCapacity = 1024 * 1024L; //the size of my buffer...can be made bigger or smaller 512 * 1024L would be half the size
        while (audioFile.framePosition < fileLength) { //each iteration reads in kBufferFrameCapacity frames of the audio file and stores it in a buffer
            [framePositions addObject:[NSNumber numberWithLongLong:audioFile.framePosition]];
            AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:audioFile.processingFormat frameCapacity:kBufferFrameCapacity];
            if (![audioFile readIntoBuffer:readBuffer error:&error]) {
                NSLog(@"failed to read audio file: %@", error);
                return;
            }
            if (readBuffer.frameLength == 0) { //if we've come to the end of the file, end the loop
                break;
            }
            [buffers addObject:readBuffer];
        }
    
        [_musicDict[filename] setObject:buffers forKey:MUSIC_BUFFERS];
        [_musicDict[filename] setObject:framePositions forKey:MUSIC_FRAME_POSITIONS];
    }
    
    -(void) initSfx {
        _sfxDict = [NSMutableDictionary dictionary];
    
        NSDictionary *audioInfoData = [NSDictionary dictionaryWithContentsOfFile:_audioInfoPList];
    
        for (NSString *sfxFileName in audioInfoData[@"sfx"]) {
            AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
            [_engine attachNode:player];
    
            [self loadSoundIntoBuffer:sfxFileName];
            AVAudioPCMBuffer *buffer = [_sfxDict[sfxFileName] objectForKey:SFX_BUFFER];
            [_engine connect:player to:_mixer format:buffer.format];
            [_sfxDict[sfxFileName] setObject:player forKey:SFX_PLAYER];
        }
    }
    
    //WARNING: make sure that the sound fx file is small (roughly under 30 sec) otherwise the archived version of the app will crash because the buffer ran out of space
    -(void) loadSoundIntoBuffer:(NSString *)filename
    {
        NSURL *audioFileURL = [NSURL URLWithString:[[NSBundle mainBundle] pathForResource:filename ofType:@"mp3"]];
        NSAssert(audioFileURL, @"Error creating URL to audio file");
        NSError *error = nil;
        AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading:audioFileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:&error];
        NSAssert(audioFile != nil, @"Error creating audioFile, %@", error.localizedDescription);
    
        AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:audioFile.processingFormat frameCapacity:(AVAudioFrameCount)audioFile.length];
        [audioFile readIntoBuffer:readBuffer error:&error];
    
        [_sfxDict setObject:[NSMutableDictionary dictionary] forKey:filename];
        [_sfxDict[filename] setObject:readBuffer forKey:SFX_BUFFER];
    }
    
    -(void)startEngine {
        [_engine startAndReturnError:nil];
    }
    
    -(void) playSfxFile:(NSString*)file {
        AVAudioPlayerNode *player = [_sfxDict[file] objectForKey:SFX_PLAYER];
        AVAudioPCMBuffer *buffer = [_sfxDict[file] objectForKey:SFX_BUFFER];
        [player scheduleBuffer:buffer atTime:nil options:AVAudioPlayerNodeBufferInterrupts completionHandler:nil];
        [player setVolume:_sfxVolumePercent];
        [player play];
    }
    
    -(void) playMusicFile:(NSString*)file {
        AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
    
        if ([player isPlaying] == NO) {
            NSArray *buffers = [_musicDict[file] objectForKey:MUSIC_BUFFERS];
    
            double sampleRate = [[_musicDict[file] objectForKey:MUSIC_SAMPLE_RATE] doubleValue];
    
    
            for (int i = 0; i < [buffers count]; i++) {
                long long framePosition = [[[_musicDict[file] objectForKey:MUSIC_FRAME_POSITIONS] objectAtIndex:i] longLongValue];
                AVAudioTime *time = [AVAudioTime timeWithSampleTime:framePosition atRate:sampleRate];
    
                AVAudioPCMBuffer *buffer  = [buffers objectAtIndex:i];
                [player scheduleBuffer:buffer atTime:time options:AVAudioPlayerNodeBufferInterrupts completionHandler:^{
                    if (i == [buffers count] - 1) {
                        [player stop];
                    }
                }];
                [player setVolume:_musicVolumePercent];
                [player play];
            }
        }
    }
    
    -(void) stopOtherMusicPlayersNotNamed:(NSString*)file {
        if ([file isEqualToString:@"menuscenemusic"]) {
            AVAudioPlayerNode *player = [_musicDict[@"levelscenemusic"] objectForKey:MUSIC_PLAYER];
            [player stop];
        }
        else {
            AVAudioPlayerNode *player = [_musicDict[@"menuscenemusic"] objectForKey:MUSIC_PLAYER];
            [player stop];
        }
    }
    
    //stops the player for a particular sound
    -(void) stopMusicFile:(NSString*)file {
        AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
    
        if ([player isPlaying]) {
            _timerCount = FADE_ITERATIONS;
            _fadeVolume = _musicVolumePercent;
            [self fadeOutMusicForPlayer:player]; //fade out the music
        }
    }
    
    //helper method for stopMusicFile:
    -(void) fadeOutMusicForPlayer:(AVAudioPlayerNode*)player {
        [NSTimer scheduledTimerWithTimeInterval:0.1 target:self selector:@selector(handleTimer:) userInfo:player repeats:YES];
    }
    
    //helper method for stopMusicFile:
    -(void) handleTimer:(NSTimer*)timer {
        AVAudioPlayerNode *player = (AVAudioPlayerNode*)timer.userInfo;
        if (_timerCount > 0) {
            _timerCount--;
            _fadeVolume = _musicVolumePercent * (_timerCount / FADE_ITERATIONS);
            [player setVolume:_fadeVolume];
        }
        else {
            [player stop];
            [player setVolume:_musicVolumePercent];
            [timer invalidate];
        }
    }
    
    -(void) pauseMusic:(NSString*)file {
        AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
        if ([player isPlaying]) {
            [player pause];
        }
    }
    
    -(void) unpauseMusic:(NSString*)file {
        AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
        [player play];
    }
    
    //sets the volume of the player based on user preferences in GameData class
    -(void) setVolumePercentages {
        NSString *musicVolumeString = [[GameData sharedGameData].settings objectForKey:@"musicVolume"];
        _musicVolumePercent = [[[musicVolumeString componentsSeparatedByCharactersInSet:
        [[NSCharacterSet decimalDigitCharacterSet] invertedSet]]
        componentsJoinedByString:@""] floatValue] / 100;
        NSString *sfxVolumeString = [[GameData sharedGameData].settings objectForKey:@"sfxVolume"];
        _sfxVolumePercent = [[[sfxVolumeString componentsSeparatedByCharactersInSet:
        [[NSCharacterSet decimalDigitCharacterSet] invertedSet]]
        componentsJoinedByString:@""] floatValue] / 100;
    
        //immediately sets music to new volume
        for (NSString *file in [_musicDict allKeys]) {
            AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
            [player setVolume:_musicVolumePercent];
        }
    }
    
    -(bool) isPlayingMusic:(NSString *)file {
        AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
        if ([player isPlaying])
            return YES;
        return NO;
    }
    
    @end