How can I get averagePowerForChannel from AVPlayer so I can build an audio visualizer in my music app? I've already finished the visualization part, but I'm stuck on its engine (the real-time per-channel volume). I know this is easy to do with AVAudioPlayer via its meteringEnabled property, but for certain reasons AVPlayer is essential in my app. I'm actually considering running an AVAudioPlayer alongside the AVPlayer just to get the metering values, but that sounds a bit messy; how would it affect performance and stability? Thanks in advance.
Answer 0 (score: 2)
I fought with AVPlayer visualization for about two years. In my case it involved HLS live streaming, and as far as I know you won't get it working in that scenario.
Edit: this does not give you access to the averagePowerForChannel: method, but it does give you access to the raw audio data, which you can analyze (for example with an FFT) to get the information you need.
I did get it working with local playback. You basically wait until the player's current item has a track up and running. At that point you patch an MTAudioProcessingTap into the item's audio mix.
The processing tap runs the callbacks you specify, where you can compute whatever you need from the raw audio data.
Here is a simple example (sorry, it's in ObjC):
#import <AVFoundation/AVFoundation.h>
#import <MediaToolbox/MediaToolbox.h>

void init(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut) {}
void finalize(MTAudioProcessingTapRef tap) {}
void prepare(MTAudioProcessingTapRef tap, CMItemCount maxFrames, const AudioStreamBasicDescription *processingFormat) {}
void unprepare(MTAudioProcessingTapRef tap) {}

void process(MTAudioProcessingTapRef tap, CMItemCount numberFrames, MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut, CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut) {
    // Pull the source audio into bufferListInOut; without this call the tap delivers no data.
    MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, NULL, numberFramesOut);
    // Analyze the samples in bufferListInOut here.
}

- (void)play {
    // player and item setup ...
    [[[self player] currentItem] addObserver:self forKeyPath:@"tracks" options:kNilOptions context:NULL];
}
//////////////////////////////////////////////////////
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"tracks"] && [[object tracks] count] > 0) {
        for (AVPlayerItemTrack *itemTrack in [object tracks]) {
            AVAssetTrack *track = [itemTrack assetTrack];
            if ([[track mediaType] isEqualToString:AVMediaTypeAudio]) {
                [self addAudioProcessingTap:track];
                break;
            }
        }
    }
}
- (void)addAudioProcessingTap:(AVAssetTrack *)track {
    MTAudioProcessingTapRef tap;
    MTAudioProcessingTapCallbacks callbacks;
    callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
    callbacks.clientInfo = (__bridge void *)(self);
    callbacks.init = init;
    callbacks.prepare = prepare;
    callbacks.process = process;
    callbacks.unprepare = unprepare;
    callbacks.finalize = finalize;

    OSStatus err = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks, kMTAudioProcessingTapCreationFlag_PostEffects, &tap);
    if (err) {
        NSLog(@"error: %@", [NSError errorWithDomain:NSOSStatusErrorDomain code:err userInfo:nil]);
        return;
    }

    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
    [inputParams setAudioTapProcessor:tap];
    [audioMix setInputParameters:@[inputParams]];
    [[[self player] currentItem] setAudioMix:audioMix];
}
There was some discussion about this on my question a couple of years back, so be sure to check that out.
Answer 1 (score: -2)
You need to use an audio processor class in combination with AV Foundation to visualize audio samples, as well as to apply a Core Audio audio unit effect (a band-pass filter) to the audio data. You can find a sample project on GitHub.
Basically, you need to add an observer to the AVPlayer's current item, like this:
// Notifications
let playerItem: AVPlayerItem! = videoPlayer.currentItem
playerItem.addObserver(self, forKeyPath: "tracks", options: NSKeyValueObservingOptions.New, context: nil);
NSNotificationCenter.defaultCenter().addObserverForName(AVPlayerItemDidPlayToEndTimeNotification, object: videoPlayer.currentItem, queue: NSOperationQueue.mainQueue(), usingBlock: { (notif: NSNotification) -> Void in
self.videoPlayer.seekToTime(kCMTimeZero)
self.videoPlayer.play()
print("replay")
})
Then handle the notification in the overridden method below:
override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    if object === videoPlayer.currentItem && keyPath == "tracks" {
        if let playerItem = videoPlayer.currentItem where !playerItem.asset.tracks.isEmpty {
            // Attach the tap processor once the item's tracks are available.
            tapProcessor = MYAudioTapProcessor(AVPlayerItem: playerItem)
            playerItem.audioMix = tapProcessor.audioMix
            tapProcessor.delegate = self
        }
    }
}