I have built a video player that analyzes the realtime audio and video tracks of the currently playing video. The videos are stored on the iOS device (in the app's Documents directory).
Everything works fine. I use an MTAudioProcessingTap to get all the audio samples and run an FFT on them, and I analyze the video by copying pixel buffers at the currently played CMTime (the AVPlayer currentTime property). As I said, this works fine.
But now I want to support AirPlay. AirPlay itself is not the problem, but as soon as AirPlay toggles and the video is playing on the Apple TV, my tap stops working. Somehow the MTAudioProcessingTap won't process, and the pixel buffers are all empty... I can't get at the data.
Is there any way to get this data?
To get the pixel buffers, I just fire an event every few milliseconds and retrieve the player's currentTime. Then:
CVPixelBufferRef imageBuffer = [videoOutput copyPixelBufferForItemTime:time itemTimeForDisplay:nil];
if (imageBuffer == NULL) return; // no new frame available for this item time
CVPixelBufferLockBaseAddress(imageBuffer, 0);
uint8_t *tempAddress = (uint8_t *) CVPixelBufferGetBaseAddress(imageBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);
// ... analyze tempAddress here, while the base address is still locked ...
CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
CVPixelBufferRelease(imageBuffer); // "copy" methods return a retained buffer
Here tempAddress points at my pixel data, and videoOutput is an instance of AVPlayerItemVideoOutput.
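For reference, the "event every few milliseconds" can be driven by a CADisplayLink, which fires once per display refresh. A minimal sketch, assuming hypothetical self.player and self.videoOutput properties that hold the AVPlayer and AVPlayerItemVideoOutput:

```objc
#import <AVFoundation/AVFoundation.h>
#import <QuartzCore/QuartzCore.h>

- (void)startPolling {
    // Fires on every display refresh, a common alternative to an NSTimer.
    CADisplayLink *link = [CADisplayLink displayLinkWithTarget:self
                                                      selector:@selector(displayLinkFired:)];
    [link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)displayLinkFired:(CADisplayLink *)link {
    CMTime time = [self.player currentTime];
    // Only copy a buffer when a new frame is actually available.
    if ([self.videoOutput hasNewPixelBufferForItemTime:time]) {
        CVPixelBufferRef imageBuffer =
            [self.videoOutput copyPixelBufferForItemTime:time itemTimeForDisplay:nil];
        // ... lock, analyze, and unlock the buffer as in the snippet above ...
        CVPixelBufferRelease(imageBuffer);
    }
}
```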
For the audio, I use:
AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack];
// Create a processing tap for the input parameters
MTAudioProcessingTapCallbacks callbacks;
callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
callbacks.clientInfo = (__bridge void *)(self);
callbacks.init = init;
callbacks.prepare = prepare;
callbacks.process = process;
callbacks.unprepare = unprepare;
callbacks.finalize = finalize;
MTAudioProcessingTapRef tap;
OSStatus err = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
kMTAudioProcessingTapCreationFlag_PostEffects, &tap);
if (err || !tap) {
NSLog(@"Unable to create the Audio Processing Tap");
return;
}
inputParams.audioTapProcessor = tap;
// Create a new AVAudioMix and assign it to our AVPlayerItem
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = @[inputParams];
playerItem.audioMix = audioMix;
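The callback functions referenced above (init, prepare, process, unprepare, finalize) are not shown in the question. A minimal sketch of what they might look like, with the process callback pulling the samples that would feed the FFT (the FFT itself is omitted):

```objc
#import <MediaToolbox/MediaToolbox.h>

static void init(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut) {
    *tapStorageOut = clientInfo; // make clientInfo (self) available in the other callbacks
}

static void finalize(MTAudioProcessingTapRef tap) {}

static void prepare(MTAudioProcessingTapRef tap, CMItemCount maxFrames,
                    const AudioStreamBasicDescription *processingFormat) {}

static void unprepare(MTAudioProcessingTapRef tap) {}

static void process(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
                    MTAudioProcessingTapFlags flags,
                    AudioBufferList *bufferListInOut,
                    CMItemCount *numberFramesOut,
                    MTAudioProcessingTapFlags *flagsOut) {
    // Pull the source audio into bufferListInOut; without this call the
    // buffers stay empty.
    OSStatus err = MTAudioProcessingTapGetSourceAudio(tap, numberFrames,
                                                      bufferListInOut, flagsOut,
                                                      NULL, numberFramesOut);
    if (err == noErr) {
        // bufferListInOut now holds the samples; hand them to the FFT here.
    }
}
```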
Regards, Niek
Answer 0 (score: 0)
Unfortunately, in my experience it is not possible to get at the audio/video data during AirPlay: playback happens on the Apple TV, so the iOS device has no access to it.
I had the same problem getting SMPTE subtitle data from timedMetaData, which stopped reporting during AirPlay playback.
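While the data itself is out of reach, the app can at least detect when playback has moved to the Apple TV and pause its analysis instead of reading empty buffers. A sketch using KVO on AVPlayer's externalPlaybackActive property (assuming a hypothetical self.player property):

```objc
#import <AVFoundation/AVFoundation.h>

- (void)observeAirplay {
    // externalPlaybackActive is key-value observable on AVPlayer.
    [self.player addObserver:self
                  forKeyPath:@"externalPlaybackActive"
                     options:NSKeyValueObservingOptionNew
                     context:NULL];
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"externalPlaybackActive"]) {
        if (self.player.externalPlaybackActive) {
            // Playback moved to the Apple TV: taps and pixel buffers go silent.
        } else {
            // Back on the device: analysis can resume.
        }
    }
}
```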
Answer 1 (score: -2)
Here is a solution:
This is how to enable AirPlay. I only used this code for audio in my app; I don't know whether you can adapt it for video, but you can try ;)
In AppDelegate.m:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    [RADStyle applyStyle];
    [radiosound superclass];
    [self downloadZip];

    NSError *sessionError = nil;
    [[AVAudioSession sharedInstance] setDelegate:self];
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:&sessionError];
    [[AVAudioSession sharedInstance] setActive:YES error:nil];

    // Note: the AudioSessionSetProperty C API is deprecated since iOS 7;
    // AVAudioSession covers the same functionality.
    UInt32 sessionCategory = kAudioSessionCategory_MediaPlayback;
    AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(sessionCategory), &sessionCategory);
    UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
    AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);

    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    return YES; // the original snippet was missing this return value
}
If you use AirPlay and want lock-screen controls (artwork, play/pause, title, etc.), put the following code in your player's DetailViewController:
- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];

    // Note: this loads the artwork synchronously and will block the main
    // thread for remote URLs; fetching asynchronously is preferable.
    NSData *imageData = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:(self.saved)[@"image"]]];
    if (imageData == nil) {
        MPNowPlayingInfoCenter *infoCenter = [MPNowPlayingInfoCenter defaultCenter];
        MPMediaItemArtwork *albumArt = [[MPMediaItemArtwork alloc] initWithImage:[UIImage imageNamed:@"lockScreen.png"]];
        infoCenter.nowPlayingInfo = @{MPMediaItemPropertyTitle: saved[@"web"],
                                      MPMediaItemPropertyArtist: saved[@"title"],
                                      MPMediaItemPropertyArtwork: albumArt};
    } else {
        MPNowPlayingInfoCenter *infoCenter = [MPNowPlayingInfoCenter defaultCenter];
        MPMediaItemArtwork *albumArt = [[MPMediaItemArtwork alloc] initWithImage:[UIImage imageWithData:imageData]];
        infoCenter.nowPlayingInfo = @{MPMediaItemPropertyTitle: saved[@"link"],
                                      MPMediaItemPropertyArtist: saved[@"title"],
                                      MPMediaItemPropertyArtwork: albumArt};
    }
}
Hope this code helps you ;)