I need an AVAsset object so that I can play it with AVPlayer and AVPlayerLayer. I started with the Photos framework, since AssetsLibrary is deprecated. I'm now at the point where I have an array of PHAsset objects and need to convert them to AVAsset. I tried enumerating through the PHFetchResult and creating a new AVAsset from the PHAsset's localized description, but when I play it, no video shows up.
PHAssetCollection *assetColl = [self scaryVideosAlbum];
PHFetchResult *getVideos = [PHAsset fetchAssetsInAssetCollection:assetColl options:nil];
[getVideos enumerateObjectsUsingBlock:^(PHAsset *asset, NSUInteger idx, BOOL *stop) {
    NSURL *videoUrl = [NSURL URLWithString:asset.localizedDescription];
    AVAsset *avasset = [AVAsset assetWithURL:videoUrl];
    [tempArr addObject:avasset];
}];
I assume the localized description is not the absolute URL of the video.
I also stumbled upon PHImageManager and requestAVAssetForVideo, but when it comes to video, the options parameter has no isSynchronous property the way the image options parameter does.
PHVideoRequestOptions *option = [PHVideoRequestOptions new];
[[PHImageManager defaultManager] requestAVAssetForVideo:videoAsset options:option resultHandler:^(AVAsset * _Nullable avasset, AVAudioMix * _Nullable audioMix, NSDictionary * _Nullable info) {
    // use avasset here
}];
Is there a synchronous method?
Thanks.
Answer 0 (score: 25)
No, there isn't. But you can build a synchronous version:
dispatch_semaphore_t semaphore = dispatch_semaphore_create(0);
PHVideoRequestOptions *option = [PHVideoRequestOptions new];
__block AVAsset *resultAsset;
[[PHImageManager defaultManager] requestAVAssetForVideo:videoAsset options:option resultHandler:^(AVAsset * avasset, AVAudioMix * audioMix, NSDictionary * info) {
    resultAsset = avasset;
    dispatch_semaphore_signal(semaphore);
}];
dispatch_semaphore_wait(semaphore, DISPATCH_TIME_FOREVER);
// yay, we synchronously have the asset
[self doSomethingWithAsset:resultAsset];
However, if you do this on the main thread and requestAVAssetForVideo: takes too long, you risk locking up your UI or even being killed by the iOS watchdog. It's probably safer to restructure your app around the asynchronous callback version. Something like this:
__weak __typeof(self) weakSelf = self;
[[PHImageManager defaultManager] requestAVAssetForVideo:videoAsset options:option resultHandler:^(AVAsset * avasset, AVAudioMix * audioMix, NSDictionary * info) {
    dispatch_async(dispatch_get_main_queue(), ^{
        [weakSelf doSomethingWithAsset:avasset];
    });
}];
Answer 1 (score: 6)
For Swift 2, you can easily play a video from a PHAsset using the following method:
import UIKit
import Photos
import AVFoundation
import AVKit

static func playVideo (view: UIViewController, asset: PHAsset) {
    // Only proceed for video assets
    guard (asset.mediaType == PHAssetMediaType.Video)
    else {
        print("Not a valid video media type")
        return
    }

    PHCachingImageManager().requestAVAssetForVideo(asset, options: nil, resultHandler: { (asset: AVAsset?, audioMix: AVAudioMix?, info: [NSObject : AnyObject]?) in
        let asset = asset as! AVURLAsset
        dispatch_async(dispatch_get_main_queue(), {
            let player = AVPlayer(URL: asset.URL)
            let playerViewController = AVPlayerViewController()
            playerViewController.player = player
            view.presentViewController(playerViewController, animated: true) {
                playerViewController.player!.play()
            }
        })
    })
}
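For reference, a hypothetical call site could look like this (Swift 2 syntax to match the method above; VideoHelper is an assumed name for whatever type hosts the static method, and the call is made from a view controller):

// Hypothetical call site: fetch the first video in the library and hand it to playVideo.
let videos = PHAsset.fetchAssetsWithMediaType(.Video, options: nil)
if let firstVideo = videos.firstObject as? PHAsset {
    VideoHelper.playVideo(self, asset: firstVideo)
}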
Answer 2 (score: 0)
You can try this trick, but it is only really convenient when you have 3, 4, or 5 PHAssets to convert to AVAsset:
[[PHImageManager defaultManager] requestAVAssetForVideo:assetsArray[0] options:option resultHandler:^(AVAsset * avasset, AVAudioMix * audioMix, NSDictionary * info) {
    // do something with this asset
    [[PHImageManager defaultManager] requestAVAssetForVideo:assetsArray[1] options:option resultHandler:^(AVAsset * avasset, AVAudioMix * audioMix, NSDictionary * info) {
        // and so on...
    }];
}];
So basically, once one PHAsset has been converted to an AVAsset, you call the method again from inside the result handler. I know this may not be efficient code, but for a handful of assets it shouldn't be prohibitive.
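As a rough sketch only, the same nesting idea can be expressed in Swift as a small recursive helper that converts the PHAssets one after another; the function name convertAssets and its signature are assumptions, not part of the original answer:

import Photos
import AVFoundation

// Convert an array of PHAssets one after another by requesting the next asset
// from inside the previous result handler, mirroring the nested calls above.
func convertAssets(_ phAssets: [PHAsset],
                   into avAssets: [AVAsset] = [],
                   completion: @escaping ([AVAsset]) -> Void) {
    // Nothing left to convert: hand back everything collected so far
    guard let next = phAssets.first else {
        completion(avAssets)
        return
    }
    PHImageManager.default().requestAVAsset(forVideo: next, options: nil) { avAsset, _, _ in
        var collected = avAssets
        if let avAsset = avAsset {
            collected.append(avAsset)
        }
        // Recurse with the remaining PHAssets
        convertAssets(Array(phAssets.dropFirst()), into: collected, completion: completion)
    }
}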
Answer 3 (score: 0)
Below is a Swift 4 implementation that relies on a semaphore to issue the request synchronously. The code is commented to explain each step.
import Photos
import AVFoundation

func requestAVAsset(asset: PHAsset) -> AVAsset? {
    // We only want videos here
    guard asset.mediaType == .video else { return nil }
    // Create a semaphore with no resources; wait() below blocks until the handler signals
    let semaphore = DispatchSemaphore(value: 0)
    let imageManager = PHImageManager()
    var avAsset: AVAsset?
    // Now go fetch the AVAsset for the given PHAsset; the result handler runs asynchronously
    imageManager.requestAVAsset(forVideo: asset, options: nil) { (asset, _, _) in
        // Save the asset to the earlier placeholder
        avAsset = asset
        // We're done, let the semaphore know it can unblock now
        semaphore.signal()
    }
    // Block the calling thread until the handler has delivered the asset
    semaphore.wait()
    return avAsset
}
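Because wait() blocks the calling thread, a safer way to use this helper (a usage sketch only; somePHAsset stands in for a PHAsset you already have) is to call it from a background queue and hop back to the main queue for any UI work:

// Usage sketch: keep the blocking call off the main thread
DispatchQueue.global(qos: .userInitiated).async {
    guard let avAsset = requestAVAsset(asset: somePHAsset) else { return }
    DispatchQueue.main.async {
        // hand the AVAsset to a player, update UI, etc.
        print(avAsset)
    }
}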
Answer 4 (score: 0)
You can use this:
if let phAsset = info[UIImagePickerControllerPHAsset] as? PHAsset {
    PHCachingImageManager().requestAVAsset(forVideo: phAsset, options: nil) { (avAsset, _, _) in
        if let avAsset = avAsset {
            print(avAsset)
        }
    }
}
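Tying this back to the original question: once you have the AVAsset, you can wrap it in an AVPlayerItem and play it through AVPlayer with an AVPlayerLayer. A minimal sketch (the play(_:in:) helper and its parameters are illustrative, not from the original answer):

import UIKit
import AVFoundation

// Minimal sketch: play an AVAsset obtained from a PHAsset using AVPlayer + AVPlayerLayer
func play(_ avAsset: AVAsset, in view: UIView) {
    let playerItem = AVPlayerItem(asset: avAsset)
    let player = AVPlayer(playerItem: playerItem)

    let playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = view.bounds
    view.layer.addSublayer(playerLayer)

    player.play()
}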