Most efficient way to use a video as a background in iOS

Date: 2014-10-26 09:59:31

Tags: ios objective-c iphone user-interface video

Perhaps you've noticed one of the latest trends in iOS apps: using a video as the background, mostly on login or "first launch" screens. Yesterday I tried to imitate this with a very simple test project (just one view controller), and I'm happy with the result except for the performance. When trying it in the iOS Simulator (on a simulated iPhone 6), CPU usage fluctuates between 70% and 110%. That seems wildly unreasonable for a simple login screen.

Here is how it looks in action: http://oi57.tinypic.com/nqqntv.jpg

The question is: is there a more CPU-efficient way to achieve this? How do apps like Vine, Spotify, and Instagram do it?

Before you answer: the approach I used is a full-HD video played with MPMoviePlayerController:

- (void)viewDidLoad {
    [super viewDidLoad];

    // find movie file
    NSString *moviePath = [[NSBundle mainBundle] pathForResource:@"arenaVideo" ofType:@"mp4"];
    NSURL *movieURL = [NSURL fileURLWithPath:moviePath];

    // load movie
    self.moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:movieURL];
    self.moviePlayer.controlStyle = MPMovieControlStyleNone;
    self.moviePlayer.view.frame = self.view.frame;
    self.moviePlayer.scalingMode = MPMovieScalingModeAspectFill;
    [self.view addSubview:self.moviePlayer.view];
    [self.view sendSubviewToBack:self.moviePlayer.view];
    [self.moviePlayer play];

    // loop movie
    [[NSNotificationCenter defaultCenter] addObserver: self
                                             selector: @selector(replayMovie:)
                                                 name: MPMoviePlayerPlaybackDidFinishNotification
                                               object: self.moviePlayer];
}

#pragma mark - Helper methods

-(void)replayMovie:(NSNotification *)notification
{
    [self.moviePlayer play];
}

Of course, the edges of the video could be trimmed so the resolution would be more like 700x1080 instead of 1920x1080, but would that make a big difference performance-wise? Or should I compress the video with a specific format and specific settings to get optimal performance? Maybe there is a completely different approach to this?
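For what it's worth, if compression turns out to be part of the answer, one hypothetical way to re-encode the bundled clip down to a 720p MP4 is AVAssetExportSession. This is only a sketch (inputURL/outputURL are placeholders, and in practice you would do this once, or pre-compress the file with an external tool, rather than at every launch):

// Sketch only: re-encode a clip at 720p with AVAssetExportSession.
AVAsset *asset = [AVAsset assetWithURL:inputURL];
AVAssetExportSession *exportSession =
    [[AVAssetExportSession alloc] initWithAsset:asset
                                     presetName:AVAssetExportPreset1280x720];
exportSession.outputURL = outputURL;
exportSession.outputFileType = AVFileTypeMPEG4;
exportSession.shouldOptimizeForNetworkUse = YES;

[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"Re-encoded video written to %@", outputURL);
    } else {
        NSLog(@"Export failed: %@", exportSession.error);
    }
}];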

Actually, I did try using a GIF as described in this article: https://medium.com/swift-programming/ios-make-an-awesome-video-background-view-objective-c-swift-318e1d71d0a2

The problems with it are:

  • Creating a GIF from a video takes a lot of time and effort
  • I saw no significant drop in CPU usage when I tried it
  • Supporting multiple screen sizes with this approach is a real pain (at least when I tried it, with Auto Layout and Size Classes enabled, I couldn't get the GIF to scale properly across devices)
  • The video quality is poor

5 Answers:

Answer 0 (score: 20)

The best way is to use AVFoundation; then you control the video layer yourself.

In your header file, declare @property (nonatomic, strong) AVPlayerLayer *playerLayer;

- (void)viewDidLoad {
    [super viewDidLoad];

    // add the (lazily created) player layer
    [self.view.layer addSublayer:self.playerLayer];

    // loop movie
    [[NSNotificationCenter defaultCenter] addObserver: self
                                             selector: @selector(replayMovie:)
                                                 name: AVPlayerItemDidPlayToEndTimeNotification
                                               object: nil];
}

- (AVPlayerLayer *)playerLayer {
    if (!_playerLayer) {
        // find movie file
        NSString *moviePath = [[NSBundle mainBundle] pathForResource:@"arenaVideo" ofType:@"mp4"];
        NSURL *movieURL = [NSURL fileURLWithPath:moviePath];

        // create the player and its layer, then start playback
        _playerLayer = [AVPlayerLayer playerLayerWithPlayer:[[AVPlayer alloc] initWithURL:movieURL]];
        _playerLayer.frame = CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height);
        [_playerLayer.player play];
    }
    return _playerLayer;
}

- (void)replayMovie:(NSNotification *)notification
{
    // rewind to the start before playing again so the video actually loops
    [self.playerLayer.player seekToTime:kCMTimeZero];
    [self.playerLayer.player play];
}

Swift 2.0

lazy var playerLayer:AVPlayerLayer = {

    let player = AVPlayer(URL:  NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource("LaunchMovie", ofType: "mov")!))
    player.muted = true
    player.allowsExternalPlayback = false
    player.appliesMediaSelectionCriteriaAutomatically = false
    var error:NSError?

    // This is needed so it does not cut off the user's audio (if they're listening to music, etc.)
    do {
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryAmbient)
    } catch let error1 as NSError {
        error = error1
    } catch {
        fatalError()
    }
    if error != nil {
        print(error)
    }

    var playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = self.view.frame
    playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
    playerLayer.backgroundColor = UIColor.blackColor().CGColor
    player.play()
    NSNotificationCenter.defaultCenter().addObserver(self, selector:"playerDidReachEnd", name:AVPlayerItemDidPlayToEndTimeNotification, object:nil)
    return playerLayer
    }()

override func viewDidLoad() {
    super.viewDidLoad()
    self.view.layer.addSublayer(self.playerLayer)
}
override func viewWillDisappear(animated: Bool) {
    NSNotificationCenter.defaultCenter().removeObserver(self)
}
// If orientation changes
override func willAnimateRotationToInterfaceOrientation(toInterfaceOrientation: UIInterfaceOrientation, duration: NSTimeInterval) {
    playerLayer.frame = self.view.frame
}
func playerDidReachEnd(){
    self.playerLayer.player!.seekToTime(kCMTimeZero)
    self.playerLayer.player!.play()

}

Tested on iOS 7 through iOS 9.

Answer 1 (score: 2)

I realize this is an old post, but since I have some experience reducing CPU usage in iOS apps, I'll respond.

The first place to look is the AVFoundation framework.

Implementing AVPlayer should help take some of the load off the CPU.

But the best solution is Brad Larson's GPUImage library, which leverages OpenGL and greatly reduces CPU usage. Download the library; it comes with examples of how to use it. I recommend using GPUImageMovieWriter.
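As a rough sketch (GPUImageMovieWriter is the library's recording class, so for simply drawing a looping background the relevant pairing is GPUImageMovie feeding a GPUImageView), something along these lines should work, assuming GPUImage is integrated so its umbrella header is importable and that self.movie is a strong property you add to keep the GPUImageMovie alive:

#import "GPUImage.h"

// Sketch only: decode the clip with GPUImageMovie and render it on the GPU
// into a GPUImageView placed behind the rest of the UI.
NSURL *movieURL = [[NSBundle mainBundle] URLForResource:@"arenaVideo" withExtension:@"mp4"];

GPUImageView *backgroundView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
backgroundView.fillMode = kGPUImageFillModePreserveAspectRatioAndFill;
[self.view insertSubview:backgroundView atIndex:0];

// self.movie is an assumed strong property; a local variable would be
// deallocated by ARC and playback would stop.
self.movie = [[GPUImageMovie alloc] initWithURL:movieURL];
self.movie.playAtActualSpeed = YES;
self.movie.shouldRepeat = YES;                 // loops without any notification handling
[self.movie addTarget:backgroundView];         // frames go straight to the view
[self.movie startProcessing];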

Answer 2 (score: 1)

I found this code on GitHub, which works for me on iOS 8/9:

- (void)viewDidLoad {
    [super viewDidLoad];

    // Load the video from the app bundle.
    NSURL *videoURL = [[NSBundle mainBundle] URLForResource:@"video" withExtension:@"mov"];

    // Create and configure the movie player.
    self.moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:videoURL];

    self.moviePlayer.controlStyle = MPMovieControlStyleNone;
    self.moviePlayer.scalingMode = MPMovieScalingModeAspectFill;

    self.moviePlayer.view.frame = self.view.frame;
    [self.view insertSubview:self.moviePlayer.view atIndex:0];

    [self.moviePlayer play];

    // Loop video.
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(loopVideo) name:MPMoviePlayerPlaybackDidFinishNotification object:self.moviePlayer];
}

- (void)loopVideo {
    [self.moviePlayer play];
}

Answer 3 (score: 1)

I use AVAssetReader, GLKView, and render through a CIImage pipeline. When playing an unfiltered video on the simulator it eats about 80% CPU. On an actual device it costs 1x% CPU with real-time filtering (CIFilter). It can also be set to loop and to control the FPS. I've put it on GitHub and anyone is welcome to grab a copy. It's a good alternative for people who don't want to pull in all of GPUImage just for a video background view. Drag and drop the view and it works. https://github.com/matthewlui/FSVideoView
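To give a rough idea of the pipeline (the repository has the real implementation; this is only a simplified sketch, with looping, the CADisplayLink timing, and error handling left out, and movieURL as a placeholder):

// Sketch of the AVAssetReader -> CIImage -> GLKView pipeline (not the FSVideoView source).
EAGLContext *eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
GLKView *glView = [[GLKView alloc] initWithFrame:self.view.bounds context:eaglContext];
[self.view insertSubview:glView atIndex:0];

// A CIContext backed by the same GL context keeps the frames on the GPU.
CIContext *ciContext = [CIContext contextWithEAGLContext:eaglContext];

AVAsset *asset = [AVAsset assetWithURL:movieURL];
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
NSDictionary *settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
AVAssetReaderTrackOutput *output =
    [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:settings];
AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:nil];
[reader addOutput:output];
[reader startReading];

// Called from a CADisplayLink (or timer) at the desired frame rate:
CMSampleBufferRef sample = [output copyNextSampleBuffer];
if (sample) {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sample);
    CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    // A CIFilter could be applied to `frame` here before drawing.
    [glView bindDrawable];
    [ciContext drawImage:frame
                  inRect:CGRectMake(0, 0, glView.drawableWidth, glView.drawableHeight)
                fromRect:frame.extent];
    [glView display];
    CFRelease(sample);
}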

Answer 4 (score: 0)

For iOS 9, I used Andrius' code and added the following for the loop:

-(void)replayBG:(NSNotification *)n {
    [playerLayer.player seekToTime:kCMTimeZero];
    [playerLayer.player play];
}