Playing stacked videos

Date: 2015-11-29 01:24:22

Tags: ios objective-c video avfoundation

I have multiple UIImageView subviews stacked on top of each other based on my incoming data. Each of these subviews is set up as either an image or a video layer, depending on the input. The problem I'm having is with playing the videos: I can play the first video in the stack, but every video after that only plays the audio of that first video. How can I get each one to play correctly?

The views are navigated with tap events, like Snapchat. See below:

@interface SceneImageViewController ()

@property (strong, nonatomic) NSURL *videoUrl;
@property (strong, nonatomic) AVPlayer *avPlayer;
@property (strong, nonatomic) AVPlayerLayer *avPlayerLayer;

@end

@implementation SceneImageViewController

- (void)viewDidLoad {

[super viewDidLoad];

self.mySubviews = [[NSMutableArray alloc] init];
self.videoCounterTags = [[NSMutableArray alloc] init];

int c = (int)[self.scenes count];
c--;
NSLog(@"int c = %d", c);
self.myCounter = [NSNumber numberWithInt:c];


for (int i=0; i<=c; i++) {

    //create imageView
    UIImageView *imageView =[[UIImageView alloc] initWithFrame:CGRectMake(0, 0, self.view.bounds.size.width, self.view.bounds.size.height)];
    [imageView setUserInteractionEnabled:YES]; // <--- This is very important
    imageView.tag = i;                        // <--- Add tag to track this subview in the view stack
    [self.view addSubview:imageView];
    NSLog(@"added image view %d", i);


    //get scene object
    PFObject *sceneObject = self.scenes[i];


    //get the PFFile and filetype
    PFFile *file = [sceneObject objectForKey:@"file"];
    NSString *fileType = [sceneObject objectForKey:@"fileType"];



    //check the filetype
    if ([fileType  isEqual: @"image"])
    {
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
        //get image
        NSURL *imageFileUrl = [[NSURL alloc] initWithString:file.url];
        NSData *imageData = [NSData dataWithContentsOfURL:imageFileUrl];
            dispatch_async(dispatch_get_main_queue(), ^{
        imageView.image = [UIImage imageWithData:imageData];
            });
        });

    }

    //its a video
    else
    {
        // the video player
        NSURL *fileUrl = [NSURL URLWithString:file.url];

        self.avPlayer = [AVPlayer playerWithURL:fileUrl];
        self.avPlayer.actionAtItemEnd = AVPlayerActionAtItemEndNone;

        self.avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:self.avPlayer];
        //self.avPlayerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(playerItemDidReachEnd:)
                                                     name:AVPlayerItemDidPlayToEndTimeNotification
                                                   object:[self.avPlayer currentItem]];

        CGRect screenRect = [[UIScreen mainScreen] bounds];

        self.avPlayerLayer.frame = CGRectMake(0, 0, screenRect.size.width, screenRect.size.height);
        [imageView.layer addSublayer:self.avPlayerLayer];

        NSNumber *tag = [NSNumber numberWithInt:i+1];

        NSLog(@"tag = %@", tag);

        [self.videoCounterTags addObject:tag];

        //[self.avPlayer play];
    }



}



UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(viewTapped:)];

[self.view bringSubviewToFront:self.screen];

[self.screen addGestureRecognizer:tapGesture];


}


 - (void)viewTapped:(UIGestureRecognizer *)gesture{

NSLog(@"touch!");

[self.avPlayer pause];

int i = [self.myCounter intValue];
NSLog(@"counter = %d", i);



for(UIImageView *subview in [self.view subviews]) {

    if(subview.tag== i) {

        [subview removeFromSuperview];
    }
}

if ([self.videoCounterTags containsObject:self.myCounter]) {
    NSLog(@"play video!!!");
    [self.avPlayer play];
}

if (i == 0) {
    [self.avPlayer pause];
    [self.navigationController popViewControllerAnimated:NO];
}


i--;
self.myCounter = [NSNumber numberWithInt:i];


NSLog(@"counter after = %d", i);





}

2 answers:

Answer 0 (score: 1)

Take a look at the way you set the myCounter variable. It is set once, never changes until a view is tapped, and is set to the scene count minus one.

Also, look at the way you set the _avPlayer pointer. It gets reassigned over and over inside the for loop; it seems like you want to store a reference for each scene, rather than repeatedly updating the same pointer to whatever the latest value in the scenes collection is.

Also, from Apple's documentation:

You can create any number of player layers with the same AVPlayer object. Only the most recently created player layer will actually display the video content on screen.

So it's likely that, because you are using the same AVPlayer object to create all of these player layers, you will never see more than one of the video layers actually display video.
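
For illustration, here is a minimal sketch of that idea: keep one AVPlayer (with its own layer) per video scene instead of reusing a single avPlayer property. The self.videoPlayers array and the tag-to-index lookup are hypothetical additions, not part of the question's code:

    // hypothetical property: @property (strong, nonatomic) NSMutableArray *videoPlayers;
    self.videoPlayers = [NSMutableArray array];

    // ... inside the for loop, in the "it's a video" branch:
    NSURL *fileUrl = [NSURL URLWithString:file.url];
    AVPlayer *player = [AVPlayer playerWithURL:fileUrl];
    player.actionAtItemEnd = AVPlayerActionAtItemEndNone;
    [self.videoPlayers addObject:player];          // keep a reference per scene

    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
    playerLayer.frame = imageView.bounds;
    [imageView.layer addSublayer:playerLayer];     // each layer gets its own player

    // later, when this scene's imageView becomes the top view:
    // [self.videoPlayers[indexOfThatScene] play];  // index lookup is up to you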

Answer 1 (score: 1)

What Brooks Hanes said is correct: you keep overwriting the avPlayer. Here is what I suggest you do:

  1. Add the tap gesture to the imageView instead of the screen (or use a UIButton for a cleaner approach):

    UIImageView *imageView =[[UIImageView alloc] initWithFrame:CGRectMake(0, 0, self.view.bounds.size.width, self.view.bounds.size.height)];
    [imageView setUserInteractionEnabled:YES]; // <--- This is very important
    imageView.tag = i;                        // <--- Add tag to track this subview in the view stack
    [self.view addSubview:imageView];
    NSLog(@"added image view %d", i);
    UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(viewTapped:)];
    [imageView addGestureRecognizer:tapGesture];
    
  2. This way, in your viewTapped: method you can get the tag of the tapped image from gesture.view.tag instead of using myCounter (a revised viewTapped: sketch appears after the snippets below).

    1. To get the videos to work, you could create a new AVPlayer for each video, but that can get very expensive memory-wise. A better approach when changing videos is to use AVPlayerItem and swap the AVPlayer's AVPlayerItem.
    2. So in the for loop, do something like the following, where self.videoFiles is an NSMutableDictionary property:

          // the video player
          NSNumber *tag = [NSNumber numberWithInt:i+1];
          NSURL *fileUrl = [NSURL URLWithString:file.url];

          // save the video file url paired with the tag of the imageView it belongs to
          [self.videoFiles setObject:fileUrl forKey:tag];

          // you only need to initialize the player once
          if (self.avPlayer == nil) {
              AVAsset *asset = [AVAsset assetWithURL:fileUrl];
              AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset:asset];
              self.avPlayer = [[AVPlayer alloc] initWithPlayerItem:item];
              self.avPlayer.actionAtItemEnd = AVPlayerActionAtItemEndNone;
              [[NSNotificationCenter defaultCenter] addObserver:self
                                                       selector:@selector(playerItemDidReachEnd:)
                                                           name:AVPlayerItemDidPlayToEndTimeNotification
                                                         object:[self.avPlayer currentItem]];
          }

          // you don't need to keep the layer as a property
          // (unless you need it for some other reason)
          AVPlayerLayer *avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:self.avPlayer];
          avPlayerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

          CGRect screenRect = [[UIScreen mainScreen] bounds];
          avPlayerLayer.frame = CGRectMake(0, 0, screenRect.size.width, screenRect.size.height);
          [imageView.layer addSublayer:avPlayerLayer];
          NSLog(@"tag = %@", tag);

          [self.videoCounterTags addObject:tag];

      Now in viewTapped:

       if ([self.videoCounterTags containsObject:@(gesture.view.tag)]) {
           NSLog(@"play video!!!");
           AVAsset *asset = [AVAsset assetWithURL:[self.videoFiles objectForKey:@(gesture.view.tag)]];
           AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset:asset];
           [self.avPlayer replaceCurrentItemWithPlayerItem:item];
           [self.avPlayer play];
       }
      

      Or, since you have self.videoFiles, you don't need self.videoCounterTags at all:

       NSURL *fileURL = [self.videoFiles objectForKey:@(gesture.view.tag)];
       if (fileURL != nil) {
           NSLog(@"play video!!!");
           AVAsset *asset = [AVAsset assetWithURL:fileURL];
           AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset:asset];
           [self.avPlayer replaceCurrentItemWithPlayerItem:item];
           [self.avPlayer play];
       }
      

      That's the gist of it.