Playing a video in iOS shows strange green lines around the video

Posted: 2015-04-08 03:53:34

Tags: ios iphone video camera

Hey everyone, I'm cropping a video taken with the camera on an iPhone and then playing it back, like this. When I do, I get a strange green line along the bottom and the right side of the video. I'm not sure why this is happening or how to fix it. This is how I crop:

- (UIImageOrientation)getVideoOrientationFromAsset:(AVAsset *)asset
{
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    CGSize size = [videoTrack naturalSize];
    CGAffineTransform txf = [videoTrack preferredTransform];

    if (size.width == txf.tx && size.height == txf.ty)
        return UIImageOrientationLeft; //return UIInterfaceOrientationLandscapeLeft;
    else if (txf.tx == 0 && txf.ty == 0)
        return UIImageOrientationRight; //return UIInterfaceOrientationLandscapeRight;
    else if (txf.tx == 0 && txf.ty == size.width)
        return UIImageOrientationDown; //return UIInterfaceOrientationPortraitUpsideDown;
    else
        return UIImageOrientationUp;  //return UIInterfaceOrientationPortrait;
}


- (AVAssetExportSession*)applyCropToVideoWithAsset:(AVAsset*)asset AtRect:(CGRect)cropRect OnTimeRange:(CMTimeRange)cropTimeRange ExportToUrl:(NSURL*)outputUrl ExistingExportSession:(AVAssetExportSession*)exporter WithCompletion:(void(^)(BOOL success, NSError* error, NSURL* videoUrl))completion
{

//    NSLog(@"CALLED");
    //create an avassetrack with our asset
    AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

//create a video composition and preset some settings
AVMutableVideoComposition* videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.frameDuration = CMTimeMake(1, 30);

CGFloat cropOffX = cropRect.origin.x;
CGFloat cropOffY = cropRect.origin.y;
CGFloat cropWidth = cropRect.size.width;
CGFloat cropHeight = cropRect.size.height;
//    NSLog(@"width: %f - height: %f - x: %f - y: %f", cropWidth, cropHeight, cropOffX, cropOffY);

videoComposition.renderSize = CGSizeMake(cropWidth, cropHeight);

//create a video instruction
AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = cropTimeRange;

AVMutableVideoCompositionLayerInstruction* transformer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];

UIImageOrientation videoOrientation = [self getVideoOrientationFromAsset:asset];

CGAffineTransform t1 = CGAffineTransformIdentity;
CGAffineTransform t2 = CGAffineTransformIdentity;

switch (videoOrientation) {
    case UIImageOrientationUp:
        t1 = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.height - cropOffX, 0 - cropOffY );
        t2 = CGAffineTransformRotate(t1, M_PI_2 );
        break;
    case UIImageOrientationDown:
        t1 = CGAffineTransformMakeTranslation(0 - cropOffX, clipVideoTrack.naturalSize.width - cropOffY ); // not fixed width is the real height in upside down
        t2 = CGAffineTransformRotate(t1, - M_PI_2 );
        break;
    case UIImageOrientationRight:
        t1 = CGAffineTransformMakeTranslation(0 - cropOffX, 0 - cropOffY );
        t2 = CGAffineTransformRotate(t1, 0 );
        break;
    case UIImageOrientationLeft:
        t1 = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.width - cropOffX, clipVideoTrack.naturalSize.height - cropOffY );
        t2 = CGAffineTransformRotate(t1, M_PI  );
        break;
    default:
        NSLog(@"no supported orientation has been found in this video");
        break;
}

CGAffineTransform finalTransform = t2;
[transformer setTransform:finalTransform atTime:kCMTimeZero];

//add the transformer layer instructions, then add to video composition
instruction.layerInstructions = [NSArray arrayWithObject:transformer];
videoComposition.instructions = [NSArray arrayWithObject: instruction];

//Remove any prevouis videos at that path
[[NSFileManager defaultManager]  removeItemAtURL:outputUrl error:nil];

if (!exporter){
    exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality] ;
}
// assign all instruction for the video processing (in this case the transformation for cropping the video
exporter.videoComposition = videoComposition;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
if (outputUrl){
    exporter.outputURL = outputUrl;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        switch ([exporter status]) {
            case AVAssetExportSessionStatusFailed:
                NSLog(@"crop Export failed: %@", [[exporter error] localizedDescription]);
                if (completion){
                    dispatch_async(dispatch_get_main_queue(), ^{
                        completion(NO,[exporter error],nil);
                    });
                    return;
                }
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"crop Export canceled");
                if (completion){
                    dispatch_async(dispatch_get_main_queue(), ^{
                        completion(NO,nil,nil);
                    });
                    return;
                }
                break;
            default:
                break;
        }
        if (completion){
            dispatch_async(dispatch_get_main_queue(), ^{
                completion(YES,nil,outputUrl);
            });
        }

    }];
}

return exporter;
}

And then I call that crop method and play the result like this:

AVAsset *assest = [AVAsset assetWithURL:self.videoURL];


NSString * documentsPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *exportPath = [documentsPath stringByAppendingFormat:@"/croppedvideo.mp4"];
NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];


AVAssetExportSession *exporter = [AVAssetExportSession exportSessionWithAsset:assest presetName:AVAssetExportPresetLowQuality];


[self applyCropToVideoWithAsset:assest AtRect:CGRectMake(self.view.frame.size.width/2 - 57.5 - 5, self.view.frame.size.height / 2 - 140, 115, 85) OnTimeRange:CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(assest.duration.value, 1))
                    ExportToUrl:exportUrl ExistingExportSession:exporter WithCompletion:^(BOOL success, NSError *error, NSURL *videoUrl) {
    AVPlayer *player = [AVPlayer playerWithURL:videoUrl];

    AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:player];
    layer.frame = CGRectMake(125, 365, 115, 115);
    UIView *view = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 400, 400)];
    [view.layer addSublayer:layer];
    [self.view addSubview:view];
    [player play];
}];

If you want to test this, just copy and paste the code and set a video, and you will see what I am talking about.

Thanks for taking the time to help me; I know this is a fair amount of code.

2 Answers:

Answer 0 (score: 9):

The iOS encoder, or the video format itself, has width requirements. Try making the width even, or divisible by 4.

I don't know of a similar requirement for the height, but it's worth a try as well.

I've never found it documented, but the evenness requirement makes some sense, because H.264 uses a 4:2:0 YUV color space in which the U and V components are half the size, in both dimensions, of the Y channel, which has the overall dimensions of the video. If those dimensions aren't even, the UV dimensions won't be whole numbers. For example, the 115 x 85 crop in the question would call for chroma planes of 57.5 x 42.5 pixels.

P.S. The tell-tale sign in these cases is the mysterious green color. I believe it corresponds to 0,0,0 in YUV.
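
Applied to the cropping code in the question, a minimal sketch of that suggestion (assuming it is acceptable to lose up to one pixel on each axis) would be to round the crop size down to even values before it is used as the render size:

    // Round the crop size down to even dimensions so the exported
    // video's width and height are both divisible by 2.
    CGFloat cropWidth  = floor(cropRect.size.width  / 2.0) * 2.0;
    CGFloat cropHeight = floor(cropRect.size.height / 2.0) * 2.0;

    videoComposition.renderSize = CGSizeMake(cropWidth, cropHeight);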

Answer 1 (score: 5):

@Rhythmic's answer saved my day.

In my app I needed a square video sized to the screen width. So for the iPhone 5 that's 320 pixels, while for the iPhone 6 it becomes 375 pixels.

So I ran into the same green-line issue with the iPhone 6 resolution, because its screen width is 375 pixels, which is not divisible by 2 or 4.

To get around this, we made these changes:

    AVMutableVideoComposition *MainCompositionInst = [AVMutableVideoComposition videoComposition];
    MainCompositionInst.instructions = [NSArray arrayWithObject:MainInstruction];
    MainInstruction.timeRange = range;

    MainCompositionInst.frameDuration = VideoFrameDuration; //Constants
    MainCompositionInst.renderScale = VideoRenderScale; //Constants

    if ((int)SCREEN_WIDTH % 2 == 0)
        MainCompositionInst.renderSize = CGSizeMake(SCREEN_WIDTH, SCREEN_WIDTH);
    else // This does the trick
        MainCompositionInst.renderSize = CGSizeMake(SCREEN_WIDTH+1, SCREEN_WIDTH+1); 

Hope this helps. Just add one more pixel to the size so it becomes divisible by 2 or 4.
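
For reference, the same idea can be factored into a small helper that rounds any dimension up to the next even number; the evenRenderDimension name below is only for illustration and is not part of the original answer:

    // Rounds a dimension up to the next even integer so the encoder's
    // 4:2:0 chroma planes end up with whole-pixel dimensions.
    static inline CGFloat evenRenderDimension(CGFloat dimension) {
        NSInteger rounded = (NSInteger)ceil(dimension);
        return (rounded % 2 == 0) ? rounded : rounded + 1;
    }

    // Usage:
    // MainCompositionInst.renderSize = CGSizeMake(evenRenderDimension(SCREEN_WIDTH),
    //                                             evenRenderDimension(SCREEN_WIDTH));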

Thanks.