Adding an additional audio track to an MP4 file

Date: 2015-12-30 14:39:10

Tags: android audio video mp4

I am trying to add an extra audio track to an existing MP4 file on Android, but I cannot figure out how to easily add a second track to the file. I have tried MP4Parser, but it overwrites the MP4 file's existing audio. I have also tried FFMPEG, but re-encoding the audio track takes far too long.
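
For reference, here is a minimal sketch of the kind of track-adding I am after, re-muxing without re-encoding. It assumes the mp4parser/isoparser 1.x authoring API, that the extra audio is itself in an MP4/M4A container MovieCreator can parse, and the file paths and the AddSecondAudioTrack class name are just placeholders:

    import java.io.FileOutputStream;
    import java.nio.channels.FileChannel;

    import com.coremedia.iso.boxes.Container;
    import com.googlecode.mp4parser.authoring.Movie;
    import com.googlecode.mp4parser.authoring.Track;
    import com.googlecode.mp4parser.authoring.builder.DefaultMp4Builder;
    import com.googlecode.mp4parser.authoring.container.mp4.MovieCreator;

    public class AddSecondAudioTrack {
        public static void addAudio(String videoPath, String audioPath, String outPath) throws Exception {
            // Parse the existing MP4 (video plus its original audio track).
            Movie movie = MovieCreator.build(videoPath);

            // Parse the file that holds the extra audio and take its first track.
            Movie extra = MovieCreator.build(audioPath);
            Track extraAudio = extra.getTracks().get(0);

            // addTrack() appends a further track; the existing tracks are kept as-is.
            movie.addTrack(extraAudio);

            // Re-mux the result into a new file; no re-encoding involved.
            Container out = new DefaultMp4Builder().build(movie);
            FileChannel fc = new FileOutputStream(outPath).getChannel();
            out.writeContainer(fc);
            fc.close();
        }
    }

Whether a player actually exposes or mixes the second audio track is a separate question, but this is the behaviour I expected from MP4Parser instead of having the original audio replaced.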

I have included my iOS implementation below. Is there a way to achieve the same thing on Android?

    NSURL* audio_inputFileUrl = [NSURL fileURLWithPath:audioFileName];
    NSURL* video_inputFileUrl = [NSURL fileURLWithPath:inputVideo];

    self.outputFilePath = [[NSString alloc] initWithFormat:@"%@%@%@%@", NSTemporaryDirectory(), @"Dankie", [NSDate date], @".mp4"];
    NSURL* outputFileUrl = [NSURL fileURLWithPath:self.outputFilePath];

    // Create composition
    AVMutableComposition* mixComposition = [AVMutableComposition composition];

    // Create Asset for inputVideo
    CMTime nextClipStartTime = kCMTimeZero;
    AVURLAsset* videoAsset = [[AVURLAsset alloc]initWithURL:video_inputFileUrl options:nil];

    // Add VideoTrack of inputVideo to composition
    NSArray*       videoAssetTracks2 = [videoAsset tracksWithMediaType:AVMediaTypeVideo];
    AVAssetTrack*  videoAssetTrack2 = ([videoAssetTracks2 count] > 0 ? [videoAssetTracks2 objectAtIndex:0] : nil);
    CMTimeRange    video_timeRange = CMTimeRangeMake(kCMTimeZero,videoAsset.duration);

    AVMutableCompositionTrack* a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

    CGAffineTransform rotationTransform = CGAffineTransformMakeRotation(M_PI_2);
    a_compositionVideoTrack.preferredTransform = rotationTransform;

    [a_compositionVideoTrack insertTimeRange:video_timeRange ofTrack:videoAssetTrack2 atTime:nextClipStartTime error:nil];

    // Add AudioTrack of inputVideo to composition
    NSArray*        audioAssetTracks2 = [videoAsset tracksWithMediaType:AVMediaTypeAudio];
    AVAssetTrack*   audioAssetTrack2 = ([audioAssetTracks2 count] > 0 ? [audioAssetTracks2 objectAtIndex:0] : nil);
    AVMutableCompositionTrack* a_compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [a_compositionAudioTrack insertTimeRange:video_timeRange ofTrack:audioAssetTrack2 atTime:nextClipStartTime error:nil];

    // Create Asset for audio (song)
    AVURLAsset* audioAsset = [[AVURLAsset alloc]initWithURL:audio_inputFileUrl options:nil];

    // Add Audio of song to composition
    NSArray* audioAssetTracks = [audioAsset tracksWithMediaType:AVMediaTypeAudio];
    AVAssetTrack* audioAssetTrack = ([audioAssetTracks count] > 0 ? [audioAssetTracks objectAtIndex:0] : nil);

    AVMutableCompositionTrack* b_compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [b_compositionAudioTrack insertTimeRange:video_timeRange ofTrack:audioAssetTrack atTime:nextClipStartTime error:nil];



    // Export composition to videoFile
    AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
    _assetExport.outputFileType = AVFileTypeQuickTimeMovie;
    _assetExport.outputURL = outputFileUrl;
    [_assetExport exportAsynchronouslyWithCompletionHandler:
     ^(void ) {
         if (AVAssetExportSessionStatusCompleted == _assetExport.status) {
             [self performSelectorOnMainThread:@selector(videoIsDone) withObject:nil waitUntilDone:YES];
         }
     }
     ];

0 Answers:

No answers yet