How to add audio to a video file with the iPhone SDK

Posted: 2010-08-09 13:05:21

Tags: iphone audio video-encoding

I have a video file and an audio file. Is it possible to merge them into a single video with sound? I think AVMutableComposition should help me here, but I still don't understand how. Any suggestions?

3 Answers:

Answer 0 (score: 19)

Thanks, Daniel. I figured it out, and it turned out to be easy:

AVURLAsset* audioAsset = [[AVURLAsset alloc]initWithURL:audioUrl options:nil];
AVURLAsset* videoAsset = [[AVURLAsset alloc]initWithURL:videoUrl options:nil];

AVMutableComposition* mixComposition = [AVMutableComposition composition];

AVMutableCompositionTrack *compositionCommentaryTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio 
                                                                                    preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionCommentaryTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration) 
                                    ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] 
                                     atTime:kCMTimeZero error:nil];

AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo 
                                                                                    preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) 
                               ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] 
                                atTime:kCMTimeZero error:nil];

AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition 
                                                                      presetName:AVAssetExportPresetPassthrough];   

NSString* videoName = @"export.mov";

NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:videoName];
NSURL    *exportUrl = [NSURL fileURLWithPath:exportPath];

if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath]) 
{
    [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];
}

_assetExport.outputFileType = AVFileTypeQuickTimeMovie; // "com.apple.quicktime-movie"
NSLog(@"file type %@", _assetExport.outputFileType);
_assetExport.outputURL = exportUrl;
_assetExport.shouldOptimizeForNetworkUse = YES;

[_assetExport exportAsynchronouslyWithCompletionHandler:^{
    // your completion code here
}];
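A note on the preset used above: AVAssetExportPresetPassthrough copies the source samples without re-encoding, which keeps this merge fast and lossless, but it is not guaranteed to be compatible with every composition. A minimal defensive sketch (Swift 2-era API to match the Swift answer below; `mixComposition` stands in for the composition built above):

```swift
// Passthrough avoids re-encoding, but verify the preset is actually
// compatible with this composition before committing to it.
let presets = AVAssetExportSession.exportPresetsCompatibleWithAsset(mixComposition)
let preset = presets.contains(AVAssetExportPresetPassthrough)
    ? AVAssetExportPresetPassthrough
    : AVAssetExportPresetHighestQuality
let session = AVAssetExportSession(asset: mixComposition, presetName: preset)
```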

Answer 1 (score: 6)

Yes, it is possible. Here is a code snippet for adding audio to an existing composition. I grabbed it from Apple's sample code; you should look at the whole project, as you'll find it very useful. The project is AVEditDemo, and you can find it in the WWDC 2010 materials Apple published at developer.apple.com/videos/wwdc/2010. Hope that helps.

- (void)addCommentaryTrackToComposition:(AVMutableComposition *)composition withAudioMix:(AVMutableAudioMix *)audioMix
{
    NSInteger i;
    NSArray *tracksToDuck = [composition tracksWithMediaType:AVMediaTypeAudio]; // before we add the commentary

    // Clip commentary duration to composition duration.
    CMTimeRange commentaryTimeRange = CMTimeRangeMake(self.commentaryStartTime, self.commentary.duration);
    if (CMTIME_COMPARE_INLINE(CMTimeRangeGetEnd(commentaryTimeRange), >, [composition duration]))
        commentaryTimeRange.duration = CMTimeSubtract([composition duration], commentaryTimeRange.start);

    // Add the commentary track.
    AVMutableCompositionTrack *compositionCommentaryTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionCommentaryTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, commentaryTimeRange.duration) ofTrack:[[self.commentary tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:commentaryTimeRange.start error:nil];

    // Duck the other audio tracks while the commentary plays.
    NSMutableArray *trackMixArray = [NSMutableArray array];
    CMTime rampDuration = CMTimeMake(1, 2); // half-second ramps
    for (i = 0; i < [tracksToDuck count]; i++) {
        AVMutableAudioMixInputParameters *trackMix = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:[tracksToDuck objectAtIndex:i]];
        [trackMix setVolumeRampFromStartVolume:1.0 toEndVolume:0.2 timeRange:CMTimeRangeMake(CMTimeSubtract(commentaryTimeRange.start, rampDuration), rampDuration)];
        [trackMix setVolumeRampFromStartVolume:0.2 toEndVolume:1.0 timeRange:CMTimeRangeMake(CMTimeRangeGetEnd(commentaryTimeRange), rampDuration)];
        [trackMixArray addObject:trackMix];
    }
    audioMix.inputParameters = trackMixArray;
}
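One detail the snippet leaves implicit: the AVMutableAudioMix only takes effect if it is attached to whatever plays or exports the composition, and a passthrough preset cannot apply volume ramps because it does not re-encode the audio. A hedged Swift sketch of wiring the mix into an export (Swift 2-era API; `composition`, `audioMix`, and `outputURL` are assumed to be supplied by the caller):

```swift
// Building the audio mix alone changes nothing; it must be handed to
// the export session (or an AVPlayerItem, for playback). Use a
// re-encoding preset, since passthrough cannot render volume ramps.
let exporter = AVAssetExportSession(asset: composition,
                                    presetName: AVAssetExportPresetHighestQuality)
exporter?.audioMix = audioMix
exporter?.outputURL = outputURL
exporter?.outputFileType = AVFileTypeQuickTimeMovie
exporter?.exportAsynchronouslyWithCompletionHandler {
    // inspect exporter?.status and exporter?.error here
}
```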

Answer 2 (score: 0)

Here is the Swift version:

func mixAudio(audioURL audioURL: NSURL, videoURL: NSURL) {
    let audioAsset = AVURLAsset(URL: audioURL)
    let videoAsset = AVURLAsset(URL: videoURL)

    let mixComposition = AVMutableComposition()

    // add audio
    let compositionCommentaryTrack = mixComposition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)
    let timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration)
    let track = audioAsset.tracksWithMediaType(AVMediaTypeAudio)[0]
    do {
        try compositionCommentaryTrack.insertTimeRange(timeRange, ofTrack: track, atTime: kCMTimeZero)
    }
    catch {
        print("Error insertTimeRange for audio track \(error)")
    }

    // add video
    let compositionVideoTrack = mixComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
    let timeRangeVideo = CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
    let trackVideo = videoAsset.tracksWithMediaType(AVMediaTypeVideo)[0]
    do {
        try compositionVideoTrack.insertTimeRange(timeRangeVideo, ofTrack: trackVideo, atTime: kCMTimeZero)
    }
    catch {
        print("Error insertTimeRange for video track \(error)")
    }

    // export
    let assetExportSession = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetPassthrough)
    let videoName = "export.mov"
    let exportPath = (NSTemporaryDirectory() as NSString).stringByAppendingPathComponent(videoName)
    let exportURL = NSURL(fileURLWithPath: exportPath)

    if NSFileManager.defaultManager().fileExistsAtPath(exportPath) {
        do {
            try NSFileManager.defaultManager().removeItemAtPath(exportPath)
        }
        catch {
            print("Error deleting export.mov: \(error)")
        }
    }

    assetExportSession?.outputFileType = AVFileTypeQuickTimeMovie // "com.apple.quicktime-movie"
    assetExportSession?.outputURL = exportURL
    assetExportSession?.shouldOptimizeForNetworkUse = true
    assetExportSession?.exportAsynchronouslyWithCompletionHandler({
        print("Mixed audio and video!")
        dispatch_async(dispatch_get_main_queue(), {
            print(exportPath)
        })
    })
}
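The completion handler above logs success unconditionally; in practice you should branch on the session's status, since the handler also fires on failure and cancellation. A sketch of a more defensive handler using the same Swift 2-era API (`assetExportSession` and `exportURL` as in the function above; treating this as a drop-in replacement is an assumption):

```swift
assetExportSession?.exportAsynchronouslyWithCompletionHandler({
    dispatch_async(dispatch_get_main_queue(), {
        // The handler fires for every terminal state, not just success.
        switch assetExportSession!.status {
        case .Completed:
            print("Mixed audio and video at \(exportURL)")
        case .Failed, .Cancelled:
            print("Export failed: \(assetExportSession!.error)")
        default:
            break
        }
    })
})
```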