I'm trying to implement some simple audio recording functionality in my app, and I can't figure out how to do it. I was pointed to this example, but I can't get it to run in Xcode, and it appears to be written in C++.
What I need to do is record audio to a file and then be able to get the current timestamp of the recording while it is being recorded. I'd appreciate any help. Thanks!
Answer 0 (score: 5)
You can use the AVFoundation framework to record and play back audio. First, add the framework to your project in the Xcode project settings. Once it has been added, import AVFoundation in your .h file like this:
#import <AVFoundation/AVFoundation.h>
Now adopt the delegate protocols in your .h file:
@interface ViewController : UIViewController <AVAudioRecorderDelegate, AVAudioPlayerDelegate>
After that, declare the AVAudioRecorder and AVAudioPlayer in your .h file, like this:
@interface ViewController () {
    AVAudioRecorder *recorder;
    AVAudioPlayer *player;
    IBOutlet UIButton *recordPauseButton; // referenced below to toggle the Record/Pause title
    IBOutlet UIButton *stopButton;
    IBOutlet UIButton *playButton;
}

- (IBAction)recordPauseTapped:(id)sender;
- (IBAction)stopTapped:(id)sender;
- (IBAction)playTapped:(id)sender;

@end
Now, in -(void)viewDidLoad:
- (void)viewDidLoad
{
    [super viewDidLoad];

    // Disable the Stop/Play buttons when the application launches
    [stopButton setEnabled:NO];
    [playButton setEnabled:NO];

    // Set the audio file
    NSArray *pathComponents = [NSArray arrayWithObjects:
                               [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject],
                               @"MyAudioMemo.m4a",
                               nil];
    NSURL *outputFileURL = [NSURL fileURLWithPathComponents:pathComponents];

    // Set up the audio session
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];

    // Define the recorder settings
    NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init];
    [recordSetting setValue:[NSNumber numberWithInt:kAudioFormatMPEG4AAC] forKey:AVFormatIDKey];
    [recordSetting setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
    [recordSetting setValue:[NSNumber numberWithInt:2] forKey:AVNumberOfChannelsKey];

    // Initiate and prepare the recorder
    recorder = [[AVAudioRecorder alloc] initWithURL:outputFileURL settings:recordSetting error:NULL];
    recorder.delegate = self;
    recorder.meteringEnabled = YES;
    [recorder prepareToRecord];
}
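Note that the initializer above is passed error:NULL. As a small variant (not part of the original snippet), you can pass an NSError to see why recorder setup failed:

NSError *error = nil;
recorder = [[AVAudioRecorder alloc] initWithURL:outputFileURL settings:recordSetting error:&error];
if (error) {
    // Setup failed (e.g. unsupported settings); log the reason
    NSLog(@"Recorder setup failed: %@", [error localizedDescription]);
} else {
    recorder.delegate = self;
    recorder.meteringEnabled = YES;
    [recorder prepareToRecord];
}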
Now implement the record button like this...
- (IBAction)recordPauseTapped:(id)sender {
    // Stop the audio player before recording
    if (player.playing) {
        [player stop];
    }

    if (!recorder.recording) {
        AVAudioSession *session = [AVAudioSession sharedInstance];
        [session setActive:YES error:nil];

        // Start recording
        [recorder record];
        [recordPauseButton setTitle:@"Pause" forState:UIControlStateNormal];
    } else {
        // Pause recording
        [recorder pause];
        [recordPauseButton setTitle:@"Record" forState:UIControlStateNormal];
    }

    [stopButton setEnabled:YES];
    [playButton setEnabled:NO];
}
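The question also asks for the current timestamp while recording. AVAudioRecorder exposes a currentTime property (the number of seconds since recording started, valid while recording). A minimal sketch, assuming an illustrative updateTimestamp: helper driven by an NSTimer (neither is part of the tutorial code above):

// Illustrative helper: poll the elapsed recording time, e.g. once per second
- (void)updateTimestamp:(NSTimer *)timer {
    if (recorder.recording) {
        NSTimeInterval elapsed = recorder.currentTime; // seconds since recording began
        NSLog(@"Recorded %.1f seconds so far", elapsed);
    }
}

You could start such a timer right after calling [recorder record], for example with [NSTimer scheduledTimerWithTimeInterval:1.0 target:self selector:@selector(updateTimestamp:) userInfo:nil repeats:YES], and invalidate it when recording stops.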
Now implement the stop button IBAction:
- (IBAction)stopTapped:(id)sender {
    [recorder stop];

    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [audioSession setActive:NO error:nil];
}
Next, implement the playTapped IBAction like this:
- (IBAction)playTapped:(id)sender {
    if (!recorder.recording) {
        player = [[AVAudioPlayer alloc] initWithContentsOfURL:recorder.url error:nil];
        [player setDelegate:self];
        [player play];
    }
}
Finally, implement the AVAudioPlayerDelegate callback so you are notified when playback finishes; a minimal version could look like this:
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)audioPlayer successfully:(BOOL)flag {
    // Playback finished; update the buttons accordingly
    [playButton setEnabled:YES];
    [stopButton setEnabled:NO];
}
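The class also adopts AVAudioRecorderDelegate, so you can optionally implement its finish callback as well; a minimal sketch (the button updates here are just one possible choice) to re-enable playback once recording stops:

- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)avrecorder successfully:(BOOL)flag {
    // Recording has stopped; allow playback again
    [playButton setEnabled:YES];
    [stopButton setEnabled:NO];
}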
That's it!
For more details, check the documentation. Hope this helps.
Answer 1 (score: 1)
The AVFoundation framework mentioned above is iOS-only. Dealing with audio on OS X is quite painful at first. Core Audio, although one of the best components I have worked with, does take some time to learn and understand. You may want to consider using https://github.com/syedhali/EZAudio to get the job done.