Streaming with AudioRecord and recording with MediaRecorder does not allow sharing the microphone resource

Time: 2018-01-29 10:02:39

Tags: android mediarecorder audiotrack android-audiorecord

I am trying to record an audio file on Android with MediaRecorder while at the same time streaming the microphone input to an AudioTrack via AudioRecord, but it seems Android does not allow the two classes to use the microphone resource at the same time.

Here is an example of my MediaRecorder function:

private void recordAudio() {
    if (mMediaRecorder != null){
        mMediaRecorder.release();
        mMediaRecorder = null;
    }
    mMediaRecorder = new MediaRecorder();
    mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
    mMediaRecorder.setOutputFile(path);
    mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
    mMediaRecorder.setAudioEncodingBitRate(16);
    mMediaRecorder.setAudioSamplingRate(44100);
    mMediaRecorder.setAudioChannels(1);

    try {
        mMediaRecorder.prepare();
    } catch (IOException e) {
        e.printStackTrace();
    }
    mMediaRecorder.start();
}
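
For reference, the matching teardown would look roughly like the sketch below (the `stopAudio()` helper name is my own; only the MediaRecorder calls come from the standard API):

private void stopAudio() {
    if (mMediaRecorder != null) {
        try {
            // stop() finalizes the 3GP/AMR file written to `path`
            mMediaRecorder.stop();
        } catch (RuntimeException e) {
            // stop() throws if start() never succeeded or no data was captured
            e.printStackTrace();
        }
        mMediaRecorder.release();
        mMediaRecorder = null;
    }
}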

Here is my microphone streaming class:

public class MicrophoneStream {
private static final String TAG = "MicrophoneStream";
private AudioRecord mAudioRecord;
private AudioTrack mAudioTrack;
private Thread streamThread;
private boolean isRecording;
private int minBuffer;
private int sampleRate = 16000;
//Different file sample rate ---> 8000, 11025, 16000, 22050, 44100

public MicrophoneStream() {
}

private void stream(){
    if (mAudioTrack != null) {
        mAudioTrack.release();
        mAudioTrack = null;
    }
    if (mAudioRecord != null) {
        mAudioRecord.release();
        mAudioRecord = null;
    }


    minBuffer = AudioRecord.getMinBufferSize(sampleRate,
            AudioFormat.CHANNEL_IN_MONO,
            AudioFormat.ENCODING_PCM_16BIT);

    mAudioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC,
            sampleRate,
            AudioFormat.CHANNEL_IN_MONO,
            AudioFormat.ENCODING_PCM_16BIT, minBuffer);

    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
        mAudioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
                sampleRate,
                AudioFormat.CHANNEL_OUT_MONO,
                AudioFormat.ENCODING_PCM_16BIT,
                minBuffer,
                AudioTrack.PERFORMANCE_MODE_LOW_LATENCY);
    } else {
        mAudioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
                sampleRate,
                AudioFormat.CHANNEL_OUT_MONO,
                AudioFormat.ENCODING_PCM_16BIT,
                minBuffer,
                AudioTrack.MODE_STREAM);
    }

    mAudioTrack.setPlaybackRate(sampleRate);

    try {
        mAudioRecord.startRecording();
    } catch (IllegalStateException e) {
        e.printStackTrace();
    }

    mAudioTrack.play();

    byte[] buffer = new byte[minBuffer];

    while (isRecording) {
        mAudioRecord.read(buffer, 0, buffer.length);
        //Log.d(TAG, "audioStreamer: " + buffer);
        mAudioTrack.write(buffer, 0, buffer.length);
    }
}

public void startStream() {
    //for mic streaming
    isRecording = true;
    streamThread = new Thread(new Runnable() {
        @Override
        public void run() {
            stream();
        }
    });
    streamThread.start();
}

public void stopStream() {
    isRecording = false;
    streamThread = null;
}
}
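
From an Activity, this class would be driven like the following minimal usage sketch (the `microphoneStream` variable name is assumed):

// Start echoing the microphone to the output on a background thread...
MicrophoneStream microphoneStream = new MicrophoneStream();
microphoneStream.startStream();

// ...and later stop it; the loop in stream() exits once isRecording is false.
microphoneStream.stopStream();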

Each of these functions works fine on its own, but when I combine them only one of them works.

When I run the startStream function first, streaming starts, but then I get this error:

FATAL EXCEPTION: main
Process: com.tabz.swolo, PID: 13628
java.lang.IllegalStateException
    at android.media.MediaRecorder._start(Native Method)
    at android.media.MediaRecorder.start(MediaRecorder.java:873)
    at com.tabz.swolo.MergeRecordingActivity.recordAudio(MergeRecordingActivity.java:94)
    at com.tabz.swolo.MergeRecordingActivity.access$100(MergeRecordingActivity.java:25)
    at com.tabz.swolo.MergeRecordingActivity$1.onGrant(MergeRecordingActivity.java:59)
    at com.tabz.swolo.BASE_CLASS.isPermissionGranted(BASE_CLASS.java:30)
    at com.tabz.swolo.MergeRecordingActivity.onClick(MergeRecordingActivity.java:55)
    at android.view.View.performClick(View.java:5225)
    at android.view.View$PerformClick.run(View.java:21195)
    at android.os.Handler.handleCallback(Handler.java:739)
    at android.os.Handler.dispatchMessage(Handler.java:95)
    at android.os.Looper.loop(Looper.java:148)
    at android.app.ActivityThread.main(ActivityThread.java:5451)
    at java.lang.reflect.Method.invoke(Native Method)
    at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:726)
    at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:616)

When I run the recording function first, I get the following error and there is no microphone streaming, but the recording itself works fine:

 E/AudioRecord: start() status -38
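
One way to detect this failure at runtime, rather than only seeing it in logcat, is to check the recording state right after `startRecording()`: when another client (here, MediaRecorder) already holds the microphone, the AudioRecord stays in the stopped state. A minimal sketch, placed inside `stream()` above:

mAudioRecord.startRecording();
if (mAudioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
    // start failed (e.g. "start() status -38"): the mic is busy, likely held by MediaRecorder
    Log.e(TAG, "AudioRecord could not start; microphone appears to be in use");
}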

1 Answer:

Answer 0 (score: 0):

I managed to solve this once I understood that Android does not allow MediaRecorder and AudioRecord to use the microphone resource at the same time. So in my case I searched for open-source libraries, and after some analysis I came up with this one:


https://github.com/kailash09dabhi/OmRecorder

Although this library does not provide exactly what I was trying to achieve in my question, I used the AudioChunk delivered by one of its callbacks to stream to an AudioTrack while recording:

private Recorder recorder;
private AudioTrack mAudioTrack;
private int[] rate = {8000, 11025, 16000, 22050, 44100};
private int sampleRate = rate[4];

private void setupRecorder() {
    if (recorder != null){
        recorder = null;
    }

    int minBuffer = AudioRecord.getMinBufferSize(sampleRate,
            AudioFormat.CHANNEL_IN_MONO,
            AudioFormat.ENCODING_PCM_16BIT);

    mAudioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
            sampleRate,
            AudioFormat.CHANNEL_OUT_MONO,
            AudioFormat.ENCODING_PCM_16BIT,
            minBuffer,
            AudioTrack.MODE_STREAM);

    mAudioTrack.setPlaybackRate(sampleRate);

    mAudioTrack.play();

    recorder = OmRecorder.wav(
            new PullTransport.Default(mic(),
                    new PullTransport.OnAudioChunkPulledListener() {
                        @Override
                        public void onAudioChunkPulled(AudioChunk audioChunk) {
                            mAudioTrack.write(audioChunk.toBytes(), 0, audioChunk.toBytes().length);
                        }
                    }),
            file());
}
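
Recording is then driven through the `Recorder` handle returned by `OmRecorder.wav(...)`. The sketch below assumes the `startRecording()`/`stopRecording()` methods shown in the OmRecorder README, and that `mic()` and `file()` are the README's helper methods supplying the `PullableSource` and the output `File`; the wrapper method names here are my own:

private void start() {
    setupRecorder();
    // Each pulled AudioChunk is written to the WAV file and, via the listener
    // above, echoed to mAudioTrack at the same time.
    recorder.startRecording();
}

private void stop() {
    try {
        recorder.stopRecording();   // finalizes the WAV file
    } catch (IOException e) {
        e.printStackTrace();
    }
    mAudioTrack.stop();
    mAudioTrack.release();
}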

Credit to: Kailash Dabhi
