Audio Playback Failed - E/android.media.AudioTrack: Front channels must be present in multichannel configurations

Asked: 2014-11-02 14:27:45

Tags: android debugging audio audiorecord audiotrack

I haven't used Android's audio recording classes before, so my knowledge in this area is limited.

I wrote a small app that records audio in the background and then plays it back, all in PCM format (I'm running some tests to see how much battery the microphone uses while recording in the background). But when I try to run my play() method, I get these logcat errors:

11-03 00:20:05.744  18248-18248/com.bacon.corey.audiotimeshift E/android.media.AudioTrack﹕ Front channels must be present in multichannel configurations
11-03 00:20:05.748  18248-18248/com.bacon.corey.audiotimeshift E/AudioTrack﹕ Playback Failed

I've searched for the errors, but I can't seem to find anything about them.

If anyone wouldn't mind giving me some pointers, I would really appreciate it.

Here is the code for the app (it's very sloppy and unfinished, since it is only being used to test battery life):

public class MainActivity extends ActionBarActivity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        if (savedInstanceState == null) {
            getSupportFragmentManager().beginTransaction()
                    .add(R.id.container, new PlaceholderFragment())
                    .commit();
        }

    }


    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        // Inflate the menu; this adds items to the action bar if it is present.
        getMenuInflater().inflate(R.menu.menu_main, menu);
        return true;
    }

    @Override
    public boolean onOptionsItemSelected(MenuItem item) {
        // Handle action bar item clicks here. The action bar will
        // automatically handle clicks on the Home/Up button, so long
        // as you specify a parent activity in AndroidManifest.xml.
        int id = item.getItemId();

        //noinspection SimplifiableIfStatement
        if (id == R.id.action_settings) {
            return true;
        }

        return super.onOptionsItemSelected(item);
    }

    /**
     * A placeholder fragment containing a simple view.
     */
    public static class PlaceholderFragment extends Fragment {

        public PlaceholderFragment() {
        }

        @Override
        public View onCreateView(LayoutInflater inflater, ViewGroup container,
                                 Bundle savedInstanceState) {
            View rootView = inflater.inflate(R.layout.fragment_main, container, false);
            return rootView;
        }
    }

    public void play(View view) {
        Toast.makeText(this, "play", Toast.LENGTH_SHORT).show();

// Get the file we want to playback.
        File file = new File(Environment.getExternalStorageDirectory() + File.separator + "ACS.pcm");
// Get the length of the audio stored in the file (16 bit so 2 bytes per short)
// and create a short array to store the recorded audio.
        int musicLength = (int)(file.length()/2);
        short[] music = new short[musicLength];


        try {
// Create a DataInputStream to read the audio data back from the saved file.
            InputStream is = new FileInputStream(file);
            BufferedInputStream bis = new BufferedInputStream(is);
            DataInputStream dis = new DataInputStream(bis);

// Read the file into the music array.
            int i = 0;
            while (dis.available() > 0) {
                music[musicLength-1-i] = dis.readShort();
                i++;
            }


// Close the input streams.
            dis.close();


// Create a new AudioTrack object using the same parameters as the AudioRecord
// object used to create the file.
            AudioTrack audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
                    11025,
                    AudioFormat.CHANNEL_OUT_MONO,
                    AudioFormat.ENCODING_PCM_16BIT,
                    musicLength,
                    AudioTrack.MODE_STREAM);
// Start playback
            audioTrack.play();

// Write the music buffer to the AudioTrack object
            audioTrack.write(music, 0, musicLength);


        } catch (Throwable t) {
            Log.e("AudioTrack","Playback Failed");
        }
    }

    public void record(View view){
        Toast.makeText(this, "record", Toast.LENGTH_SHORT).show();

        Log.v("ACS", "OnCreate called");

        Intent intent = new Intent(this, ACS.class);
        startService(intent);
    }
    public void stop(View view){
        Toast.makeText(this, "stop", Toast.LENGTH_SHORT).show();
        Intent intent = new Intent(this, ACS.class);
        stopService(intent);
    }

}

and

public class ACS extends IntentService {

    AudioRecord audioRecord;
    public ACS() {
        super("ACS");
    }

    @Override
    protected void onHandleIntent(Intent intent) {
        Log.v("ACS", "ACS called");

        record();
    }

    public void record() {
        Log.v("ACS", "Record started");
        int frequency = 11025;
        int channelConfiguration = AudioFormat.CHANNEL_IN_MONO;
        int audioEncoding = AudioFormat.ENCODING_PCM_16BIT;
        File file = new File(Environment.getExternalStorageDirectory() + File.separator + "ACS.pcm");

// Delete any previous recording.
        if (file.exists())
            file.delete();


// Create the new file.
        try {
            file.createNewFile();
        } catch (IOException e) {
            throw new IllegalStateException("Failed to create " + file.toString());
        }

        try {
// Create a DataOuputStream to write the audio data into the saved file.
            OutputStream os = new FileOutputStream(file);
            BufferedOutputStream bos = new BufferedOutputStream(os);
            DataOutputStream dos = new DataOutputStream(bos);

// Create a new AudioRecord object to record the audio.
            int bufferSize = AudioRecord.getMinBufferSize(frequency, channelConfiguration, audioEncoding);
            audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC,
                    frequency, channelConfiguration,
                    audioEncoding, bufferSize);

            short[] buffer = new short[bufferSize];
            audioRecord.startRecording();


            while (audioRecord.getRecordingState() == audioRecord.RECORDSTATE_RECORDING) {
                int bufferReadResult = audioRecord.read(buffer, 0, bufferSize);
                for (int i = 0; i < bufferReadResult; i++)
                    dos.writeShort(buffer[i]);
            }


            audioRecord.stop();
            dos.close();

        } catch (Throwable t) {
            Log.e("AudioRecord", "Recording Failed");
        }
        Log.v("ACS", "Record stopped");

    }

    public void onDestroy(){
        audioRecord.stop();
        Log.v("ACS", "onDestroy called, Record stopped");

    }

}

Thanks in advance

Corey :)

2 Answers:

Answer 0 (score: 1)

I had the same error message, "android.media.AudioTrack: Front channels must be present in multichannel configurations".

The error message went away when I changed the audio setting from AudioFormat.CHANNEL_OUT_MONO to AudioFormat.CHANNEL_IN_MONO. (Or you can try a different configuration, such as AudioFormat.CHANNEL_IN_STEREO.)

AudioTrack audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
        11025,
        AudioFormat.CHANNEL_IN_MONO,
        AudioFormat.ENCODING_PCM_16BIT,
        musicLength,
        AudioTrack.MODE_STREAM);

But I don't know why this works. Hope this helps.

Answer 1 (score: 0)

A mono audio file needs to be sent to both the left and right speakers. Do a bitwise OR of the two front channels to set up this routing:

final int frontPair = AudioFormat.CHANNEL_OUT_FRONT_LEFT | AudioFormat.CHANNEL_OUT_FRONT_RIGHT;

AudioFormat audioFormat = new AudioFormat.Builder()
        .setEncoding(AudioFormat.ENCODING_PCM_8BIT)
        .setSampleRate(audioSamplingRate)
        .setChannelMask(frontPair)
        .build();
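
For completeness, here is a minimal sketch of how that AudioFormat might be fed into an AudioTrack via AudioTrack.Builder (API 23+). The audioSamplingRate value and the getMinBufferSize call are illustrative assumptions added here, not part of the original answer:

// Sketch only: assumes API 23+ and 8-bit PCM, to match the snippet above.
int audioSamplingRate = 11025; // placeholder; use the rate the PCM data was recorded at

// Ask the platform for a safe minimum buffer size for this configuration.
// CHANNEL_OUT_STEREO covers the same front-left/front-right pair as frontPair.
int bufferSize = AudioTrack.getMinBufferSize(audioSamplingRate,
        AudioFormat.CHANNEL_OUT_STEREO,
        AudioFormat.ENCODING_PCM_8BIT);

AudioTrack track = new AudioTrack.Builder()
        .setAudioAttributes(new AudioAttributes.Builder()
                .setUsage(AudioAttributes.USAGE_MEDIA)
                .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
                .build())
        .setAudioFormat(audioFormat)       // the format built with setChannelMask(frontPair)
        .setTransferMode(AudioTrack.MODE_STREAM)
        .setBufferSizeInBytes(bufferSize)
        .build();

track.play();
// write() the PCM data, then stop() and release() the track when finished.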