Streaming G711 ulaw on Android with AudioTrack

Date: 2013-05-02 05:06:27

Tags: android audio audio-streaming

I'm trying to play live audio on an Android phone from an Axis network security camera, delivered over a multipart HTTP stream encoded as G.711 µ-law, 8 kHz, 8-bit samples. It seems like this should be fairly straightforward, and the code below is the basis of what I have. I reused some streaming code I had for grabbing JPEG frames from an MJPEG stream; it now grabs 512-byte blocks of audio data and hands them to the AudioTrack. The audio comes out all garbled and distorted, though. Am I missing something obvious?

@Override
public void onResume() {
    super.onResume();
    // Set up an 8 kHz, mono, 8-bit PCM AudioTrack in streaming mode.
    int bufferSize = AudioTrack.getMinBufferSize(8000, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_8BIT);
    mAudioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, 8000, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_8BIT, bufferSize, AudioTrack.MODE_STREAM);
    mAudioTrack.play();
    thread.start();
}

class StreamThread extends Thread {
    public boolean running = true;

    public void run() {
        try {
            // Reuses the MJPEG multipart reader to pull raw audio parts from the camera.
            MjpegStreamer streamer = MjpegStreamer.read("/axis-cgi/audio/receive.cgi?httptype=multipart");

            while(running) {
                // Each "frame" is a 512-byte block of µ-law audio data.
                byte[] buf = streamer.readMjpegFrame();
                if(buf != null && mAudioTrack != null) {
                    mAudioTrack.write(buf, 0, buf.length);
                }
            }

        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

0 Answers:

No answers