There is not much information out there about applying this codec when streaming audio. Without the codec, my code works like a charm streaming between two devices, but I need to encode/decode in that format because I need to stream to a server rather than between two devices (I am testing this code with two devices).
I am hoping someone can spot where the problem lies. I have tried different configurations of the input parameters. Maybe the codec classes I am using are wrong (I took them from an Apache-licensed project).
These values are set in the recorder/sender, and likewise in the player/receiver device:
private int port=50005;
private int sampleRate = 8000; // 44100;
private int channelConfig = AudioFormat.CHANNEL_OUT_MONO;
private int audioFormat = AudioFormat.ENCODING_PCM_16BIT;
int minBufSize = AudioTrack.getMinBufferSize(sampleRate, channelConfig, audioFormat);
Note: CHANNEL_OUT_MONO in the player project and CHANNEL_IN_MONO in the recorder project.
These are my methods:
public void startStreamingEncoding() {
    Thread streamThread = new Thread(new Runnable() {
        @Override
        public void run() {
            try {
                android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

                DatagramSocket socket = new DatagramSocket();
                short[] buffer = new short[minBufSize];
                DatagramPacket packet;
                final InetAddress destination = InetAddress.getByName(ip_receiver);

                recorder = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleRate,
                        channelConfig, audioFormat, minBufSize * 10);
                recorder.startRecording();

                ///// Encoding:
                Encoder encoder = new G711UCodec();
                byte[] outBuffer = new byte[minBufSize];

                while (status == true) {
                    // reading data from MIC into buffer
                    minBufSize = recorder.read(buffer, 0, buffer.length);

                    // Encoding:
                    encoder.encode(buffer, minBufSize, outBuffer, 0);

                    // putting buffer in the packet
                    packet = new DatagramPacket(outBuffer, outBuffer.length, destination, port);
                    socket.send(packet);
                }
            } catch (UnknownHostException e) {
                Log.e("VS", "UnknownHostException");
            } catch (IOException e) {
                e.printStackTrace();
                Log.e("VS", "IOException");
            }
        }
    });
    streamThread.start();
}
The method that plays and decodes the stream:
public void playerAudioDecoding() {
    Thread thrd = new Thread(new Runnable() {
        @Override
        public void run() {
            android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

            AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC,
                    sampleRate, AudioFormat.CHANNEL_CONFIGURATION_MONO,
                    AudioFormat.ENCODING_PCM_16BIT, minBufSize,
                    AudioTrack.MODE_STREAM);
            track.play();

            Decoder decoder = new G711UCodec();

            try {
                DatagramSocket sock = new DatagramSocket(port);
                byte[] buf = new byte[minBufSize];

                while (true) {
                    DatagramPacket pack = new DatagramPacket(buf, minBufSize);
                    sock.receive(pack);

                    // Decoding:
                    int size = pack.getData().length;
                    short[] shortArray = new short[size];
                    decoder.decode(shortArray, pack.getData(), minBufSize, 0);
                    byte[] array = MyShortToByte(shortArray);
                    track.write(array, 0, array.length);
                }
            } catch (SocketException se) {
                Log.e("Error", "SocketException: " + se.toString());
            } catch (IOException ie) {
                Log.e("Error", "IOException" + ie.toString());
            }
        } // end run
    });
    thrd.start();
}
This is the Apache-licensed codec class I am using:
public class G711UCodec implements Encoder, Decoder {
    // s00000001wxyz...s000wxyz
    // s0000001wxyza...s001wxyz
    // s000001wxyzab...s010wxyz
    // s00001wxyzabc...s011wxyz
    // s0001wxyzabcd...s100wxyz
    // s001wxyzabcde...s101wxyz
    // s01wxyzabcdef...s110wxyz
    // s1wxyzabcdefg...s111wxyz
    private static byte[] table13to8 = new byte[8192];
    private static short[] table8to16 = new short[256];

    static {
        // b13 --> b8
        for (int p = 1, q = 0; p <= 0x80; p <<= 1, q += 0x10) {
            for (int i = 0, j = (p << 4) - 0x10; i < 16; i++, j += p) {
                int v = (i + q) ^ 0x7F;
                byte value1 = (byte) v;
                byte value2 = (byte) (v + 128);
                for (int m = j, e = j + p; m < e; m++) {
                    table13to8[m] = value1;
                    table13to8[8191 - m] = value2;
                }
            }
        }
        // b8 --> b16
        for (int q = 0; q <= 7; q++) {
            for (int i = 0, m = (q << 4); i < 16; i++, m++) {
                int v = (((i + 0x10) << q) - 0x10) << 3;
                table8to16[m ^ 0x7F] = (short) v;
                table8to16[(m ^ 0x7F) + 128] = (short) (65536 - v);
            }
        }
    }

    public int decode(short[] b16, byte[] b8, int count, int offset) {
        for (int i = 0, j = offset; i < count; i++, j++) {
            b16[i] = table8to16[b8[j] & 0xFF];
        }
        return count;
    }

    public int encode(short[] b16, int count, byte[] b8, int offset) {
        for (int i = 0, j = offset; i < count; i++, j++) {
            b8[j] = table13to8[(b16[i] >> 4) & 0x1FFF];
        }
        return count;
    }

    public int getSampleCount(int frameSize) {
        return frameSize;
    }
}
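To check whether the codec itself is corrupting the audio, it helps to round-trip samples off-device, away from the sockets and AudioTrack. The sketch below is an independent, formula-based G.711 mu-law encoder/decoder (the standard algorithm, not the table code above); comparing its output against the table-based class can separate a codec bug from a streaming bug. The class and method names here are illustrative, not part of the question's code.

```java
public class MuLawRoundTrip {
    static final int BIAS = 0x84;   // standard G.711 mu-law bias
    static final int CLIP = 32635;  // clip so biasing cannot overflow

    // Formula-based G.711 mu-law compression of one 16-bit sample.
    static byte linearToMuLaw(short pcm) {
        int sign = (pcm >> 8) & 0x80;                 // keep the sign bit
        int sample = (sign != 0) ? -pcm : pcm;        // work on the magnitude
        if (sample > CLIP) sample = CLIP;
        sample += BIAS;
        int exponent = 7;                             // find the segment
        for (int mask = 0x4000; (sample & mask) == 0 && exponent > 0; mask >>= 1)
            exponent--;
        int mantissa = (sample >> (exponent + 3)) & 0x0F;
        return (byte) ~(sign | (exponent << 4) | mantissa); // mu-law is bit-inverted
    }

    // Inverse of the above.
    static short muLawToLinear(byte ulaw) {
        int u = ~ulaw & 0xFF;
        int exponent = (u >> 4) & 0x07;
        int mantissa = u & 0x0F;
        int sample = (((mantissa << 3) + BIAS) << exponent) - BIAS;
        return (short) (((u & 0x80) != 0) ? -sample : sample);
    }

    public static void main(String[] args) {
        // Round-trip every 16-bit sample; mu-law quantization error
        // stays below the largest half-step for this range.
        for (int s = Short.MIN_VALUE; s <= Short.MAX_VALUE; s++) {
            short in = (short) s;
            short out = muLawToLinear(linearToMuLaw(in));
            if (Math.abs(in - out) >= 1024)
                throw new AssertionError("sample " + in + " -> " + out);
        }
        System.out.println("mu-law round-trip OK");
    }
}
```

Feeding the same short[] buffer through both this sketch and G711UCodec.encode/decode and diffing the results would show quickly whether the problem is in the codec or in the transport.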
Really, I do not know what is going on. If I change sampleRate to 4000 I can recognize my voice and some words, but there is a lot of echo. And I repeat: if I disable the encoding/decoding and stream plain PCM, the quality is very good. Let's see if anyone can help me; thanks in advance.
Answer 0 (score: 6)
OK, I finally solved the audio encoding/decoding problem myself. It was an annoying task over the last week. The main problem in my code was that the encoding was done well but the decoding was not, so I reworked it, modified these classes with the help of other resources, and created my own encode/decode methods (and they work like a charm!!!).
The other important decision was to change the encoding format. I am now using A-law instead of mu-law. The only reason for this change is that A-law is easier to implement programmatically than mu-law.
I also had to play with parameters such as the buffer sizes and so on.
I will post my code, and I hope some of you can save a lot of time by using it as a reference.
private int port=50005;
private int sampleRate = 8000; //44100;
private int channelConfig = AudioFormat.CHANNEL_IN_MONO;
private int audioFormat = AudioFormat.ENCODING_PCM_16BIT;
int minBufSize = AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat);
public void startStreamingEncoding() {
    Thread streamThread = new Thread(new Runnable() {
        @Override
        public void run() {
            try {
                android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

                DatagramSocket socket = new DatagramSocket();
                byte[] buffer = new byte[4096];
                DatagramPacket packet;
                final InetAddress destination = InetAddress.getByName(ip_receiver);

                recorder = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleRate,
                        channelConfig, audioFormat, minBufSize * 10);
                recorder.startRecording();

                ///// Encoding:
                CMG711 encoder = new CMG711();
                byte[] outBuffer = new byte[4096];
                int read, encoded;
                File sdCard = Environment.getExternalStorageDirectory();
                FileOutputStream out = new FileOutputStream(new File(sdCard, "audio-bernard.raw"));

                while (status == true) {
                    // reading data from MIC into buffer
                    read = recorder.read(buffer, 0, buffer.length);
                    Log.d(getTag(), "read: " + read);

                    // Encoding:
                    encoded = encoder.encode(buffer, 0, read, outBuffer);

                    // putting buffer in the packet
                    packet = new DatagramPacket(outBuffer, encoded, destination, port);
                    out.write(outBuffer, 0, encoded);
                    socket.send(packet);
                }
            } catch (UnknownHostException e) {
                Log.e("VS", "UnknownHostException");
            } catch (IOException e) {
                e.printStackTrace();
                Log.e("VS", "IOException");
            }
        }
    });
    streamThread.start();
}
And for the receiver/player class or method:
private int port=50005;
private int sampleRate = 8000; // 44100;
private int channelConfig = AudioFormat.CHANNEL_OUT_MONO;
private int audioFormat = AudioFormat.ENCODING_PCM_16BIT;
int minBufSize = AudioTrack.getMinBufferSize(sampleRate, channelConfig, audioFormat);
public void playerAudioDecodingBernard() {
    Thread thrd = new Thread(new Runnable() {
        @Override
        public void run() {
            android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

            AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC,
                    sampleRate, AudioFormat.CHANNEL_OUT_MONO,
                    AudioFormat.ENCODING_PCM_16BIT, minBufSize * 10,
                    AudioTrack.MODE_STREAM);

            CMG711 decoder = new CMG711();

            try {
                DatagramSocket sock = new DatagramSocket(port);
                byte[] buf = new byte[4096];
                int frame = 0;

                while (true) {
                    DatagramPacket pack = new DatagramPacket(buf, 4096);
                    sock.receive(pack);

                    // Decoding:
                    int size = pack.getLength();
                    //Log.d("Player", "Player: " + size + ", " + pack.getLength() + ", " + pack.getOffset());
                    byte[] byteArray = new byte[size * 2];
                    decoder.decode(pack.getData(), 0, size, byteArray);
                    track.write(byteArray, 0, byteArray.length);

                    if (frame++ > 3)
                        track.play();
                }
            } catch (SocketException se) {
                Log.e("Error", "SocketException: " + se.toString());
            } catch (IOException ie) {
                Log.e("Error", "IOException" + ie.toString());
            }
        } // end run
    });
    thrd.start();
}
And this is the class that encodes/decodes in A-law format:
public class CMG711 {
    /** decompress table constants */
    private static short aLawDecompressTable[] = new short[]
{ -5504, -5248, -6016, -5760, -4480, -4224, -4992, -4736, -7552, -7296, -8064, -7808, -6528, -6272, -7040, -6784, -2752, -2624, -3008, -2880, -2240, -2112, -2496, -2368, -3776, -3648, -4032, -3904, -3264, -3136, -3520, -3392, -22016, -20992, -24064, -23040, -17920, -16896, -19968, -18944, -30208, -29184, -32256, -31232, -26112, -25088, -28160, -27136, -11008, -10496, -12032, -11520, -8960, -8448, -9984, -9472, -15104, -14592, -16128, -15616, -13056, -12544, -14080, -13568, -344, -328, -376,
-360, -280, -264, -312, -296, -472, -456, -504, -488, -408, -392, -440, -424, -88, -72, -120, -104, -24, -8, -56, -40, -216, -200, -248, -232, -152, -136, -184, -168, -1376, -1312, -1504, -1440, -1120, -1056, -1248, -1184, -1888, -1824, -2016, -1952, -1632, -1568, -1760, -1696, -688, -656, -752, -720, -560, -528, -624, -592, -944, -912, -1008, -976, -816, -784, -880, -848, 5504, 5248, 6016, 5760, 4480, 4224, 4992, 4736, 7552, 7296, 8064, 7808, 6528, 6272, 7040, 6784, 2752, 2624,
3008, 2880, 2240, 2112, 2496, 2368, 3776, 3648, 4032, 3904, 3264, 3136, 3520, 3392, 22016, 20992, 24064, 23040, 17920, 16896, 19968, 18944, 30208, 29184, 32256, 31232, 26112, 25088, 28160, 27136, 11008, 10496, 12032, 11520, 8960, 8448, 9984, 9472, 15104, 14592, 16128, 15616, 13056, 12544, 14080, 13568, 344, 328, 376, 360, 280, 264, 312, 296, 472, 456, 504, 488, 408, 392, 440, 424, 88, 72, 120, 104, 24, 8, 56, 40, 216, 200, 248, 232, 152, 136, 184, 168, 1376, 1312, 1504, 1440, 1120,
1056, 1248, 1184, 1888, 1824, 2016, 1952, 1632, 1568, 1760, 1696, 688, 656, 752, 720, 560, 528, 624, 592, 944, 912, 1008, 976, 816, 784, 880, 848 };
    private final static int cClip = 32635;

    private static byte aLawCompressTable[] = new byte[]
{ 1, 1, 2, 2, 3, 3, 3, 3, 4, 4, 4, 4, 4, 4, 4, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7 };
    public int encode(byte[] src, int offset, int len, byte[] res) {
        int j = offset;
        int count = len / 2;
        short sample = 0;

        for (int i = 0; i < count; i++) {
            sample = (short) ((src[j++] & 0xff) | (src[j++] << 8));
            res[i] = linearToALawSample(sample);
        }
        return count;
    }

    private byte linearToALawSample(short sample) {
        int sign;
        int exponent;
        int mantissa;
        int s;

        sign = ((~sample) >> 8) & 0x80;
        if (!(sign == 0x80)) {
            sample = (short) -sample;
        }
        if (sample > cClip) {
            sample = cClip;
        }
        if (sample >= 256) {
            exponent = (int) aLawCompressTable[(sample >> 8) & 0x7F];
            mantissa = (sample >> (exponent + 3)) & 0x0F;
            s = (exponent << 4) | mantissa;
        } else {
            s = sample >> 4;
        }
        s ^= (sign ^ 0x55);
        return (byte) s;
    }

    public void decode(byte[] src, int offset, int len, byte[] res) {
        int j = 0;
        for (int i = 0; i < len; i++) {
            short s = aLawDecompressTable[src[i + offset] & 0xff];
            res[j++] = (byte) s;        // little-endian PCM16 out
            res[j++] = (byte) (s >> 8);
        }
    }
}
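As a sanity check on the table-driven class above, the A-law mapping can also be written purely from the G.711 formulas. The sketch below (illustrative names; not part of the answer's code) encodes one sample with the same logic as linearToALawSample but derives the exponent arithmetically instead of via aLawCompressTable, and decodes by inverting the segment/mantissa packing, which should agree with aLawDecompressTable.

```java
public class ALawRoundTrip {
    static final int CLIP = 32635; // same clip value as cClip above

    // Formula-based A-law compression of one 16-bit sample.
    static byte linearToALaw(short pcm) {
        int sign = ((~pcm) >> 8) & 0x80;            // 0x80 for non-negative input
        int sample = (sign != 0x80) ? -pcm : pcm;   // work on the magnitude
        if (sample > CLIP) sample = CLIP;
        int s;
        if (sample >= 256) {
            // segment number = position of the highest set bit of (sample >> 8), plus 1
            int exponent = 32 - Integer.numberOfLeadingZeros(sample >> 8);
            int mantissa = (sample >> (exponent + 3)) & 0x0F;
            s = (exponent << 4) | mantissa;
        } else {
            s = sample >> 4;
        }
        return (byte) (s ^ (sign ^ 0x55));          // A-law toggles the even bits
    }

    // Inverse mapping; should reproduce the entries of aLawDecompressTable.
    static short aLawToLinear(byte alaw) {
        int a = (alaw ^ 0x55) & 0xFF;
        int exponent = (a >> 4) & 0x07;
        int mantissa = a & 0x0F;
        int sample = (exponent == 0)
                ? (mantissa << 4) + 8
                : ((mantissa << 4) + 0x108) << (exponent - 1);
        return (short) (((a & 0x80) != 0) ? sample : -sample);
    }

    public static void main(String[] args) {
        // Round-trip every 16-bit sample; A-law quantization error
        // stays below the largest half-step for this range.
        for (int s = Short.MIN_VALUE; s <= Short.MAX_VALUE; s++) {
            short in = (short) s;
            short out = aLawToLinear(linearToALaw(in));
            if (Math.abs(in - out) >= 1024)
                throw new AssertionError("sample " + in + " -> " + out);
        }
        System.out.println("A-law round-trip OK");
    }
}
```

Running this kind of exhaustive round-trip on the desktop is a cheap way to validate the codec before debugging byte order, packet sizes, or AudioTrack settings on the devices.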
I hope this is useful to someone! Anyway, thanks for your help, especially to bonnyz.
Answer 1 (score: 1)
Which sampleRate values have you tried? The sample rate (for both playback and recording) is very important because it affects the whole audio pipeline, and only a few settings are guaranteed to work on every device (I am sure about 44100). Also, keep in mind that you cannot specify an arbitrary sampleRate (such as 4000), because it will (or should) be scaled to the nearest supported rate. Similar considerations apply to the buffer size.
My guess is that wrong pipeline settings produce sound artifacts which degrade further after the "compression" step.
What happens if you set up the client with 44100?
You could also try querying the AudioManager and then testing the various supported sampleRate/buffer-size combinations:
AudioManager audioManager = (AudioManager) this.getSystemService(Context.AUDIO_SERVICE);
String rate = audioManager.getProperty(AudioManager.PROPERTY_OUTPUT_SAMPLE_RATE);
String size = audioManager.getProperty(AudioManager.PROPERTY_OUTPUT_FRAMES_PER_BUFFER);
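Those two properties come back as String values (and getProperty returns null on devices below API 17), so they need to be parsed defensively before being fed into AudioRecord/AudioTrack. A small helper sketch (hypothetical, not part of the answer) with safe fallbacks:

```java
public class AudioProps {
    // Parse a property string such as "44100", falling back to a safe
    // default when the property is missing (null) or malformed.
    static int parseOrDefault(String value, int fallback) {
        if (value == null) return fallback;
        try {
            return Integer.parseInt(value.trim());
        } catch (NumberFormatException e) {
            return fallback;
        }
    }

    public static void main(String[] args) {
        // The strings here stand in for the getProperty results above.
        int sampleRate = parseOrDefault("44100", 8000);   // PROPERTY_OUTPUT_SAMPLE_RATE
        int framesPerBuffer = parseOrDefault(null, 256);  // missing on older devices
        System.out.println(sampleRate + " " + framesPerBuffer);
    }
}
```

The parsed sample rate can then be passed straight to AudioTrack.getMinBufferSize instead of a hard-coded 8000.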