I'm basically trying to record audio chunks from a WebRTC stream, and I've already managed to send the binary data with the help of this resource: HTML Audio Capture streaming to Node.js.
I'm using netty-socketio, since this library works well with socket.io on the client side. This is my server endpoint:
server.addEventListener("audio-blob", byte[].class, (socketIOClient, bytes, ackRequest) -> {
byteArrayList.add(bytes);
});
server.addEventListener("audio-blob-end", Object.class, (socket, string, ackRequest) -> {
ByteArrayInputStream in = new ByteArrayInputStream(byteArrayList.getArray());
AudioInputStream audiIn = new AudioInputStream(in, getAudioFormat(), 48000l);
AudioFileFormat.Type fileType = AudioFileFormat.Type.WAVE;
File wavFile = new File("RecordAudio.wav");
AudioSystem.write(audiIn,fileType,wavFile);
});
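For completeness, the listeners above hang off a plain netty-socketio server; below is a minimal sketch of how I bootstrap it. The class name, hostname and port are illustrative, and byteArrayList is the field shared by both listeners:

import com.corundumstudio.socketio.Configuration;
import com.corundumstudio.socketio.SocketIOServer;

public class AudioRecorderServer {

    // Buffer shared by the "audio-blob" and "audio-blob-end" listeners
    private static final ByteArrayList byteArrayList = new ByteArrayList();

    public static void main(String[] args) {
        Configuration config = new Configuration();
        config.setHostname("localhost"); // illustrative
        config.setPort(9092);            // illustrative

        SocketIOServer server = new SocketIOServer(config);

        // ... register the two addEventListener calls shown above ...

        server.start();
    }
}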
The format settings:
public static AudioFormat getAudioFormat() {
    float sampleRate = 48000;
    int sampleSizeInBits = 8;
    int channels = 2;
    boolean signed = true;
    boolean bigEndian = true;
    AudioFormat format = new AudioFormat(sampleRate, sampleSizeInBits,
            channels, signed, bigEndian);
    return format;
}
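One detail worth noting about this format: javax.sound.sampled derives the frame size from the sample size and channel count (2 bytes per frame here), and the length passed to AudioInputStream is counted in sample frames, not bytes. A small sketch of how the numbers relate, assuming byteArrayList holds the buffered raw PCM bytes:

AudioFormat fmt = getAudioFormat();
int bytesPerFrame = fmt.getFrameSize();              // 2 channels * 1 byte per sample = 2
long frames = byteArrayList.size() / bytesPerFrame;  // buffered byte count -> frame count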
I use this class to collect the byte arrays (yes, I'm aware of the risks of this solution):
class ByteArrayList {

    private List<Byte> bytesList;

    public ByteArrayList() {
        bytesList = new ArrayList<Byte>();
    }

    public void add(byte[] bytes) {
        add(bytes, 0, bytes.length);
    }

    public void add(byte[] bytes, int offset, int length) {
        for (int i = offset; i < (offset + length); i++) {
            bytesList.add(bytes[i]);
        }
    }

    public int size() {
        return bytesList.size();
    }

    public byte[] getArray() {
        byte[] bytes = new byte[bytesList.size()];
        for (int i = 0; i < bytesList.size(); i++) {
            bytes[i] = bytesList.get(i);
        }
        return bytes;
    }
}
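As an aside, if the boxing overhead of List<Byte> ever becomes a problem, the JDK's ByteArrayOutputStream already does the same job; a possible drop-in replacement (the class name is illustrative):

import java.io.ByteArrayOutputStream;

// Collects raw chunks in a growing byte[] instead of boxing every byte
class ChunkBuffer {
    private final ByteArrayOutputStream buffer = new ByteArrayOutputStream();

    public synchronized void add(byte[] bytes) {
        buffer.write(bytes, 0, bytes.length);
    }

    public synchronized int size() {
        return buffer.size();
    }

    public synchronized byte[] getArray() {
        return buffer.toByteArray();
    }
}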
The resulting WAV file only plays noise; the actual recording is not there. What am I doing wrong?
Answer 0 (score: 0)
While googling for the answer, I stumbled upon this resource: how to save a wav file.
What I did wrong was that I passed a fixed size as the AudioInputStream constructor's length argument:
new AudioInputStream(in, getAudioFormat(), 48000L)
Changed it to:
new AudioInputStream(in, getAudioFormat(), byteArrayList.getArray().length);
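With that change, the end-of-stream handler from the question looks roughly like the sketch below. One refinement of my own, not part of the original fix: the AudioInputStream length parameter is documented as a count of sample frames rather than bytes, so dividing the byte count by the format's frame size (2 here) should be the strictly correct value:

byte[] audioBytes = byteArrayList.getArray();
AudioFormat format = getAudioFormat();

// Length is given in sample frames: total bytes / bytes per frame
long lengthInFrames = audioBytes.length / format.getFrameSize();

AudioInputStream audioIn = new AudioInputStream(
        new ByteArrayInputStream(audioBytes), format, lengthInFrames);
AudioSystem.write(audioIn, AudioFileFormat.Type.WAVE, new File("RecordAudio.wav"));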