I'm new to Java programming, and especially to DSP, but I'm trying to apply reverb to a .wav file. Some of the code I'm using comes from: Reverb Algorithm, but it seems I don't fully understand it. When I run my code, I get noise that grows over time and eventually "clips". I think it has to do with my buffer, or with the way I use a Clip to play the resulting audio stream, but I'm not sure. If anyone could take a look and suggest some improvements to my code, I'd be very grateful.
I use this code:
public void Reverbstart() throws InterruptedException, UnsupportedAudioFileException, IOException, LineUnavailableException {
    int bufferLength = 4000_000;
    Clip clip;
    Line line;
    Line.Info linfo = new Line.Info(Clip.class);
    line = AudioSystem.getLine(linfo);
    clip = (Clip) line;

    File sourceFile = new File("some audiofile");
    AudioFileFormat fileFormat = AudioSystem.getAudioFileFormat(sourceFile);
    AudioFormat audioFormat = fileFormat.getFormat();
    System.out.println(audioFormat);

    AudioInputStream ais = AudioSystem.getAudioInputStream(sourceFile);
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    int nBufferSize = bufferLength * audioFormat.getFrameSize();
    byte[] byteBuffer = new byte[nBufferSize];
    int nBytesRead = ais.read(byteBuffer);
    baos.write(byteBuffer, 0, nBytesRead);
    byte[] AudioData = baos.toByteArray();

    int delayMilliseconds = 3000;
    int delaySamples = (int)((float)delayMilliseconds * 44.1f); //44100 Hz sample rate
    float decay = 0.5f;
    for (int i = 0; i < AudioData.length - delaySamples; i++){
        AudioData[i] += (short)((float)AudioData[i]);
        AudioData[i + delaySamples] += (short)((float)AudioData[i] * decay);
    }

    ByteArrayInputStream bais = new ByteArrayInputStream(AudioData);
    AudioInputStream outputAis = new AudioInputStream(bais, audioFormat, AudioData.length / audioFormat.getFrameSize());
    clip.open(outputAis);
    clip.start();
    Thread.sleep(10000);
    System.out.println(clip.getFramePosition());
}
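To make sure I at least understand the delay/decay idea itself, here is how I think it would look when applied to decoded 16-bit samples rather than to the raw bytes (the method name, the mono assumption, and the clamping are mine, not from my code above, so I may be off here too):

```java
// A minimal sketch of the delay/decay ("echo") idea on 16-bit PCM samples.
// Assumes mono, signed 16-bit samples already decoded from the byte stream.
public class CombFilterSketch {

    // Mixes a delayed, decayed copy of the signal back in, clamping to the
    // 16-bit range so the result cannot overflow and wrap around noisily.
    static short[] applyEcho(short[] samples, int delaySamples, float decay) {
        short[] out = samples.clone();
        for (int i = delaySamples; i < out.length; i++) {
            int mixed = out[i] + (int) (out[i - delaySamples] * decay);
            // clamp instead of letting the short overflow
            if (mixed > Short.MAX_VALUE) mixed = Short.MAX_VALUE;
            if (mixed < Short.MIN_VALUE) mixed = Short.MIN_VALUE;
            out[i] = (short) mixed;
        }
        return out;
    }

    public static void main(String[] args) {
        short[] in = {10000, 0, 0, 0};
        short[] out = applyEcho(in, 2, 0.5f);
        // an impulse at index 0 should produce a half-amplitude echo at index 2
        System.out.println(out[0] + " " + out[2]); // prints "10000 5000"
    }
}
```

Is this the right mental model, and is my real mistake that I run the same arithmetic directly on the interleaved bytes of the stream?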