I'm trying to write a simple Android app that displays the individual pitches in a .wav file as it plays. I'm using TarsosDSP for the pitch processing and AudioTrack to play the file.
Before getting into the code: I'm running Android Studio 3.4.2 with JRE 1.8.0, and my minSdkVersion is 23.
From what I understand of how TarsosDSP works, I hook the wav stream up to an AudioDispatcher object, attach the processors (the player itself and the pitch evaluator), then hand the dispatcher to a thread and start it to kick everything off. That said, I'm probably still doing something silly (somewhere...).
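Concretely, the wiring I have in mind is roughly this (a bare sketch; wavStream, format, player, and pitchProcessor stand in for the real objects created in the code further down):

// Bare sketch of the wiring described above. wavStream, format, player,
// and pitchProcessor stand in for the real objects built in the code below.
AudioDispatcher dispatcher = new AudioDispatcher(
        new UniversalAudioInputStream(wavStream, format), 1024, 0);
dispatcher.addAudioProcessor(player);           // plays each buffer
dispatcher.addAudioProcessor(pitchProcessor);   // estimates pitch on the same buffers
new Thread(dispatcher, "Audio Thread").start(); // run() pulls buffers until the stream ends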
I ran into trouble with the AudioTrack builder, because a lot of the examples I found rely on the now-deprecated constructor that takes AudioManager.STREAM_MUSIC.
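For reference, my understanding is that the builder-based replacement for that constructor looks roughly like this (API 23+; a sketch, not my actual code, and the format values are assumptions matching the 22050 Hz mono 16-bit PCM format I use below):

import android.media.AudioAttributes;
import android.media.AudioFormat;
import android.media.AudioTrack;

// Sketch: AudioTrack via the builder (API 23+) instead of the deprecated
// AudioManager.STREAM_MUSIC constructor. Format values assume 22050 Hz,
// mono, 16-bit signed PCM, matching the TarsosDSPAudioFormat used below.
int minBuf = AudioTrack.getMinBufferSize(22050,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
AudioTrack track = new AudioTrack.Builder()
        .setAudioAttributes(new AudioAttributes.Builder()
                .setUsage(AudioAttributes.USAGE_MEDIA)
                .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
                .build())
        .setAudioFormat(new AudioFormat.Builder()
                .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
                .setSampleRate(22050)
                .setChannelMask(AudioFormat.CHANNEL_OUT_MONO)
                .build())
        .setBufferSizeInBytes(minBuf)
        .build();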
Update: I managed to find someone doing more or less what I'm after (I just had to have Google translate it from Korean for me): https://junyoung-jamong.github.io/signal/processing/2019/02/09/How-to-use-tarsosDSP-in-Android.html
After refactoring, I was able to move the AudioPlayer work I was doing inside my AudioMethods class.
Goal 1: Get the file to play. Done. You'll notice I never actually call AndroidAudioPlayer.process() (which plays the file); that's because I wasn't sure where I was supposed to call it.
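From skimming the TarsosDSP source, though, I believe I'm not meant to call it at all: the dispatcher's run() loop hands every buffer to every registered processor, conceptually something like this (a paraphrase, not the real source; hasMoreSamples and readNextBuffer are hypothetical):

// Conceptual paraphrase of what AudioDispatcher.run() does internally (not
// the actual TarsosDSP source): wrap each buffer in an AudioEvent and hand
// it to every registered processor; AndroidAudioPlayer.process() runs there.
while (hasMoreSamples) {                      // hypothetical loop condition
    AudioEvent audioEvent = readNextBuffer(); // hypothetical helper
    for (AudioProcessor processor : processors) {
        processor.process(audioEvent);
    }
}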
Goal 2: Get the pitch. Done. I'm badly missing something about how TarsosDSP is supposed to be wired up; any help here is greatly appreciated.
Goal 3: Play the file instead of white noise. New. After updating the code I got the file to play, and the pitch evaluation seems to be working, except the file now comes out as screeching/white noise. I did find this SO thread: Android AudioTrack playing .wav file, getting only white noise
But that's from 2011, and it uses the deprecated AudioTrack constructor that takes the STREAM_MUSIC enum.
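One thing I still want to rule out: as far as I can tell, UniversalAudioInputStream does no WAV parsing at all and just reads the stream as raw PCM, so the header bytes (and any mismatch between my TarsosDSPAudioFormat and the file's real sample rate/bit depth/channel count) would come out as noise. Skipping the header would look something like this (a sketch; it assumes a canonical 44-byte RIFF/WAVE header, which not every file has):

// Sketch: hand TarsosDSP the PCM data only, skipping the WAV header.
// Assumes a canonical 44-byte RIFF/WAVE header; real files can carry
// extra chunks, so this is a guess, not a general parser.
AssetFileDescriptor afd = getResources().openRawResourceFd(R.raw.avery_test);
FileInputStream wavStream = afd.createInputStream();
long skipped = wavStream.skip(44); // UniversalAudioInputStream expects raw PCM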
MainActivity
public class MainActivity extends AppCompatActivity {

    private TextView local_NoteText;
    private TextView local_PitchText;
    AudioDispatcher audioDispatcher;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        local_PitchText = findViewById(R.id.pitchText);
        local_NoteText = findViewById(R.id.noteText);
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        // Inflate the menu; this adds items to the action bar if it is present.
        getMenuInflater().inflate(R.menu.menu_main, menu);
        return true;
    }

    @Override
    public boolean onOptionsItemSelected(MenuItem item) {
        // Handle action bar item clicks here. The action bar will
        // automatically handle clicks on the Home/Up button, so long
        // as you specify a parent activity in AndroidManifest.xml.
        int id = item.getItemId();

        //noinspection SimplifiableIfStatement
        if (id == R.id.action_settings) {
            return true;
        }

        return super.onOptionsItemSelected(item);
    }

    public void ProcessTone(View view) throws IOException {
        //get the test file
        final AssetFileDescriptor afd = this.getResources().openRawResourceFd(R.raw.avery_test);

        //read the start offset (skip the wav header), and end time
        /*double startTime = afd.getStartOffset();
        double endTime = afd.getLength();
        Log.d("Process Tone","startTime: " + startTime);
        Log.d("Process Tone","endTime: " + endTime);*/

        //fire up the player and evaluate the pitch
        AudioMethods audioMethods = new AudioMethods();

        //signed 16-bit PCM, mono, 22050 Hz, 2-byte frames (1 channel * 2 bytes per sample)
        TarsosDSPAudioFormat tarsosDSPAudioFormat = new TarsosDSPAudioFormat(TarsosDSPAudioFormat.Encoding.PCM_SIGNED,
                22050,
                2 * 8,
                1,
                2 * 1,
                22050,
                ByteOrder.BIG_ENDIAN.equals(ByteOrder.nativeOrder()));

        //audioMethods.getPitchFromFile(afd.createInputStream(), startTime, endTime, local_NoteText,local_PitchText);
        //note the argument order: the method expects the pitch view first, then the note view
        audioMethods.getPitchFromFile(afd.getFileDescriptor(), MainActivity.this, audioDispatcher, tarsosDSPAudioFormat, local_PitchText, local_NoteText);
    }
}
AudioMethods
public class AudioMethods {

    public void getPitchFromFile(FileDescriptor fd, final Activity activity, AudioDispatcher dispatcher, TarsosDSPAudioFormat tarsosDSPAudioFormat, final TextView pitchText, final TextView noteText) {
        try {
            releaseDispatcher(dispatcher);

            FileInputStream fileInputStream = new FileInputStream(fd);
            dispatcher = new AudioDispatcher(new UniversalAudioInputStream(fileInputStream, tarsosDSPAudioFormat), 1024, 0);

            //plays each buffer as it passes through the dispatcher chain
            AudioProcessor playerProcessor = new AndroidAudioPlayer(tarsosDSPAudioFormat, 2048, 0);
            dispatcher.addAudioProcessor(playerProcessor);

            PitchDetectionHandler pitchDetectionHandler = new PitchDetectionHandler() {
                @Override
                public void handlePitch(PitchDetectionResult res, AudioEvent e) {
                    final float pitchInHz = res.getPitch();
                    activity.runOnUiThread(new Runnable() {
                        @Override
                        public void run() {
                            pitchText.setText(pitchInHz + activity.getString(R.string.pitch_info));
                            processPitch(pitchInHz);
                        }
                    });
                }

                //map the detected frequency onto the notes A through G
                public void processPitch(float pitchInHz) {
                    if (pitchInHz >= 110 && pitchInHz < 123.47) {
                        //A
                        noteText.setText("A");
                    }
                    else if (pitchInHz >= 123.47 && pitchInHz < 130.81) {
                        //B
                        noteText.setText("B");
                    }
                    else if (pitchInHz >= 130.81 && pitchInHz < 146.83) {
                        //C
                        noteText.setText("C");
                    }
                    else if (pitchInHz >= 146.83 && pitchInHz < 164.81) {
                        //D
                        noteText.setText("D");
                    }
                    else if (pitchInHz >= 164.81 && pitchInHz <= 174.61) {
                        //E
                        noteText.setText("E");
                    }
                    else if (pitchInHz >= 174.61 && pitchInHz < 185) {
                        //F
                        noteText.setText("F");
                    }
                    else if (pitchInHz >= 185 && pitchInHz < 196) {
                        //G
                        noteText.setText("G");
                    }
                }
            };

            AudioProcessor pitchProcessor = new PitchProcessor(PitchProcessor.PitchEstimationAlgorithm.FFT_YIN, 22050, 1024, pitchDetectionHandler);
            dispatcher.addAudioProcessor(pitchProcessor);

            //the dispatcher's run() drives both processors, so it gets its own thread
            Thread audioThread = new Thread(dispatcher, "Audio Thread");
            audioThread.start();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public void releaseDispatcher(AudioDispatcher dispatcher)
    {
        if (dispatcher != null)
        {
            if (!dispatcher.isStopped())
                dispatcher.stop();
            dispatcher = null; //note: this only clears the local copy of the reference
        }
    }

    //I don't need these guys yet
    /*public void stopRecording()
    {
        releaseDispatcher();
    }

    @Override
    protected void onStop() {
        super.onStop();
        releaseDispatcher();
    }*/
}
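A side note on the plumbing, in case it's related: since Java passes object references by value, the dispatcher = null in releaseDispatcher only clears the method's local copy, and the new AudioDispatcher created inside getPitchFromFile never makes it back to the audioDispatcher field in the activity. If that matters, I assume the fix is to return the dispatcher instead; a sketch (startDispatch is a hypothetical name):

// Sketch: return the dispatcher so the caller can hold on to it; assigning
// null to a parameter only clears the local copy of the reference.
public AudioDispatcher startDispatch(FileDescriptor fd, TarsosDSPAudioFormat format) throws IOException {
    AudioDispatcher d = new AudioDispatcher(
            new UniversalAudioInputStream(new FileInputStream(fd), format), 1024, 0);
    new Thread(d, "Audio Thread").start();
    return d;
}

// in MainActivity.ProcessTone():
// audioDispatcher = audioMethods.startDispatch(afd.getFileDescriptor(), tarsosDSPAudioFormat);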