I'm writing a small application that captures audio from the Android MIC, performs an FFT on the input, and then draws a chart for the user. I'm trying to record and plot at the same time (obviously with a slight delay from recording to plotting). I start two threads, one for reading and one for processing, but when I process I run into a synchronization problem and seem to receive only zeros (or nothing at all). Any suggestions would be appreciated. :)
public class Plotter extends Activity {
    /* plotting objects */
    private static GraphicalView mView;
    private LineGraph line = new LineGraph();
    private boolean recordAudio = true;       // record?
    private AudioRecord mRecorder = null;     // audio object
    private Menu mMenu;                       // app menu
    private static final String LOG_TAG = "Frequency Plotter"; // debug tag
    private Mfft mfft = null;                 // FFT class
    private static final int BUF_SIZE = 8192; // amount to read in
    private Thread listener = null;
    private Thread processor = null;
    Stack<Float> items = new Stack<Float>();
    /* colors for line */
    private int[] colors = {Color.BLUE, Color.CYAN, Color.DKGRAY, Color.GRAY,
            Color.GREEN, Color.LTGRAY, Color.MAGENTA, Color.RED, Color.WHITE, Color.YELLOW};

    private void processAudio() {
        ArrayList<Double> real = new ArrayList<Double>();
        try {
            Random randomGenerator = new Random();
            float[] in = new float[2048];
            Arrays.fill(in, 1);
            while (true) {
                synchronized (items) {
                    while (items.size() < 2048)
                        items.wait();
                    items.notifyAll();
                    for (int i = 0; i < 2048; i++) {
                        in[i] = items.pop();
                    }
                }
                double[] ret = mfft.fft(2048, 44100, in); // get FFT of data
                TimeSeries dataset = new TimeSeries((real.size() + 1) / 2048 + "");
                XYSeriesRenderer renderer = new XYSeriesRenderer(); // customized renderer
                // Customization time
                renderer.setColor(colors[randomGenerator.nextInt(10)]);
                renderer.setPointStyle(PointStyle.SQUARE);
                renderer.setFillPoints(true);
                line.addRenderer(renderer); // add custom renderer
                for (int i = 0; i < 2048; i++) {
                    real.add(ret[i]);
                    dataset.add(real.size() - 1, ret[i]); // Add it to our graph
                }
                line.addDataset(dataset); // add data to line
                mView.repaint();          // render lines
            }
        } catch (Exception e) {
            Log.e(LOG_TAG, e + " ");
        }
    }

    private void writeToBuffer(short[] in) {
        synchronized (items) {
            for (int i = 0; i < BUF_SIZE; i++) { // copy to convert to float
                items.push((float) in[i]);
            }
            items.notifyAll();
        }
    }

    private void listen() {
        final short[] in = new short[BUF_SIZE];
        mRecorder = new AudioRecord(
                MediaRecorder.AudioSource.MIC,  // source
                44100,                          // frequency (hertz)
                AudioFormat.CHANNEL_IN_MONO,    // channel
                AudioFormat.ENCODING_PCM_16BIT, // format
                BUF_SIZE                        // buffer size
        );
        mRecorder.startRecording();
        while (recordAudio) {
            try {
                /* read next part */
                mRecorder.read(in, 0, BUF_SIZE); // read from device
                writeToBuffer(in);
            } catch (Exception t) {
                /* something went horribly wrong!!! */
                recordAudio = false;
                Log.e(LOG_TAG, "Failure reading" + t.getMessage());
            }
        }
    }

    private void startRecording() {
        /* create a new thread that will run the recording in the background */
        listener = new Thread(
                new Runnable() {
                    public void run() {
                        listen();
                    }
                });
        listener.start();
        /* small delay to let the producer get ahead */
        try {
            Thread.sleep(100);
        } catch (InterruptedException e1) {
            e1.printStackTrace();
        }
        /* create a thread to process the audio */
        processor = new Thread(
                new Runnable() {
                    public void run() {
                        processAudio();
                    }
                });
        processor.start();
    }

    private void stopRecording() {
        recordAudio = false;
        mRecorder.stop();
        mRecorder.release();
        mRecorder = null;
    }

    /** clear the current chart */
    private void clearChart() {
        line = new LineGraph();
        this.onStart();
    }

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
    }

    @Override
    protected void onStart() {
        super.onStart();
        /* instantiate */
        mfft = new Mfft();          // instance of the FFT class
        mView = line.getView(this); // get the chart view
        /* new horizontal layout */
        LinearLayout ll = new LinearLayout(this);
        ll.setOrientation(LinearLayout.HORIZONTAL);
        ll.addView(mView);  // add chart to layout
        setContentView(ll); // set layout
    }

    @Override
    public boolean onOptionsItemSelected(MenuItem item) {
        // Handle item selection
        switch (item.getItemId()) {
            case R.id.record:
                startRecording();
                item.setEnabled(false);                     // disable start
                mMenu.findItem(R.id.stop).setEnabled(true); // enable stop
                return true;
            case R.id.stop:
                stopRecording();
                item.setEnabled(false);                      // disable stop
                mMenu.findItem(R.id.clear).setEnabled(true); // enable clear
                return true;
            case R.id.clear:
                clearChart();                                 // clear chart
                item.setEnabled(false);                       // disable clear
                mMenu.findItem(R.id.record).setEnabled(true); // enable record
                return true;
            default:
                return super.onOptionsItemSelected(item);
        }
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        mMenu = menu;
        MenuInflater inflater = getMenuInflater();
        inflater.inflate(R.menu.my_menu, menu);
        return true;
    }
}
Edit: added the full class definition.
Answer 0 (score: 2)
Unfortunately, the author has stopped developing that project, but the source tarball is still available online. Pay particular attention to org.hermit.android.io.AudioReader.java. You read the audio and pass it through a Stack object, whereas that author uses a short[] array. (That by itself doesn't seem like it should be the source of your problem, though...)
http://code.google.com/p/moonblink/downloads/detail?name=SourceTarball.zip
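One concrete hazard of the Stack in the posted code: a Stack is LIFO, so the consumer pops the newest samples first, scrambling the sample order across overlapping read() calls, and the hand-rolled wait()/notifyAll() adds bookkeeping that is easy to get wrong. Below is a minimal runnable sketch (plain JDK, no Android) of the same producer/consumer handoff using a FIFO BlockingQueue instead; the class name BufferDemo and the fake sample values are illustrative only, standing in for AudioRecord.read() and processAudio().

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class BufferDemo {
    // A FIFO queue preserves sample order and blocks instead of spinning;
    // a Stack (LIFO) would hand the consumer the *newest* samples first,
    // so a popped 2048-sample frame is reversed and mixes read() calls.
    static final BlockingQueue<Float> queue = new ArrayBlockingQueue<Float>(16384);

    public static void main(String[] args) throws InterruptedException {
        // Producer: stand-in for the AudioRecord.read() loop.
        Thread producer = new Thread(new Runnable() {
            public void run() {
                try {
                    for (short s = 0; s < 4096; s++) {
                        queue.put((float) s); // blocks if the queue is full
                    }
                } catch (InterruptedException ignored) { }
            }
        });
        producer.start();

        // Consumer: stand-in for processAudio(); take() blocks until data
        // arrives, so no wait()/notifyAll() bookkeeping is needed.
        float[] frame = new float[2048];
        for (int i = 0; i < 2048; i++) {
            frame[i] = queue.take();
        }
        producer.join();
        System.out.println(frame[0] + " " + frame[2047]); // samples arrive in order
    }
}
```

With the Stack version, the first popped value would be whatever was pushed last; here the frame comes out in recording order.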
Your audio buffer (BUF_SIZE = 8192) feels a bit small. How does it compare to AudioRecord.getMinBufferSize()? I used 2x minBufferSize and did no computation on it (read/write only).
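The "2x the minimum" rule of thumb can be sketched as a small helper. On a device the minimum would come from AudioRecord.getMinBufferSize(44100, CHANNEL_IN_MONO, ENCODING_PCM_16BIT); in this off-device sketch it is passed in as a plain int, and the name chooseBufferSize plus the power-of-two rounding (so 2048-sample FFT frames divide the buffer evenly) are my own assumptions, not anything from the original code.

```java
public class BufferSizing {
    // Pick a recording buffer size: at least twice the platform minimum,
    // rounded up to the next power of two so whole 2048-sample FFT frames
    // divide it evenly. minBufferSize stands in for the value that
    // AudioRecord.getMinBufferSize(...) would report on a real device.
    static int chooseBufferSize(int minBufferSize) {
        int size = 2 * minBufferSize;
        int rounded = 1;
        while (rounded < size) {
            rounded <<= 1; // double until we cover the requested size
        }
        return rounded;
    }

    public static void main(String[] args) {
        // e.g. a device reporting a 3584-byte minimum gets an 8192-byte buffer
        System.out.println(chooseBufferSize(3584));
    }
}
```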
The Handler idea: I'm still going through your code, and it isn't clear how your threads communicate. Your problem sounds like the threads need a proper way to talk to each other, i.e. a Handler.
Here are the links I've been reviewing to learn how to use a Handler to communicate effectively between threads:
Overview of threads and Handlers (without a Looper), with code samples (com.indy.testing.TestMain.java, MyThread.java):
http://indyvision.net/2010/02/android-threads-tutorial-part-3/
Overview of threads, Handlers, and Loopers: http://techtej.blogspot.com/2011/02/android-passing-data-between-main.html
Threads with two-way communication, with code sample (sample.thread.messaging.ThreadMessaging.java): http://codinghard.wordpress.com/2009/05/16/android-thread-messaging/
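To make the Handler/Looper idea from those links concrete without the Android framework, here is a plain-JDK analogue (the class name MiniLooper and the STOP sentinel are my own inventions): one thread drains a queue of posted Runnables, the way a real android.os.Handler delivers messages to the thread that owns its Looper.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class MiniLooper {
    // Plain-Java analogue of Android's Looper/Handler pair: one thread drains
    // a queue of messages posted by other threads, so all handling happens on
    // a single, known thread (in Android, typically the UI thread).
    private final BlockingQueue<Runnable> queue = new LinkedBlockingQueue<Runnable>();

    private static final Runnable STOP = new Runnable() { public void run() { } };

    void post(Runnable message) { // analogous to Handler.post()
        queue.add(message);
    }

    void loop() throws InterruptedException { // analogous to Looper.loop()
        while (true) {
            Runnable message = queue.take(); // blocks until a message arrives
            if (message == STOP) return;
            message.run();
        }
    }

    public static void main(String[] args) throws InterruptedException {
        final MiniLooper looper = new MiniLooper();
        // A worker thread (like the audio reader) posts results to the loop thread.
        Thread worker = new Thread(new Runnable() {
            public void run() {
                looper.post(new Runnable() {
                    public void run() { System.out.println("frame processed"); }
                });
                looper.post(STOP); // tell the loop to shut down
            }
        });
        worker.start();
        looper.loop(); // main thread handles the posted messages
        worker.join();
    }
}
```

The payoff for the original code would be the same as with a real Handler: the plotting work always runs on one designated thread, instead of two threads contending over a shared Stack.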