The MediaRecorder class in Android is used to record audio from the microphone. Can anyone tell me how to record the audio that is actually being played on the headset? It sounds very technical, but that is what I am exploring. I was told that the "Visualizer" class can record system audio, but according to the documentation it can only be used to visualize audio; we cannot attach a recorder interface to it.
For details, see: http://developer.android.com/reference/android/media/audiofx/Visualizer.html
Do any of the following MediaRecorder.AudioSource constants serve this purpose? (A usage sketch follows the list.)
int CAMCORDER
int DEFAULT
int MIC
int REMOTE_SUBMIX
int VOICE_CALL
int VOICE_COMMUNICATION
int VOICE_DOWNLINK
int VOICE_RECOGNITION
int VOICE_UPLINK
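For reference, these constants live in MediaRecorder.AudioSource and are consumed by AudioRecord, not by the Visualizer. Below is a minimal capture sketch under stated assumptions: it uses MIC (any constant above could be substituted), it needs the RECORD_AUDIO permission, and note that REMOTE_SUBMIX (added in API 19) is guarded by the system-only CAPTURE_AUDIO_OUTPUT permission, so a normal app cannot use it to capture playback.

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;

public class CaptureSketch {
    // Reads one buffer of raw PCM from the chosen audio source.
    static void captureOnce() {
        int sampleRate = 44100;
        int bufferSize = AudioRecord.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord recorder = new AudioRecord(
                MediaRecorder.AudioSource.MIC, // swap in another constant from the list above
                sampleRate, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, bufferSize);
        recorder.startRecording();
        byte[] chunk = new byte[bufferSize];
        int read = recorder.read(chunk, 0, chunk.length); // blocking read of raw PCM
        recorder.stop();
        recorder.release();
    }
}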
Has anyone worked with OpenSL ES? I have heard it can also serve this purpose.
If you come across any Android API or third-party API for this, please feel free to share the information. A few blogs also say this can be done at the NDK level. If anyone has worked with that or has a code sample, please let me know.
Thanks
Sample code shown for Michael:
import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.Bitmap.Config;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Matrix;
import android.graphics.Paint;
import android.graphics.PorterDuff.Mode;
import android.graphics.PorterDuffXfermode;
import android.graphics.Rect;
import android.media.MediaPlayer;
import android.media.audiofx.Visualizer;
import android.os.Environment;
import android.util.AttributeSet;
import android.view.View;

import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.HashSet;
import java.util.Set;

// Renderer, AudioData and FFTData come from the android-visualizer sample
// project that this view is based on.
public class VisualizerView extends View {
private static final String TAG = "VisualizerView";
private byte[] mBytes;
private byte[] mFFTBytes;
private Rect mRect = new Rect();
private Visualizer mVisualizer;
private Set<Renderer> mRenderers;
private Paint mFlashPaint = new Paint();
private Paint mFadePaint = new Paint();
private ByteArrayOutputStream buffer = new ByteArrayOutputStream(); // accumulates the captured waveform bytes
public VisualizerView(Context context, AttributeSet attrs, int defStyle)
{
super(context, attrs, defStyle);
init();
}
public VisualizerView(Context context, AttributeSet attrs)
{
this(context, attrs, 0);
}
public VisualizerView(Context context)
{
this(context, null, 0);
}
private void init() {
mBytes = null;
mFFTBytes = null;
mFlashPaint.setColor(Color.argb(122, 255, 255, 255));
mFadePaint.setColor(Color.argb(238, 255, 255, 255)); // Adjust alpha to change how quickly the image fades
mFadePaint.setXfermode(new PorterDuffXfermode(Mode.MULTIPLY));
mRenderers = new HashSet<Renderer>();
}
/**
* Links the visualizer to a player
* @param player - MediaPlayer instance to link to
*/
public void link(MediaPlayer player)
{
if(player == null)
{
throw new NullPointerException("Cannot link to null MediaPlayer");
}
// Create the Visualizer object and attach it to our media player.
mVisualizer = new Visualizer(player.getAudioSessionId());
mVisualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]);
// Pass through Visualizer data to VisualizerView
Visualizer.OnDataCaptureListener captureListener = new Visualizer.OnDataCaptureListener()
{
@Override
public void onWaveFormDataCapture(Visualizer visualizer, byte[] bytes,
int samplingRate)
{
updateVisualizer(bytes);
//Record
if (bytes.length > 0)
    buffer.write(bytes, 0, bytes.length);
//Record ends
}
@Override
public void onFftDataCapture(Visualizer visualizer, byte[] bytes,
int samplingRate)
{
updateVisualizerFFT(bytes);
}
};
mVisualizer.setDataCaptureListener(captureListener,
Visualizer.getMaxCaptureRate() / 2, true, true);
// Enable the Visualizer now and disable it when we're done with the stream
mVisualizer.setEnabled(true);
player.setOnCompletionListener(new MediaPlayer.OnCompletionListener()
{
@Override
public void onCompletion(MediaPlayer mediaPlayer)
{
mVisualizer.setEnabled(false);
//Save File
try {
buffer.flush();
} catch (IOException e) {
e.printStackTrace();
}
mBytes = buffer.toByteArray();
try {
buffer.close();
} catch (IOException e) {
e.printStackTrace();
}
mVisualizer.release();
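// Note: mBytes holds raw unsigned 8-bit mono PCM; the file written below
// gets no WAV header (see the header-writing sketch after this class).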
File file = new File(Environment.getExternalStorageDirectory(), "music1.wav");
FileOutputStream fos;
try {
fos = new FileOutputStream(file);
fos.write(mBytes);
fos.flush();
fos.close();
} catch (FileNotFoundException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
//Save File ends
}
});
}
public void addRenderer(Renderer renderer)
{
if(renderer != null)
{
mRenderers.add(renderer);
}
}
public void clearRenderers()
{
mRenderers.clear();
}
/**
* Call to release the resources used by VisualizerView. Like with the
* MediaPlayer it is good practice to call this method when you are done with it.
*/
public void release()
{
mVisualizer.release();
}
/**
* Pass data to the visualizer. Typically this will be obtained from the
* Android Visualizer.OnDataCaptureListener call back. See
* {@link Visualizer.OnDataCaptureListener#onWaveFormDataCapture }
* @param bytes
*/
public void updateVisualizer(byte[] bytes) {
mBytes = bytes;
invalidate();
}
/**
* Pass FFT data to the visualizer. Typically this will be obtained from the
* Android Visualizer.OnDataCaptureListener call back. See
* {@link Visualizer.OnDataCaptureListener#onFftDataCapture }
* @param bytes
*/
public void updateVisualizerFFT(byte[] bytes) {
mFFTBytes = bytes;
invalidate();
}
boolean mFlash = false;
/**
* Call this to make the visualizer flash. Useful for flashing at the start
* of a song/loop etc...
*/
public void flash() {
mFlash = true;
invalidate();
}
Bitmap mCanvasBitmap;
Canvas mCanvas;
@Override
protected void onDraw(Canvas canvas) {
super.onDraw(canvas);
// Create canvas once we're ready to draw
mRect.set(0, 0, getWidth(), getHeight());
if(mCanvasBitmap == null)
{
mCanvasBitmap = Bitmap.createBitmap(canvas.getWidth(), canvas.getHeight(), Config.ARGB_8888);
}
if(mCanvas == null)
{
mCanvas = new Canvas(mCanvasBitmap);
}
if (mBytes != null) {
// Render all audio renderers
AudioData audioData = new AudioData(mBytes);
for(Renderer r : mRenderers)
{
r.render(mCanvas, audioData, mRect);
}
}
if (mFFTBytes != null) {
// Render all FFT renderers
FFTData fftData = new FFTData(mFFTBytes);
for(Renderer r : mRenderers)
{
r.render(mCanvas, fftData, mRect);
}
}
// Fade out old contents
mCanvas.drawPaint(mFadePaint);
if(mFlash)
{
mFlash = false;
mCanvas.drawPaint(mFlashPaint);
}
canvas.drawBitmap(mCanvasBitmap, new Matrix(), null);
}
}
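One caveat about the save step above: the Visualizer delivers raw unsigned 8-bit mono PCM, so writing those bytes straight into music1.wav produces a file with no WAV header, which most players will refuse to open. Below is a minimal sketch of a header writer; the WavWriter name is mine, it assumes 8-bit mono PCM, and the sample rate must match the capture (the rates the Visualizer reports are in milliHertz, so divide by 1000):

import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public final class WavWriter {
    // Prepends a standard 44-byte PCM WAV header to raw unsigned 8-bit mono samples.
    public static void writeWav(String path, byte[] pcm, int sampleRate) throws IOException {
        ByteBuffer h = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN);
        h.put("RIFF".getBytes());
        h.putInt(36 + pcm.length);   // size of the rest of the RIFF chunk
        h.put("WAVE".getBytes());
        h.put("fmt ".getBytes());
        h.putInt(16);                // fmt chunk size for plain PCM
        h.putShort((short) 1);       // audio format 1 = PCM
        h.putShort((short) 1);       // mono
        h.putInt(sampleRate);
        h.putInt(sampleRate);        // byte rate: 1 channel * 1 byte per sample
        h.putShort((short) 1);       // block align
        h.putShort((short) 8);       // bits per sample
        h.put("data".getBytes());
        h.putInt(pcm.length);        // data chunk size
        FileOutputStream fos = new FileOutputStream(path);
        try {
            fos.write(h.array());
            fos.write(pcm);
            fos.flush();
        } finally {
            fos.close();
        }
    }
}

With that in place, the onCompletion handler above could call WavWriter.writeWav(file.getAbsolutePath(), mBytes, 44100) instead of dumping mBytes directly (44100 standing in for the actual capture rate).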
Answer 0 (score: 2)
Can anyone tell me how to record the audio that is actually being played on the headset?
You can't, because there is no official support for it in the Android APIs. It does not matter whether you use the Java APIs or the native APIs included in the NDK. There may be device-specific hacks that work if you have root access, etc., but I am not going to cover those here. If you are interested, you can try a search and see what you come up with.
I was told the "Visualizer" class can record system audio, but according to the documentation it can only be used to visualize audio, and we cannot put a recorder interface there.
Visualizer has this method:

public int getWaveForm (byte[] waveform)

Returns a waveform capture of currently playing audio content. The capture consists of a number of consecutive 8-bit (unsigned) mono PCM samples equal to the capture size returned by getCaptureSize().
So you can use the Visualizer to record the currently playing audio. But as the description above mentions, you will only get low-quality audio data, because the purpose of this method is to supply audio data for visualization, not for general recording.
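To make that concrete, here is a minimal polling sketch under stated assumptions: session 0 attaches to the global output mix (which requires the RECORD_AUDIO permission), the dumpOneWaveformCapture name is mine, and each call yields at most getCaptureSize() bytes of unsigned 8-bit mono PCM, so continuous "recording" means polling repeatedly:

import android.media.audiofx.Visualizer;

public class WaveformPoller {
    static void dumpOneWaveformCapture() {
        Visualizer visualizer = new Visualizer(0); // audio session 0 = global output mix
        visualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]); // largest capture size
        visualizer.setEnabled(true);
        byte[] waveform = new byte[visualizer.getCaptureSize()];
        int status = visualizer.getWaveForm(waveform); // latest capture of the playing audio
        if (status == Visualizer.SUCCESS) {
            // consume waveform here; quality is visualization-grade, not recording-grade
        }
        visualizer.setEnabled(false);
        visualizer.release();
    }
}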