I'm trying to add a waveform visualization to my already-working app, which records a sound and then plays it back. The record/playback code works fine, but when I add the com.pheelicks.app visualizer, record my own voice and then try to play it back, the app crashes as soon as I tap my play button.
In his visualizer app he ships his own mp3 file and plays it through MediaPlayer. Since I also play my recorded sound through MediaPlayer, I figured it would be easy to adapt my code to include his visualizer part. But something is going wrong.
The strange thing is that his code runs perfectly on my Android phone (a Samsung Galaxy 5) and I can see the visualizer for his music.
I tried to research similar problems but couldn't find an answer. I looked at this and this, two similar errors, but found no solution there.
The error seems to come from his code, in the bitmap section of VisualizerView.java, where getWidth() and getHeight() are read from the Canvas in the onDraw() method. I also tried logging those values, but they never showed up in my LogCat. I'm using Android Studio. Thanks for your help!
Edit:
Ah, I see my Log statement wasn't doing anything because it was placed after the crash. I moved it before the crash point, which is here:
if (mCanvasBitmap == null) {
mCanvasBitmap = Bitmap.createBitmap(canvas.getWidth(),
canvas.getHeight(), Config.ARGB_8888);
}
This is what I got from LogCat: Value of getWidth is 924 and value of getHeight is 0. So the question is: why is the height zero?
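(As a side note, a guard like the one below would at least avoid the crash while I track down the real cause; this is only a stopgap of my own, not part of the library code:)
if (mCanvasBitmap == null && canvas.getWidth() > 0 && canvas.getHeight() > 0) {
    mCanvasBitmap = Bitmap.createBitmap(canvas.getWidth(),
            canvas.getHeight(), Config.ARGB_8888);
}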
VisualizerView.java
package org.azurespot.waveform;
/**
* Created by mizu on 2/2/15.
*/
import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.Bitmap.Config;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Matrix;
import android.graphics.Paint;
import android.graphics.PorterDuff.Mode;
import android.graphics.PorterDuffXfermode;
import android.graphics.Rect;
import android.media.MediaPlayer;
import android.media.audiofx.Visualizer;
import android.util.AttributeSet;
import android.view.View;
import java.util.HashSet;
import java.util.Set;
import static android.util.Log.d;
/**
* A class that draws visualizations of data received from a
* {@link Visualizer.OnDataCaptureListener#onWaveFormDataCapture } and
* {@link Visualizer.OnDataCaptureListener#onFftDataCapture }
*/
public class VisualizerView extends View {
private static final String TAG = "VisualizerView";
private byte[] mBytes;
private byte[] mFFTBytes;
private Rect mRect = new Rect();
private Visualizer mVisualizer;
private Set<Renderer> mRenderers;
private Paint mFlashPaint = new Paint();
private Paint mFadePaint = new Paint();
public VisualizerView(Context context, AttributeSet attrs, int defStyle) {
super(context, attrs);
init();
}
public VisualizerView(Context context, AttributeSet attrs) {
this(context, attrs, 0);
}
public VisualizerView(Context context) {
this(context, null, 0);
}
private void init() {
mBytes = null;
mFFTBytes = null;
mFlashPaint.setColor(Color.argb(122, 255, 255, 255));
mFadePaint.setColor(Color.argb(238, 255, 255, 255)); // Adjust alpha to change how quickly the image fades
mFadePaint.setXfermode(new PorterDuffXfermode(Mode.MULTIPLY));
mRenderers = new HashSet<Renderer>();
}
/**
* Links the visualizer to a player
*
* @param player - MediaPlayer instance to link to
*/
public void link(MediaPlayer player) {
if (player == null) {
throw new NullPointerException("Cannot link to null MediaPlayer");
}
// Create the Visualizer object and attach it to our media player.
mVisualizer = new Visualizer(player.getAudioSessionId());
mVisualizer.setEnabled(false);
mVisualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]);
// Pass through Visualizer data to VisualizerView
Visualizer.OnDataCaptureListener captureListener = new Visualizer.OnDataCaptureListener() {
@Override
public void onWaveFormDataCapture(Visualizer visualizer, byte[] bytes,
int samplingRate) {
updateVisualizer(bytes);
}
@Override
public void onFftDataCapture(Visualizer visualizer, byte[] bytes,
int samplingRate) {
updateVisualizerFFT(bytes);
}
};
mVisualizer.setDataCaptureListener(captureListener,
Visualizer.getMaxCaptureRate() / 2, true, true);
// Enable the Visualizer and disable it when we're done with the stream
mVisualizer.setEnabled(true);
player.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
@Override
public void onCompletion(MediaPlayer mediaPlayer) {
mVisualizer.setEnabled(false);
}
});
}
public void addRenderer(Renderer renderer) {
if (renderer != null) {
mRenderers.add(renderer);
}
}
public void clearRenderers() {
mRenderers.clear();
}
/**
* Call to release the resources used by VisualizerView. Like with the
* MediaPlayer it is good practice to call this method
*/
public void release() {
mVisualizer.release();
}
/**
* Pass data to the visualizer. Typically this will be obtained from the
* Android Visualizer.OnDataCaptureListener call back. See
* {@link Visualizer.OnDataCaptureListener#onWaveFormDataCapture }
*
* @param bytes
*/
public void updateVisualizer(byte[] bytes) {
mBytes = bytes;
invalidate();
}
/**
* Pass FFT data to the visualizer. Typically this will be obtained from the
* Android Visualizer.OnDataCaptureListener call back. See
* {@link Visualizer.OnDataCaptureListener#onFftDataCapture }
*
* @param bytes
*/
public void updateVisualizerFFT(byte[] bytes) {
mFFTBytes = bytes;
invalidate();
}
boolean mFlash = false;
/**
* Call this to make the visualizer flash. Useful for flashing at the start
* of a song/loop etc...
*/
public void flash() {
mFlash = true;
invalidate();
}
Bitmap mCanvasBitmap;
Canvas mCanvas;
@Override
protected void onDraw(Canvas canvas) {
super.onDraw(canvas);
// Create canvas once we're ready to draw
mRect.set(0, 0, getWidth(), getHeight());
if (mCanvasBitmap == null) {
mCanvasBitmap = Bitmap.createBitmap(canvas.getWidth(),
canvas.getHeight(), Config.ARGB_8888);
}
d("DEBUG", "Value of getWidth is " + canvas.getWidth()
+ " and value of getHeight is " + canvas.getHeight());
if (mCanvas == null) {
mCanvas = new Canvas(mCanvasBitmap);
}
if (mBytes != null) {
// Render all audio renderers
AudioData audioData = new AudioData(mBytes);
for (Renderer r : mRenderers) {
r.render(mCanvas, audioData, mRect);
}
}
if (mFFTBytes != null) {
// Render all FFT renderers
FFTData fftData = new FFTData(mFFTBytes);
for (Renderer r : mRenderers) {
r.render(mCanvas, fftData, mRect);
}
}
// Fade out old contents
mCanvas.drawPaint(mFadePaint);
if (mFlash) {
mFlash = false;
mCanvas.drawPaint(mFlashPaint);
}
canvas.drawBitmap(mCanvasBitmap, new Matrix(), null);
}
}
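For context, the view above is meant to be wired up from the activity roughly along these lines (a sketch with placeholder names such as mPlayer and R.raw.recorded_sound, not my exact code; the Visualizer API also needs android.permission.RECORD_AUDIO in the manifest):
VisualizerView visualizerView = (VisualizerView) findViewById(R.id.visualizerView);
MediaPlayer mPlayer = MediaPlayer.create(this, R.raw.recorded_sound); // placeholder audio source
visualizerView.link(mPlayer);       // attaches a Visualizer to the player's audio session
// visualizerView.addRenderer(...); // a Renderer from the same library actually draws the waveform
mPlayer.start();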
LogCat:
02-02 22:31:16.699 19125-19125/? E/AndroidRuntime﹕ FATAL EXCEPTION: main
Process: org.azurespot, PID: 19125
java.lang.IllegalArgumentException: width and height must be > 0
at android.graphics.Bitmap.createBitmap(Bitmap.java:922)
at android.graphics.Bitmap.createBitmap(Bitmap.java:901)
at android.graphics.Bitmap.createBitmap(Bitmap.java:868)
at org.azurespot.waveform.VisualizerView.onDraw(VisualizerView.java:176)
at android.view.View.draw(View.java:15393)
at android.view.View.getDisplayList(View.java:14287)
at android.view.View.getDisplayList(View.java:14329)
at android.view.ViewGroup.dispatchGetDisplayList(ViewGroup.java:3284)
at android.view.View.getDisplayList(View.java:14224)
at android.view.View.getDisplayList(View.java:14329)
at android.view.ViewGroup.dispatchGetDisplayList(ViewGroup.java:3284)
at android.view.View.getDisplayList(View.java:14224)
at android.view.View.getDisplayList(View.java:14329)
at android.view.ViewGroup.dispatchGetDisplayList(ViewGroup.java:3284)
at android.view.View.getDisplayList(View.java:14224)
at android.view.View.getDisplayList(View.java:14329)
at android.view.ViewGroup.dispatchGetDisplayList(ViewGroup.java:3284)
at android.view.View.getDisplayList(View.java:14224)
at android.view.View.getDisplayList(View.java:14329)
at android.view.ViewGroup.dispatchGetDisplayList(ViewGroup.java:3284)
at android.view.View.getDisplayList(View.java:14224)
at android.view.View.getDisplayList(View.java:14329)
at android.view.HardwareRenderer$GlRenderer.buildDisplayList(HardwareRenderer.java:1576)
at android.view.HardwareRenderer$GlRenderer.draw(HardwareRenderer.java:1455)
at android.view.ViewRootImpl.draw(ViewRootImpl.java:2754)
at android.view.ViewRootImpl.performDraw(ViewRootImpl.java:2620)
at android.view.ViewRootImpl.performTraversals(ViewRootImpl.java:2188)
at android.view.ViewRootImpl.doTraversal(ViewRootImpl.java:1249)
at android.view.ViewRootImpl$TraversalRunnable.run(ViewRootImpl.java:6585)
at android.view.Choreographer$CallbackRecord.run(Choreographer.java:803)
at android.view.Choreographer.doCallbacks(Choreographer.java:603)
at android.view.Choreographer.doFrame(Choreographer.java:573)
at android.view.Choreographer$FrameDisplayEventReceiver.run(Choreographer.java:789)
at android.os.Handler.handleCallback(Handler.java:733)
at android.os.Handler.dispatchMessage(Handler.java:95)
at android.os.Looper.loop(Looper.java:136)
at android.app.ActivityThread.main(ActivityThread.java:5579)
at java.lang.reflect.Method.invokeNative(Native Method)
at java.lang.reflect.Method.invoke(Method.java:515)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1268)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1084)
at dalvik.system.NativeStart.main(Native Method)
Answer (score 2):
Turns out the FrameLayout in my xml that holds the VisualizerView widget was the culprit! The code came from com.pheelicks.app, so I didn't think anything of it, because I had seen FrameLayout given this kind of 0dp size before (in fragments). But on a whim I decided to make it bigger, and sure enough the visualizer showed up. It's amazing how it's always the little things. Leave no stone unturned! Below is the widget from my xml; I changed the height to 200dp (from 0dp) and that fixed it.
activity_make_sounds.xml
<FrameLayout
android:layout_width="fill_parent"
android:layout_height="200dp"
android:layout_margin="10dp"
android:background="#000" >
<org.azurespot.waveform.VisualizerView
android:id="@+id/visualizerView"
android:layout_width="fill_parent"
android:layout_height="fill_parent" >
</org.azurespot.waveform.VisualizerView>
</FrameLayout>
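For anyone hitting the same crash with a view that legitimately starts out at zero height, another option (not what I used here) would be to allocate the bitmap in onSizeChanged() instead of onDraw(), something like:
@Override
protected void onSizeChanged(int w, int h, int oldw, int oldh) {
    super.onSizeChanged(w, h, oldw, oldh);
    // only allocate once the view has been laid out with a real, non-zero size
    if (w > 0 && h > 0) {
        mCanvasBitmap = Bitmap.createBitmap(w, h, Config.ARGB_8888);
        mCanvas = new Canvas(mCanvasBitmap);
    }
}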