Calling an Intent from a Runnable crashes the app in Android Studio

Date: 2018-08-08 10:54:52

Tags: android multithreading android-intent crash

I am new to Android. I am currently trying to start an Intent from inside a Runnable, but the app keeps crashing. I tried wrapping the call in a Handler (a rough sketch of that attempt follows the snippet below), but nothing worked.

Edit: added a logcat screenshot. I noticed the error shown is "This is not a beta user build". Screenshot link: https://drive.google.com/open?id=1XEyWnjiXm7X6gzx84wTXx6R_vJN9srWl

runInBackground(
                new Runnable() {
                    @Override
                    public void run() {
                        if (condition is met) {
                            // go to other activity
                            Intent intent = new Intent(getBaseContext(), CameraActivity.class);
                            startActivity(intent);
                        }
                    }
                });
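
For reference, this is roughly what the Handler attempt looked like (a minimal sketch from memory, not my exact code; mainHandler and conditionIsMet() are only placeholders):

// Sketch of the Handler attempt: post the activity launch to the main thread
// instead of calling startActivity() directly from the background Runnable.
final Handler mainHandler = new Handler(getMainLooper());
runInBackground(
        new Runnable() {
            @Override
            public void run() {
                if (conditionIsMet()) { // placeholder for my real check
                    mainHandler.post(new Runnable() {
                        @Override
                        public void run() {
                            // go to the other activity from the UI thread
                            Intent intent = new Intent(getBaseContext(), CameraActivity.class);
                            startActivity(intent);
                        }
                    });
                }
            }
        });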

The snippet below is my full code; it is actually taken from the TensorFlow Object Detection demo.

@Override
    protected void processImage() {
        final Button admin_see_questions = (Button) findViewById(R.id.button);
        final KonfettiView konfettiView = (KonfettiView) findViewById(R.id.konfettiView);

        ++timestamp;
        final long currTimestamp = timestamp;
        byte[] originalLuminance = getLuminance();
        tracker.onFrame(
                previewWidth,
                previewHeight,
                getLuminanceStride(),
                sensorOrientation,
                originalLuminance,
                timestamp);
        trackingOverlay.postInvalidate();

        // No mutex needed as this method is not reentrant.
        if (computingDetection) {
            readyForNextImage();
            return;
        }
        computingDetection = true;
        LOGGER.i("Preparing image " + currTimestamp + " for detection in bg thread.");

        rgbFrameBitmap.setPixels(getRgbBytes(), 0, previewWidth, 0, 0, previewWidth, previewHeight);

        if (luminanceCopy == null) {
            luminanceCopy = new byte[originalLuminance.length];
        }
        System.arraycopy(originalLuminance, 0, luminanceCopy, 0, originalLuminance.length);
        readyForNextImage();

        final Canvas canvas = new Canvas(croppedBitmap);
        canvas.drawBitmap(rgbFrameBitmap, frameToCropTransform, null);
        // For examining the actual TF input.
        if (SAVE_PREVIEW_BITMAP) {
            ImageUtils.saveBitmap(croppedBitmap);
        }

        final Handler mHandler = new Handler(getMainLooper());

        final Button bt1 = (Button) findViewById(R.id.button2);
        runInBackground(
                new Runnable() {
                    @Override
                    public void run() {
                        LOGGER.i("Running detection on image " + currTimestamp);
                        final long startTime = SystemClock.uptimeMillis();
                        final List<Classifier.Recognition> results = detector.recognizeImage(croppedBitmap);
                        lastProcessingTimeMs = SystemClock.uptimeMillis() - startTime;

                        cropCopyBitmap = Bitmap.createBitmap(croppedBitmap);
                        final Canvas canvas = new Canvas(cropCopyBitmap);
                        final Paint paint = new Paint();
                        paint.setColor(Color.RED);
                        paint.setStyle(Style.STROKE);
                        paint.setStrokeWidth(2.0f);

                        float minimumConfidence = MINIMUM_CONFIDENCE_TF_OD_API;
                        switch (MODE) {
                            case TF_OD_API:
                                minimumConfidence = MINIMUM_CONFIDENCE_TF_OD_API;
                                break;
                            case MULTIBOX:
                                minimumConfidence = MINIMUM_CONFIDENCE_MULTIBOX;
                                break;
                            case YOLO:
                                minimumConfidence = MINIMUM_CONFIDENCE_YOLO;
                                break;
                        }



                        final List<Classifier.Recognition> mappedRecognitions =
                                new LinkedList<Classifier.Recognition>();

                            for (final Classifier.Recognition result : results) {
                                final RectF location = result.getLocation();
                                if (location != null && result.getConfidence() >= minimumConfidence) {
                                    canvas.drawRect(location, paint);

                                    //Toast.makeText(DetectorActivity.this, result.getTitle(), Toast.LENGTH_LONG).show();

                                    cropToFrameTransform.mapRect(location);
                                    result.setLocation(location);
                                    mappedRecognitions.add(result);
                                    //found = true;
                                    Log.i("err","Transiting to other intent");
                                    Intent intent = new Intent(DetectorActivity.this, display.class);
                                    startActivity(intent);
                                   
                                }
                            }


                        tracker.trackResults(mappedRecognitions, luminanceCopy, currTimestamp);
                        trackingOverlay.postInvalidate();

                        requestRender();
                        computingDetection = false;

                    }
                });
}
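
For completeness, here is an illustrative one-shot guard around the navigation, so startActivity() fires at most once even when several detections pass the threshold (maybeOpenDisplay() is a made-up helper name, and the found field is only sketched here, mirroring the commented-out //found = true; line above):

    // Illustrative sketch only: a one-shot guard so the detection loop
    // navigates away at most once, no matter how many results match.
    private volatile boolean found = false;

    private void maybeOpenDisplay() {
        if (found) {
            return; // already navigated once, do nothing
        }
        found = true;
        Intent intent = new Intent(DetectorActivity.this, display.class);
        startActivity(intent);
    }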

1 Answer:

Answer 0 (score: 0)

Try replacing getBaseContext() with YourActivity.this.

if (condition is met) {
    // go to other activity
    Intent intent = new Intent(YourActivity.this, CameraActivity.class);
    startActivity(intent);
}

If that does not work, please paste your Logcat here.
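
If the context change alone does not help, another thing you could try (just a sketch, not a guaranteed fix) is handing the navigation off to the UI thread instead of calling startActivity() straight from the background Runnable, for example with runOnUiThread():

if (condition is met) {
    // run the navigation on the main/UI thread
    runOnUiThread(new Runnable() {
        @Override
        public void run() {
            Intent intent = new Intent(YourActivity.this, CameraActivity.class);
            startActivity(intent);
        }
    });
}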