I have a long-running task in my app that takes a few minutes to complete. The task performs image matching with the ORB algorithm, comparing a query image against every image in the gallery and returning the similar images to a ListView. Since the task takes so long, I want to show a progress dialog by using an AsyncTask. The problem is that when I press the search button, the progress dialog appears for a long time and then the app crashes. So it seems the app crashes when the task finishes, instead of dismissing the progress dialog and showing the results in the ListView. (The code runs fine without the progress dialog and AsyncTask.) I also tried using a plain Thread and got the same problem. Any help would be appreciated. Thanks in advance.
Search button code:
searchButton.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View view) {
        BackgroundTaskSearch task = new BackgroundTaskSearch(RGBtoGrey.this);
        task.execute();
    }
});
AsyncTask code:
private class BackgroundTaskSearch extends AsyncTask<Void, Void, Void> {
    private ProgressDialog dialog;

    public BackgroundTaskSearch(RGBtoGrey activity) {
        dialog = new ProgressDialog(activity);
    }

    @Override
    protected void onPreExecute() {
        dialog.setMessage("Doing something, please wait.");
        dialog.setCanceledOnTouchOutside(false);
        dialog.show();
    }

    @Override
    protected void onPostExecute(Void result) {
        if (dialog.isShowing()) {
            dialog.dismiss();
            /* try
            {
                dialog.dismiss();
            }
            catch (Exception e)
            {
            }*/
        }
    }

    @Override
    protected Void doInBackground(Void... params) {
        search(); // Calling the long-running task's method
        return null;
    }
}
Logcat points to line (RGBtoGrey.java:1030), which is dialog.show(); in onPreExecute(), and line (RGBtoGrey.java:331), which is task.execute(); in the search button.
search() method code:
public void search() {
    Mat qmat = new Mat();
    Mat jsonmat = null;
    String q = qtag.getText().toString().trim();
    if (!searchType.equals("byImage") && (q.isEmpty() || q.length() == 0 || q.equals("") || q == null)) {
        Toast.makeText(getApplicationContext(), "Please insert image or tag", Toast.LENGTH_LONG).show();
    } else {
        if (!searchType.equals("byImage")) {
            DataBaseHandler db2 = new DataBaseHandler(getApplicationContext());
            List<image> list = db2.getImages(q);
            for (int i = 0; i < list.size(); i++) {
                imageList.add(list.get(i));
            }
            if (imageList.size() != 0 && imageList.size() > 1)
                t.setText(Integer.toString(imageList.size()) + " images found");
            if (imageList.size() != 0 && imageList.size() == 1)
                t.setText(Integer.toString(imageList.size()) + " image found");
            if (imageList.size() == 0)
                t.setText("No result");
            adapter.notifyDataSetChanged();
        }
        if (TYPE.equals("gallery")) {
            BitmapFactory.Options bmOptions = new BitmapFactory.Options();
            Bitmap qbitmap0 = BitmapFactory.decodeFile(picPath, bmOptions);
            Bitmap qbitmap = getRotated(qbitmap0, picPath);
            Mat qmatRGB = new Mat();
            Utils.bitmapToMat(qbitmap, qmatRGB);
            Imgproc.cvtColor(qmatRGB, qmat, Imgproc.COLOR_RGB2GRAY);
            org.opencv.core.Size s = new Size(3, 3);
            Imgproc.GaussianBlur(qmat, qmat, s, 2);
        }
        if (TYPE.equals("camera")) {
            Mat qmatRGB = new Mat();
            Utils.bitmapToMat(photo, qmatRGB);
            Imgproc.cvtColor(qmatRGB, qmat, Imgproc.COLOR_RGB2GRAY);
            org.opencv.core.Size s = new Size(3, 3);
            Imgproc.GaussianBlur(qmat, qmat, s, 2);
        }
        ArrayList<String> pathArray = getFilePaths();
        DataBaseHandler db = new DataBaseHandler(getApplicationContext());
        List<mat> matlist = db.getAllMats();
        FeatureDetector detector = FeatureDetector.create(FeatureDetector.ORB);
        MatOfKeyPoint keypoints1 = new MatOfKeyPoint();
        detector.detect(qmat, keypoints1);
        DescriptorExtractor extractor = DescriptorExtractor.create(DescriptorExtractor.ORB);
        Mat descriptors1 = new Mat();
        extractor.compute(qmat, keypoints1, descriptors1);
        for (int i = 0; i < pathArray.size(); i++) {
            BitmapFactory.Options bmOptions1 = new BitmapFactory.Options();
            Bitmap bitmap0 = BitmapFactory.decodeFile(pathArray.get(i).toString(), bmOptions1);
            Bitmap bitmap = getRotated(bitmap0, pathArray.get(i).toString());
            Mat mat = new Mat();
            Mat matRGB = new Mat();
            Utils.bitmapToMat(bitmap, matRGB);
            Imgproc.cvtColor(matRGB, mat, Imgproc.COLOR_RGB2GRAY);
            org.opencv.core.Size s2 = new Size(3, 3);
            Imgproc.GaussianBlur(mat, mat, s2, 2);
            mat m = matlist.get(i);
            String smat = m.getMat();
            String smatkey = m.getMatimage();
            Mat descriptors2 = matFromJson(smat);
            Mat keypoints2 = keypointsFromJson(smatkey);
            DescriptorMatcher matcher = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMING);
            MatOfDMatch matches = new MatOfDMatch();
            matcher.match(descriptors1, descriptors2, matches);
            List<DMatch> matchesList = matches.toList();
            List<DMatch> matches_final = new ArrayList<DMatch>();
            for (int j = 0; j < matchesList.size(); j++)
                if (matchesList.get(j).distance <= DIST_LIMIT) {
                    matches_final.add(matches.toList().get(j));
                }
            List<MatOfDMatch> matcheslis = new ArrayList<MatOfDMatch>();
            matcher.knnMatch(descriptors1, descriptors2, matcheslis, 2);
            ArrayList<KeyPoint> objectPoints = new ArrayList<KeyPoint>(), imagePoints = new ArrayList<KeyPoint>();
            for (MatOfDMatch match : matcheslis) {
                DMatch[] dmatches = match.toArray();
                if (dmatches.length == 2
                        && dmatches[0].distance < dmatches[1].distance * 0.75) {
                    imagePoints.add(keypoints1.toArray()[dmatches[0].queryIdx]);
                    objectPoints.add(((MatOfKeyPoint) keypoints2).toArray()[dmatches[0].trainIdx]);
                }
            }
            float ratio = ((float) objectPoints.size())
                    / ((float) keypoints2.size().width);
            // ratio >= 12
            if (ratio >= 16 || matches_final.size() >= 147) {
                image Image = new image();
                Image.setImageURL(pathArray.get(i).toString());
                Image.setGoodMatches(matches_final.size());
                Image.setRatio(ratio);
                imageList.add(Image);
            }
        }
        for (int k = 0; k < imageList.size(); k++) {
            if (imageList.get(k).getImageURL().equals(picPath))
                imageList.remove(k);
        }
        if (imageList.size() != 0 && imageList.size() > 1)
            t.setText(Integer.toString(imageList.size()) + " images found");
        if (imageList.size() != 0 && imageList.size() == 1)
            t.setText(Integer.toString(imageList.size()) + " image found");
        if (imageList.size() == 0)
            t.setText("No result");
        adapter.notifyDataSetChanged();
    }
}
Answer 0 (score: 2)
As far as I can see from your code, you are trying to update the UI from the doInBackground method. doInBackground runs on a worker thread, and in Android you can only update the UI from the main thread. Do the long-running work in doInBackground, but update the UI from the main-thread callbacks such as onProgressUpdate or onPostExecute. More about AsyncTask here.
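A minimal sketch of that restructuring, using the names from the question (imageList, adapter, t, RGBtoGrey): have the background step return the results instead of touching any views, and move every UI call (Toast, t.setText, adapter.notifyDataSetChanged, dialog.dismiss) into onPostExecute, which AsyncTask invokes on the main thread. The doSearch() method here is hypothetical, a UI-free variant of the poster's search() that only does the ORB matching and returns the matched images.

```java
private class BackgroundTaskSearch extends AsyncTask<Void, Void, List<image>> {
    private final ProgressDialog dialog;

    BackgroundTaskSearch(RGBtoGrey activity) {
        dialog = new ProgressDialog(activity);
    }

    @Override
    protected void onPreExecute() {
        // Runs on the main thread: safe to show the dialog here.
        dialog.setMessage("Searching, please wait.");
        dialog.setCanceledOnTouchOutside(false);
        dialog.show();
    }

    @Override
    protected List<image> doInBackground(Void... params) {
        // Worker thread: pure computation only (ORB matching, DB reads),
        // no Toast, no setText, no adapter calls.
        return doSearch(); // hypothetical UI-free variant of search()
    }

    @Override
    protected void onPostExecute(List<image> result) {
        // Back on the main thread: dismiss the dialog and update the views.
        if (dialog.isShowing()) {
            dialog.dismiss();
        }
        imageList.addAll(result);
        if (imageList.size() > 1)
            t.setText(imageList.size() + " images found");
        else if (imageList.size() == 1)
            t.setText("1 image found");
        else
            t.setText("No result");
        adapter.notifyDataSetChanged();
    }
}
```

If some UI update really must happen mid-task (e.g. a running count of matches), call publishProgress(...) from doInBackground and do the view update in onProgressUpdate, which also runs on the main thread.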