Missing attribute types in the Microsoft Face API (on Android). How do I use the Emotion attribute?

Date: 2018-04-17 04:07:24

Tags: java android face-detection microsoft-cognitive face-api

I am trying to build an app with Microsoft's Face API in Android Studio. Right now I am just experimenting with the API, but I have run into a problem. Of the face attribute types that should be available (see on this page), the only ones I can select are Age, FacialHair, Gender, HeadPose, and Smile. I would really like to use the Emotion attribute type, but it is not recognized.

The error I get: Cannot resolve symbol 'Emotion'

Here is the relevant part of the code:

Face[] result = faceServiceClient.detect(inputStreams[0], true, true, new FaceServiceClient.FaceAttributeType[]{FaceServiceClient.FaceAttributeType.Emotion});

Here is the entire code of my MainActivity:

package me.ianterry.face;

import android.app.ProgressDialog;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.os.AsyncTask;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.ImageView;

import com.microsoft.projectoxford.face.*;
import com.microsoft.projectoxford.face.contract.*;

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.InputStream;

public class MainActivity extends AppCompatActivity {
    private FaceServiceClient faceServiceClient =
            new FaceServiceRestClient("https://westcentralus.api.cognitive.microsoft.com/face/v1.0", "MY KEY");

    private ImageView mImageView;
    private Button mProcessButton;
    private ProgressDialog progress;
    public final String TAG = "attributeMethod";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        final Bitmap myBitmap = BitmapFactory.decodeResource(getResources(), R.drawable.test_image);
        mImageView = findViewById(R.id.image);
        mImageView.setImageBitmap(myBitmap);

        mProcessButton = findViewById(R.id.btn_process);
        mProcessButton.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                detectAndFrame(myBitmap);
            }
        });
        progress = new ProgressDialog(this);
    }


    private void detectAndFrame(final Bitmap myBitmap) {
        ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
        myBitmap.compress(Bitmap.CompressFormat.JPEG, 100, outputStream);
        ByteArrayInputStream inputStream = new ByteArrayInputStream(outputStream.toByteArray());


        AsyncTask<InputStream, String, Face[]> detectTask = new AsyncTask<InputStream, String, Face[]>() {
            //private ProgressDialog progress = new ProgressDialog(MainActivity.this);

            @Override
            protected void onPostExecute(Face[] faces) {
                progress.dismiss();
                if (faces == null) {
                    return;
                }
                mImageView.setImageBitmap(drawFaceRectangleOnBitmap(myBitmap, faces));
                attributeMethod(faces);
            }

            @Override
            protected void onPreExecute() {
                super.onPreExecute();
                progress.show();
            }

            @Override
            protected void onProgressUpdate(String... values) {
                super.onProgressUpdate(values);
                progress.setMessage(values[0]);
            }

            @Override
            protected Face[] doInBackground(InputStream... inputStreams) {
                //return new Face[0];
                try {

                    publishProgress("Detecting...");
                    Face[] result = faceServiceClient.detect(inputStreams[0], true, true, new FaceServiceClient.FaceAttributeType[]{FaceServiceClient.FaceAttributeType.Emotion});
                    if (result == null) {
                        publishProgress("Detection finished. Nothing detected.");
                        return null;
                    }
                    publishProgress(String.format("Detection Finished. %d face(s) detected", result.length));
                    return result;
                } catch (Exception e) {
                    publishProgress("Detection failed.");
                    return null;
                }

            }

        };
        detectTask.execute(inputStream);
    }

    private static Bitmap drawFaceRectangleOnBitmap(Bitmap myBitmap, Face[] faces) {
        Bitmap bitmap = myBitmap.copy(Bitmap.Config.ARGB_8888, true);
        Canvas canvas = new Canvas(bitmap);
        Paint paint = new Paint();
        paint.setAntiAlias(true);
        paint.setStyle(Paint.Style.STROKE);
        paint.setColor(Color.WHITE);
        int strokeWidth = 8;
        paint.setStrokeWidth(strokeWidth);
        if (faces != null) {
            for (Face face : faces) {
                FaceRectangle faceRectangle = face.faceRectangle;
                canvas.drawRect(faceRectangle.left,
                        faceRectangle.top,
                        faceRectangle.left + faceRectangle.width,
                        faceRectangle.top + faceRectangle.height,
                        paint);

            }
        }
        return bitmap;
    }

    private void attributeMethod(Face[] faces) {
        for (Face face : faces) {
            FaceAttribute attribute = face.faceAttributes;
            Log.d(TAG, "age: " + attribute.age);
            Log.d(TAG, "gender: " + attribute.gender);
        }
    }
}

This code is taken more or less directly from this tutorial.
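Once an SDK version that exposes Emotion is in place, picking the dominant emotion out of the per-emotion confidence scores is plain Java, independent of the Face SDK itself. A minimal sketch, assuming (hypothetically) the scores have been copied into a `Map<String, Double>` keyed by emotion name — the `DominantEmotion` class and `dominant` helper are illustrative names, not part of the SDK:

```java
import java.util.HashMap;
import java.util.Map;

public class DominantEmotion {
    // Return the emotion name with the highest confidence score,
    // or null if the map is empty.
    static String dominant(Map<String, Double> scores) {
        String best = null;
        double bestScore = Double.NEGATIVE_INFINITY;
        for (Map.Entry<String, Double> e : scores.entrySet()) {
            if (e.getValue() > bestScore) {
                bestScore = e.getValue();
                best = e.getKey();
            }
        }
        return best;
    }

    public static void main(String[] args) {
        Map<String, Double> scores = new HashMap<>();
        scores.put("happiness", 0.92);
        scores.put("neutral", 0.05);
        scores.put("surprise", 0.03);
        System.out.println(dominant(scores)); // prints "happiness"
    }
}
```

A call like `Log.d(TAG, "emotion: " + dominant(scores))` would then slot naturally into `attributeMethod` alongside the existing age and gender logging.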

1 Answer:

Answer 0 (score: 1)

Support for Emotion was added in SDK version 1.2.5. Source

Until version 1.4.3 is released, you should use version 1.4.1.
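In practice that means pinning the SDK version in the app-level build.gradle. A sketch of the dependency line, assuming the commonly used `com.microsoft.projectoxford:face` coordinates for this SDK (verify against your repository):

```groovy
dependencies {
    // Pin the Face SDK to 1.4.1 as suggested above; Emotion support
    // landed in 1.2.5, so any version >= 1.2.5 should expose
    // FaceServiceClient.FaceAttributeType.Emotion.
    implementation 'com.microsoft.projectoxford:face:1.4.1'
}
```

After syncing Gradle, the `Cannot resolve symbol 'Emotion'` error in the `detect` call should go away.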