Microsoft Azure Face API returns "com.microsoft.projectoxford.face.rest.ClientException: Image size is too small."

Asked: 2018-02-27 17:26:16

Tags: android azure azure-analysis-services face-api

When I send a photo to the Microsoft Azure Face API (following the API tutorial), I receive

com.microsoft.projectoxford.face.rest.ClientException: Image size is too small.

But when I debug the application and step through the following code,

faceServiceClient.detect( params[0], false, false, expectedFaceAttributes )

it actually works and I get the result, but only the first time. If I press "Check" again, I receive the error message above once more.
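For reference, the overload being called is the one that takes an image stream; as far as I can tell the Face SDK declares it roughly as follows (quoted from memory, not from this project):

// declared on com.microsoft.projectoxford.face.FaceServiceClient
Face[] detect(InputStream imageStream,
              boolean returnFaceId,
              boolean returnFaceLandmarks,
              FaceAttributeType[] returnFaceAttributes) throws ClientException, IOException;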

P.S. I have tried different images and the behaviour is the same. I would appreciate any help.

The application:

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.os.AsyncTask;
import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.TextView;

import com.microsoft.projectoxford.face.FaceServiceClient;
import com.microsoft.projectoxford.face.FaceServiceRestClient;
import com.microsoft.projectoxford.face.contract.Face;
import com.microsoft.projectoxford.face.rest.ClientException;

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class MainActivity extends AppCompatActivity {

    Bitmap bmpImage;
    TextView txtResult;

    ByteArrayInputStream bs;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        txtResult = (TextView) findViewById(R.id.txtResult);
        Button btnMicrosoft = (Button) findViewById(R.id.btnMicrosoft);

        btnMicrosoft.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                getFaces();
                makeMicrosoftCall();
            }
        });

    }

    // decode the bundled sample image (R.drawable.face_1) into a Bitmap
    private void getFaces() {
        bmpImage = BitmapFactory.decodeResource(this.getResources(), R.drawable.face_1);

    }

    // encode the bitmap and kick off the Face API call on a background thread
    private void makeMicrosoftCall() {
        imgEncoding();
        GetEmotionCall emotionCall = new GetEmotionCall();
        emotionCall.execute(bs);

    }

    // compress the bitmap to JPEG and wrap the bytes in an InputStream for the SDK
    public void imgEncoding() {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        bmpImage.compress(Bitmap.CompressFormat.JPEG, 100, baos);
        bs = new ByteArrayInputStream(baos.toByteArray());
    }

    // asynchronous task that makes the Face API call in the background
    private class GetEmotionCall extends AsyncTask<InputStream, String, Face[]> {
        GetEmotionCall() {
        }


        @Override
        protected void onPreExecute() {
            super.onPreExecute();
            txtResult.setText("Getting results...");
        }

        // this function is called when the api call is made
        @Override
        protected Face[] doInBackground(InputStream... params) {
            FaceServiceClient faceServiceClient = new FaceServiceRestClient("https://westcentralus.api.cognitive.microsoft.com/face/v1.0", "*******************");

            // the only attribute wanted is the emotional state
            FaceServiceClient.FaceAttributeType[] expectedFaceAttributes = new FaceServiceClient.FaceAttributeType[]{FaceServiceClient.FaceAttributeType.Emotion};

            try {
                //THE PROBLEMATIC AREA
                return faceServiceClient.detect( params[0], false, false, expectedFaceAttributes );      

            } catch (ClientException e) {
                Log.e("ClientException", e.toString());
                return null;
            } catch (IOException e) {
                Log.e("IOException", e.toString());
                e.printStackTrace();
                return null;
            }
        }
    }

}
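For completeness, a minimal onPostExecute inside GetEmotionCall along these lines would display the result in txtResult (illustrative sketch only; the faceAttributes/emotion field names are my assumption from the SDK's contract classes, not taken from the code above):

    @Override
    protected void onPostExecute(Face[] faces) {
        // sketch: show how many faces came back and one of the emotion scores
        if (faces == null || faces.length == 0) {
            txtResult.setText("No face detected (or the call failed).");
            return;
        }
        // assumed field path: Face.faceAttributes.emotion.happiness
        double happiness = faces[0].faceAttributes.emotion.happiness;
        txtResult.setText("Faces found: " + faces.length + ", happiness: " + happiness);
    }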

0 Answers:

No answers yet