I am using the Vision API for face detection, and now I want to implement blink detection. The problem is that the Vision API still detects an eye blink from an image (a photo) of a person, not only from a live face.
In addition, I am using a Tracker to track the eye state, in order to detect the sequence of events that indicates a left-eye blink:
left eye open -> left eye closed -> left eye open
The GraphicFaceTracker class is defined as below:
private class GraphicFaceTracker extends Tracker<Face> {
    private GraphicOverlay mOverlay;
    private FaceGraphic mFaceGraphic;
    private Context context;

    GraphicFaceTracker(Context context, GraphicOverlay overlay) {
        mOverlay = overlay;
        this.context = context;
        mFaceGraphic = new FaceGraphic(overlay);
    }

    private final float OPEN_THRESHOLD = 0.85f;
    private final float CLOSE_THRESHOLD = 0.4f;
    private int state = 0;

    void blink(float value, final int eyeNo, String whichEye) {
        switch (state) {
            case 0:
                if (value > OPEN_THRESHOLD) {
                    // Both eyes are initially open
                    state = 1;
                }
                break;
            case 1:
                if (value < CLOSE_THRESHOLD) {
                    // Both eyes become closed
                    state = 2;
                }
                break;
            case 2:
                if (value > OPEN_THRESHOLD) {
                    // Both eyes are open again
                    Log.i("BlinkTracker", "blink occurred!");
                    mCameraSource.takePicture(null, new CameraSource.PictureCallback() {
                        @Override
                        public void onPictureTaken(byte[] bytes) {
                            Bitmap bmp = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
                            Log.d("BITMAP", bmp.getWidth() + "x" + bmp.getHeight());
                            System.out.println(bmp.getWidth() + "x" + bmp.getHeight());
                        }
                    });
                    state = 0;
                }
                break;
        }
    }

    /**
     * Start tracking the detected face instance within the face overlay.
     */
    @Override
    public void onNewItem(int faceId, Face item) {
        mFaceGraphic.setId(faceId);
    }

    /**
     * Update the position/characteristics of the face within the overlay.
     */
    @Override
    public void onUpdate(FaceDetector.Detections<Face> detectionResults, Face face) {
        mOverlay.add(mFaceGraphic);
        mFaceGraphic.updateFace(face);
        float left = face.getIsLeftEyeOpenProbability();
        float right = face.getIsRightEyeOpenProbability();
        if (left == Face.UNCOMPUTED_PROBABILITY) {
            // At least one of the eyes was not detected.
            return;
        }
        blink(left, 0, "left");
        if (right == Face.UNCOMPUTED_PROBABILITY) {
            return;
        }
    }
}
I enabled "classifications" so that the detector indicates whether the eyes are open or closed:
FaceDetector detector = new FaceDetector.Builder(context)
        .setProminentFaceOnly(true) // optimize for single, relatively large face
        .setTrackingEnabled(true) // enable face tracking
        .setClassificationType(/* eyes open and smile */ FaceDetector.ALL_CLASSIFICATIONS)
        .setMode(FaceDetector.FAST_MODE) // for one face this is OK
        .build();
The tracker is then added as a processor, to receive face updates from the detector over time. For example, this configuration tracks whether the largest face in view blinks:
Tracker<Face> tracker = new GraphicFaceTracker(this, mGraphicOverlay);
detector.setProcessor(new LargestFaceFocusingProcessor.Builder(detector, tracker).build());
However, the code above detects a blink even from an image of a person, and an image of a person cannot blink. How can I detect a blink through the camera only?
Answer 0 (score: 1)
From the Face object you can get the probabilities below. Note that Face.UNCOMPUTED_PROBABILITY means the classifier could not compute a value for that frame; otherwise, compare the score against a threshold to decide open vs. closed.
float leftOpenScore = face.getIsLeftEyeOpenProbability();
if (leftOpenScore == Face.UNCOMPUTED_PROBABILITY) {
    // Left-eye state could not be computed for this frame.
} else {
    // Compare leftOpenScore against a threshold to decide open vs. closed.
}
float rightOpenScore = face.getIsRightEyeOpenProbability();
if (rightOpenScore == Face.UNCOMPUTED_PROBABILITY) {
    // Right-eye state could not be computed for this frame.
} else {
    // Compare rightOpenScore against a threshold to decide open vs. closed.
}
Now you can pass these values to wherever you want to use them.
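For example, a minimal sketch of turning these scores into open/closed booleans inside onUpdate; the 0.85f and 0.4f thresholds are carried over from the question's code, not values defined by the API:
@Override
public void onUpdate(FaceDetector.Detections<Face> detectionResults, Face face) {
    float left = face.getIsLeftEyeOpenProbability();
    float right = face.getIsRightEyeOpenProbability();
    if (left == Face.UNCOMPUTED_PROBABILITY || right == Face.UNCOMPUTED_PROBABILITY) {
        return; // classification unavailable for this frame
    }
    boolean bothOpen = left > 0.85f && right > 0.85f;  // threshold from the question
    boolean bothClosed = left < 0.4f && right < 0.4f;  // threshold from the question
    // Feed bothOpen / bothClosed into the blink state machine.
}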
Answer 1 (score: 1)
Here is a GitHub project, open source eye blink detector for Android, which detects eye blinks in real time on Android. It is implemented on top of the FaceDetectorApi.
Answer 2 (score: 0)
This looks correct to me. If you associate the detector with a running CameraSource instance, as in this example:
https://developers.google.com/vision/android/face-tracker-tutorial
it will track the eye movement seen by the camera. I also think you could change the onUpdate code slightly to better decide the blink threshold:
@Override
public void onUpdate(FaceDetector.Detections<Face> detectionResults, Face face) {
    mOverlay.add(mFaceGraphic);
    mFaceGraphic.updateFace(face);
    float left = face.getIsLeftEyeOpenProbability();
    float right = face.getIsRightEyeOpenProbability();
    if ((left == Face.UNCOMPUTED_PROBABILITY) ||
            (right == Face.UNCOMPUTED_PROBABILITY)) {
        // One of the eyes was not detected.
        return;
    }
    float value = Math.min(left, right);
    blink(value);
}
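A static photo never changes state, so the open -> closed -> open sequence can only be completed by something that actually moves. To also make the state machine harder to fool with, say, two alternating photos, you could require the eyes-closed phase to last about as long as a human blink. A minimal sketch of that idea; the 100 ms / 500 ms bounds are assumptions of mine, not values from the Vision API:
private long closedAt = 0; // when the eyes-closed state was entered

void blink(float value) {
    long now = SystemClock.elapsedRealtime();
    switch (state) {
        case 0:
            if (value > OPEN_THRESHOLD) {
                state = 1;
            }
            break;
        case 1:
            if (value < CLOSE_THRESHOLD) {
                closedAt = now;
                state = 2;
            }
            break;
        case 2:
            if (value > OPEN_THRESHOLD) {
                long closedFor = now - closedAt;
                // Accept only closures lasting roughly a human blink (assumed bounds).
                if (closedFor >= 100 && closedFor <= 500) {
                    Log.i("BlinkTracker", "live blink detected");
                }
                state = 0;
            }
            break;
    }
}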
Answer 3 (score: 0)
You can pass the detector into the camera source and handle the blink detection from the surface view.
public class LivelinessScanFragment extends Fragment {
    SurfaceView cameraView;
    CameraSource cameraSource;
    final int RequestCameraPermissionID = 1001;
    FaceDetector detector;

    @Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
        switch (requestCode) {
            case RequestCameraPermissionID: {
                if (grantResults[0] == PackageManager.PERMISSION_GRANTED) {
                    if (ActivityCompat.checkSelfPermission(getActivity(), Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
                        return;
                    }
                    try {
                        cameraSource.start(cameraView.getHolder());
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
            }
        }
    }

    public LivelinessScanFragment() {
        // Required empty public constructor
    }

    @Override
    public View onCreateView(LayoutInflater inflater, ViewGroup container,
                             Bundle savedInstanceState) {
        // Inflate the layout for this fragment
        View rootView = inflater.inflate(R.layout.fragment_liveliness_scan, container, false);
        cameraView = (SurfaceView) rootView.findViewById(R.id.surface_view);
        detector = new FaceDetector.Builder(getActivity())
                .setProminentFaceOnly(true) // optimize for single, relatively large face
                .setTrackingEnabled(true) // enable face tracking
                .setClassificationType(/* eyes open and smile */ FaceDetector.ALL_CLASSIFICATIONS)
                .setMode(FaceDetector.FAST_MODE) // for one face this is OK
                .build();
        if (!detector.isOperational()) {
            Log.w("MainActivity", "Detector Dependencies are not yet available");
        } else {
            cameraSource = new CameraSource.Builder(Application.getContext(), detector)
                    .setFacing(CameraSource.CAMERA_FACING_FRONT)
                    .setRequestedFps(2.0f)
                    .setRequestedPreviewSize(1280, 1024)
                    .setAutoFocusEnabled(true)
                    .build();
            cameraView.getHolder().addCallback(new SurfaceHolder.Callback() {
                @Override
                public void surfaceCreated(SurfaceHolder surfaceHolder) {
                    try {
                        if (ActivityCompat.checkSelfPermission(Application.getContext(), Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
                            ActivityCompat.requestPermissions(getActivity(),
                                    new String[]{Manifest.permission.CAMERA}, RequestCameraPermissionID);
                            return;
                        }
                        cameraSource.start(cameraView.getHolder());
                        detector.setProcessor(
                                new LargestFaceFocusingProcessor(detector, new GraphicFaceTracker()));
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }

                @Override
                public void surfaceChanged(SurfaceHolder surfaceHolder, int i, int i1, int i2) {
                }

                @Override
                public void surfaceDestroyed(SurfaceHolder surfaceHolder) {
                    cameraSource.stop();
                }
            });
        }
        return rootView;
    }

    private class GraphicFaceTracker extends Tracker<Face> {
        private final float OPEN_THRESHOLD = 0.85f;
        private final float CLOSE_THRESHOLD = 0.4f;
        private int state = 0;

        void blink(float value) {
            switch (state) {
                case 0:
                    if (value > OPEN_THRESHOLD) {
                        // Both eyes are initially open
                        state = 1;
                    }
                    break;
                case 1:
                    if (value < CLOSE_THRESHOLD) {
                        // Both eyes become closed
                        state = 2;
                    }
                    break;
                case 2:
                    if (value > OPEN_THRESHOLD) {
                        // Both eyes are open again
                        Log.i("BlinkTracker", "blink occurred!");
                        state = 0;
                    }
                    break;
            }
        }

        /**
         * Update the position/characteristics of the face within the overlay.
         */
        @Override
        public void onUpdate(FaceDetector.Detections<Face> detectionResults, Face face) {
            float left = face.getIsLeftEyeOpenProbability();
            float right = face.getIsRightEyeOpenProbability();
            if ((left == Face.UNCOMPUTED_PROBABILITY) ||
                    (right == Face.UNCOMPUTED_PROBABILITY)) {
                // One of the eyes was not detected.
                return;
            }
            float value = Math.min(left, right);
            blink(value);
        }
    }
}
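One addition worth making to the fragment above: release the camera source and the detector when the view goes away, otherwise the camera stays locked for other apps. A minimal sketch:
@Override
public void onDestroyView() {
    super.onDestroyView();
    if (cameraSource != null) {
        cameraSource.release();
    }
    if (detector != null) {
        detector.release();
    }
}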