Android: ultra-simple integration of liveness detection to quickly identify a "fake face"

Preface

Have you ever had concerns like these: Is face unlock really safe? If someone impersonates me with my photos or videos, can my phone tell that I am not the one in front of the camera? It can. The liveness detection capability of Huawei HMS ML Kit accurately distinguishes a real face from a "fake face". Whether the attack uses a reprinted photo of a face, a replayed face video, or a face mask, liveness detection exposes these "fake faces" on the spot and leaves them nowhere to hide!

Application scenarios

Liveness detection is usually applied before face comparison. It first confirms that a real person is in front of the camera, rather than someone holding up a photo or wearing a mask, and only then is the current face compared with the enrolled face to check whether they belong to the same person. Liveness detection is widely used in daily life. For example, during phone unlocking, it prevents someone from impersonating the owner to unlock the phone and leak personal information.

Likewise, when handling financial business, liveness detection can be used during real-name authentication: first confirm that the current face is a real, live face, then compare it with the photo on the ID card to confirm that the person handling the business is the person on the ID card. This effectively prevents others from impersonating you and causing property losses.

Liveness detection is silent and convenient for users: they do not need to perform any cooperative actions for the system to judge whether the face is real. The following sections describe how to quickly integrate liveness detection.

Development practice

1. Development preparation

For detailed preparation steps, please refer to the Huawei Developer Alliance documentation:
https://developer.huawei.com/consumer/cn/doc/development/HMS-Guides/ml-process-4
The key development steps are as follows.

1.1 Configure the Maven repository address in the project-level build.gradle file

buildscript {
    repositories {
        ...
        maven { url 'https://developer.huawei.com/repo/' }
    }
    dependencies {
        ...
        // AppGallery Connect plugin dependency.
        classpath 'com.huawei.agconnect:agcp:1.3.1.300'
    }
}
allprojects {
    repositories {
        ...
        maven { url 'https://developer.huawei.com/repo/' }
    }
}

1.2 Configure the SDK dependency in the app-level build.gradle file

dependencies {
    // Import the liveness detection SDK.
    implementation 'com.huawei.hms:ml-computer-vision-livenessdetection:2.0.2.300'
}

1.3 Add the following configuration to the header of the app-level build.gradle file

apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'

1.4 Add the following statement to the AndroidManifest.xml file so that the machine learning model is automatically updated to the device

<meta-data
    android:name="com.huawei.hms.ml.DEPENDENCY"
    android:value="livenessdetection"/>

1.5 Apply for the camera permission

For the specific steps to apply for the camera permission, please refer to: https://developer.huawei.com/consumer/cn/doc/development/HMSCore-Guides/add-permissions-0000001050040051
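
As a quick reference, the snippet below is a minimal sketch (not taken from the official guide) of declaring the camera permission in AndroidManifest.xml and requesting it at runtime from an Activity with the standard Android APIs; the request code value is arbitrary.

// In AndroidManifest.xml:
// <uses-permission android:name="android.permission.CAMERA" />

import android.Manifest;
import android.content.pm.PackageManager;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;

// Inside your Activity: request the camera permission at runtime (Android 6.0+)
// before starting liveness detection.
private static final int CAMERA_PERMISSION_REQUEST_CODE = 1; // arbitrary request code

private void requestCameraPermission() {
    if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
            != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(this,
                new String[]{Manifest.permission.CAMERA}, CAMERA_PERMISSION_REQUEST_CODE);
    }
}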

2. Code development

2.1 Create a callback for the liveness detection result to obtain the detection result.

private MLLivenessCapture.Callback callback = new MLLivenessCapture.Callback() {
    @Override
    public void onSuccess(MLLivenessCaptureResult result) {
        // Processing logic for a completed detection. The result may indicate either a live or a non-live face.
    }

    @Override
    public void onFailure(int errorCode) {
        // Detection did not complete, for example because of a camera error. Add failure handling logic based on errorCode.
    }
};
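
For reference, here is a minimal sketch of one way the result might be handled inside onSuccess. It assumes MLLivenessCaptureResult exposes an isLive() flag, as in the HMS sample code; verify the getter name against the SDK version you integrate.

@Override
public void onSuccess(MLLivenessCaptureResult result) {
    // isLive() is assumed here; it indicates whether a live face was detected.
    if (result != null && result.isLive()) {
        // A real, live face was detected: proceed to face comparison or unlocking.
    } else {
        // A non-live face (photo, video replay, or mask) was detected: reject the request.
    }
}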

2.2 Create a liveness detection instance and start detection.

MLLivenessCapture capture = MLLivenessCapture.getInstance();
capture.startDetect(activity, callback);
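
In practice you would typically trigger detection from a UI event once the camera permission has been granted. The button ID and Activity name below are hypothetical and purely for illustration.

// Hypothetical trigger: start liveness detection when the user taps a button.
findViewById(R.id.btn_liveness_detect).setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        MLLivenessCapture capture = MLLivenessCapture.getInstance();
        capture.startDetect(MainActivity.this, callback);
    }
});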

Demo effect

The following demo shows the results of liveness detection when a real face and a face mask appear in front of the camera. Pretty effective, isn't it?

GitHub source code

https://github.com/HMS-Core/hms-ml-demo/blob/master/MLKit-Sample/module-body/src/main/java/com/mlkit/sample/activity/HumanLivenessDetectionActivity.java

For more detailed development guidance, please refer to the official Huawei Developer Alliance website:

https://developer.huawei.com/consumer/cn/hms/huawei-mlkit

Original link: https://developer.huawei.com/consumer/cn/forum/topicview?tid=0203345286567820416&fid=18
Author: leave leaves

Tags: Android hms-core
