FaceUnity

This page explains how to integrate and use the FaceUnity AR Filter extension in your Android app.

Understand the tech

To quickly integrate FaceUnity's AR filter capabilities, call the Video SDK setExtensionProperty method and pass in the corresponding key and value.

The key is named after a FaceUnity API, and the value wraps some or all of that API's parameters in JSON. Calling setExtensionProperty with such a key-value pair is equivalent to calling the corresponding FaceUnity API. The same applies to setExtensionPropertyWithVendor.

Currently, the extension encapsulates a subset of the APIs of the FaceUnity Nama SDK. For details, see the FaceUnity key-value overview.
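As an illustration of the mapping: the FaceUnity API fuCreateItemFromPackage takes the path of a prop package as its parameter, so the extension key fuCreateItemFromPackage takes a JSON value wrapping that same path. A minimal sketch, where the bundle path is a placeholder and the commented engine call assumes an initialized RtcEngine named mRtcEngine (as in the snippets later on this page):

```java
// Sketch: how an extension key-value pair maps to a FaceUnity API call.
// The key names the FaceUnity API; the JSON value wraps its parameters.
class KeyValueExample {
    // Build the JSON value for the fuCreateItemFromPackage key, whose single
    // FaceUnity parameter is the prop package path. Plain string building is
    // used here for brevity; on Android, org.json.JSONObject is the usual choice.
    static String buildCreateItemValue(String bundlePath) {
        return "{\"data\":\"" + bundlePath.replace("\\", "\\\\") + "\"}";
    }

    public static void main(String[] args) {
        // Placeholder path for illustration only.
        String value = buildCreateItemValue("/sdcard/face_unity/face_beautification.bundle");
        System.out.println(value);
        // Passing this pair to the Video SDK is equivalent to calling the
        // FaceUnity API directly (mRtcEngine assumed initialized):
        // mRtcEngine.setExtensionProperty("FaceUnity", "Effect",
        //         "fuCreateItemFromPackage", value);
    }
}
```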

Prerequisites

The development environment requirements are as follows:

  • Android Studio 4.1 or later.
  • A physical device (not an emulator) running Android 5.0 or later.

Activate the extension

You need to activate the extension in Agora Console. Save the certificate file you obtain from activation. The certificate is used in subsequent integration.

Integrate the extension

  1. On the Extensions Marketplace page, download the package of FaceUnity AR Filter.
  2. Unzip the package, and save all .aar files to the /app/libs path of your project folder.
  3. Save the certificate file authpack.java to the package where the app module is located (for example, the package name in the sample code is io.agora.rte.extension.faceunity.example, so the certificate file is saved to /app/src/main/java/io/agora/rte/extension/faceunity/example).
  4. Contact Agora to get the resource package of the FaceUnity extension. Save the model and prop files you need to the /app/src/main/assets path of the project folder. For details, see Resource package structure.
  5. In the app/build.gradle file, add the following line in dependencies:

     ```groovy
     implementation fileTree(dir: "libs", include: ["*.jar", "*.aar"])
     ```

  6. Import the required classes:

     ```java
     import io.agora.rtc2.Constants;
     import io.agora.rtc2.IMediaExtensionObserver;
     import io.agora.rtc2.IRtcEngineEventHandler;
     import io.agora.rtc2.RtcEngine;
     import io.agora.rtc2.RtcEngineConfig;
     import io.agora.rtc2.video.VideoCanvas;
     import io.agora.rte.extension.faceunity.ExtensionManager;
     ```

Call sequence

This section describes the call sequence of using the extension. For a detailed parameter description, see the API Reference.

Step 1: Enable the extension

When initializing RtcEngine, call enableExtension to enable the extension.


```java
private void enableExtension(boolean enabled) {
    // Initialize ExtensionManager before calling enableExtension
    ExtensionManager.getInstance(mRtcEngine).initialize(this);
    mRtcEngine.enableExtension("FaceUnity", "Effect", enabled);
}
```

Step 2: Initialize the extension

To initialize the extension, call setExtensionProperty (Android) and pass in the corresponding keys and values:

  1. Initialize the extension: set the key as fuSetup, and the value as the certificate data read from authpack.java.
  2. Load the AI model: set the key as fuLoadAIModelFromPackage, and the value as a JSON object containing the path of the AI capability model file ai_xxx.bundle and the AI capability type.

For example:


```java
private void initExtension() {
    // Initialization
    try {
        JSONObject jsonObject = new JSONObject();
        JSONArray jsonArray = new JSONArray();
        for (byte it : authpack.A()) {
            jsonArray.put(it);
        }
        jsonObject.put("authdata", jsonArray);
        setExtensionProperty("fuSetup", jsonObject.toString());
    } catch (JSONException e) {
        Log.e(TAG, e.toString());
    }

    // Load the AI model
    File modelDir = new File(getExternalFilesDir("assets"),
            "face_unity/model/ai_face_processor.bundle");
    try {
        JSONObject jsonObject = new JSONObject();
        jsonObject.put("data", modelDir.getAbsolutePath());
        jsonObject.put("type", 1 << 10);
        setExtensionProperty("fuLoadAIModelFromPackage", jsonObject.toString());
    } catch (JSONException e) {
        Log.e(TAG, e.toString());
    }
}

// A helper that wraps setExtensionProperty for repeated calls
private void setExtensionProperty(String key, String property) {
    mRtcEngine.setExtensionProperty("FaceUnity", "Effect", key, property);
}
```

Step 3: Configure beauty effects and body recognition

Call setExtensionProperty and pass in the corresponding keys and values.

You can implement the following functions:

  • Load props and adjust beautification intensity
  • Recognize and track human faces, gestures, and bodies

You can call the method as needed. For a full list of keys and values, see the API Reference.
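As an illustration, loading a prop package and adjusting one of its variables comes down to two key-value calls: fuCreateItemFromPackage and fuItemSetParam. The sketch below only builds the JSON values; the bundle path, variable name (color_level), and intensity are placeholders, and the commented engine calls assume an initialized mRtcEngine as in the earlier snippets:

```java
// Sketch: build the JSON values for loading a prop package and
// setting one of its variables. Plain string building for brevity;
// on Android, org.json.JSONObject is the usual choice.
class BeautyConfigExample {
    // Value for fuCreateItemFromPackage: the prop package path.
    static String createItemValue(String bundlePath) {
        return "{\"data\":\"" + bundlePath + "\"}";
    }

    // Value for fuItemSetParam: the prop package path (obj_handle),
    // the variable name, and the value to set.
    static String itemSetParamValue(String bundlePath, String name, double value) {
        return "{\"obj_handle\":\"" + bundlePath + "\","
                + "\"name\":\"" + name + "\","
                + "\"value\":" + value + "}";
    }

    public static void main(String[] args) {
        String bundle = "/path/to/face_beautification.bundle"; // placeholder path
        System.out.println(createItemValue(bundle));
        // "color_level" and 0.5 are placeholders; see the FaceUnity
        // documentation for real variable names and value ranges.
        System.out.println(itemSetParamValue(bundle, "color_level", 0.5));
        // With an initialized engine, each value would be passed as:
        // mRtcEngine.setExtensionProperty("FaceUnity", "Effect",
        //         "fuCreateItemFromPackage", createItemValue(bundle));
        // mRtcEngine.setExtensionProperty("FaceUnity", "Effect",
        //         "fuItemSetParam", itemSetParamValue(bundle, "color_level", 0.5));
    }
}
```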

Sample project

The complete sample code and project are provided on GitHub.

Run the project

  1. Clone the repository:

```shell
git clone https://github.com/AgoraIO-Community/AgoraMarketPlace.git
```

  2. On the Extensions Marketplace Downloads page, download the Android package of FaceUnity AR Filter. Unzip the package, and save all .aar files to the FaceUnity/android/app/libs path.

  3. Contact Agora to get the certificate file and resource package.

  4. Save the certificate file authpack.java to FaceUnity/android/app/src/main/java/io/agora/rte/extension/faceunity/example.

  5. Save the required model and prop files from the resource package to FaceUnity/android/app/src/main/assets/face_unity under the project folder.

  6. Open the sample project FaceUnity/android in Android Studio.

  7. Sync the project with Gradle files.

  8. Open the FaceUnity/android/app/src/main/java/io/agora/rte/extension/faceunity/example/Config.java file, and replace <YOUR_APP_ID> with your App ID. To get an App ID, see Getting Started with Agora.

     ```java
     public interface Config {
         String mAppId = "<YOUR_APP_ID>";
         String mToken = null;
     }
     ```

  9. Connect a physical Android device (not an emulator), and run the project.

Reference

Resource package structure

API reference

Video SDK

This section lists the APIs related to using extensions with the Agora SDK.

FaceUnity key-value overview

The key corresponds to the name of the FaceUnity API, and the value corresponds to the parameters of the FaceUnity API. In this section, if the value is the same as the parameters of the FaceUnity API, the link leads to the FaceUnity documentation. If the value is different from the parameters of the FaceUnity API, the link leads to subsequent sections on this page.

Method keys

Method keys refer to the keys you pass in when calling the setExtensionProperty/setExtensionPropertyWithVendor method of the Agora SDK.

Initialization

| Method keys | Description |
|---|---|
| fuSetup | Initializes the extension and authenticates the user. This key must be executed before other keys; otherwise, the app may crash. |
| fuLoadAIModelFromPackage | Preloads AI capabilities. |
| fuReleaseAIModel | Frees up resources occupied by AI capabilities. |

Prop package loading

| Method keys | Description |
|---|---|
| fuCreateItemFromPackage | Loads the prop package. |
| fuLoadTongueModel | Loads tongue detection data. |
| fuItemSetParam | Modifies or sets the value of a variable in the prop package. |

Destruction

| Method keys | Description |
|---|---|
| fuDestroyItem | Destroys a specified item. |
| fuDestroyAllItems | Destroys all loaded items and releases all occupied resources. |
| fuOnDeviceLost | Resets the system's GL state. Use this key when the OpenGL context is released or destroyed by external resources. |
| fuDestroyLibData | Frees up the memory allocated to the face tracking module after calling fuSetup. |

System functions

| Method keys | Description |
|---|---|
| fuBindItems | Binds resource items to a target item. |
| fuUnbindItems | Unbinds the resource items from a target item. |
| fuIsTracking | Sets whether to get the number of faces being tracked. |
| fuSetMaxFaces | Sets the maximum number of tracked faces. |
| fuSetDefaultRotationMode | Sets the default human face orientation. |

Algorithm functions

| Method keys | Description |
|---|---|
| fuFaceProcessorSetMinFaceRatio | Sets the distance of face detection. |
| fuSetTrackFaceAIType | Sets the fuTrackFace algorithm type. |
| fuSetFaceProcessorFov | Sets the fov (equivalent to focal length) of the FaceProcessor algorithm module. |
| fuHumanProcessorReset | Resets the state of the HumanProcessor algorithm module. |
| fuHumanProcessorSetMaxHumans | Sets the number of bodies tracked by the HumanProcessor algorithm module. |
| fuHumanProcessorGetNumResults | Sets whether to get the number of bodies tracked by the HumanProcessor algorithm module. |
| fuHumanProcessorSetFov | Sets the fov (equivalent to focal length) used by the HumanProcessor algorithm module to track 3D key points on human bodies. |
| fuHandDetectorGetResultNumHands | Sets whether to get the number of gestures tracked by the HandGesture algorithm module. Note that ai_gesture.bundle needs to be loaded. |
Callback keys

Callback keys refer to the keys returned in the onEvent callback of the Agora SDK.

| Callback keys | Description |
|---|---|
| fuIsTracking | Returns the number of faces being tracked. |
| fuHumanProcessorGetNumResults | Returns the number of human bodies tracked by the HumanProcessor algorithm module. |
| fuHandDetectorGetResultNumHands | Returns the number of gestures tracked by the HandGesture algorithm module. |
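The value delivered with each callback key is a small JSON payload such as {"faces": 1}. A minimal, dependency-free sketch of reading it follows; the extractInt helper is illustrative only, and in an Android app you would normally parse the payload with org.json inside the onEvent method of IMediaExtensionObserver:

```java
// Sketch: pull an integer field out of a flat JSON callback payload,
// e.g. {"faces": 1} from fuIsTracking or {"people": 2} from
// fuHumanProcessorGetNumResults. Illustrative helper only.
class CallbackParseExample {
    static int extractInt(String json, String field) {
        String needle = "\"" + field + "\"";
        int i = json.indexOf(needle);
        if (i < 0) return -1; // field not present
        i = json.indexOf(':', i) + 1;
        int end = i;
        // Consume optional spaces and the digits of the value.
        while (end < json.length()
                && (json.charAt(end) == ' ' || Character.isDigit(json.charAt(end)))) {
            end++;
        }
        return Integer.parseInt(json.substring(i, end).trim());
    }

    public static void main(String[] args) {
        // A payload like the one delivered with the fuIsTracking callback key:
        System.out.println(extractInt("{\"faces\": 2}", "faces"));
    }
}
```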

Method key description

fuSetup

The value contains the following parameters:

| Value parameters | Description |
|---|---|
| authdata | The path to the certificate file. |

fuLoadAIModelFromPackage

The value contains the following parameters:

| Value parameters | Description |
|---|---|
| data | String. The path of the AI capability model file ai_xxx.bundle. Such model files are located in the assets/AI_Model directory of the resource package. |
| type | Int. The AI capability type corresponding to the bundle file. Possible values are listed in the enum FUAITYPE. |

fuCreateItemFromPackage

The value contains the following parameters:

| Value parameters | Description |
|---|---|
| data | String. The path to the prop package you want to load. A prop package usually has the suffix *.bundle. |

fuLoadTongueModel

The value contains the following parameters:

| Value parameters | Description |
|---|---|
| data | String. The path of the tongue model data tongue.bundle. |

fuItemSetParam

The value contains the following parameters:

| Value parameters | Description |
|---|---|
| obj_handle | String. The path of the prop package passed in when calling fuCreateItemFromPackage. |
| name | String. The name of the variable to set in the prop package. |
| value | Object. The variable value to be set. |

For details on the variable names and values in the prop package, refer to the FaceUnity documentation.

fuDestroyItem

The value contains the following parameters:

| Value parameters | Description |
|---|---|
| item | String. The path of the prop package passed in when calling fuCreateItemFromPackage. |

fuBindItems

The value contains the following parameters:

| Value parameters | Description |
|---|---|
| obj_handle | String. The path of the target item. |
| p_items | String array. The paths to the resource items you want to bind. |

fuUnbindItems

The value contains the following parameters:

| Value parameters | Description |
|---|---|
| obj_handle | String. The path of the target item. |
| p_items | String array. The paths to the resource items you want to unbind. |

fuIsTracking

The value contains the following parameters:

| Value parameters | Description |
|---|---|
| enable | Bool. Whether to get the number of faces being tracked. If set to true, you can receive the fuIsTracking callback. |

fuHumanProcessorGetNumResults

The value contains the following parameters:

| Value parameters | Description |
|---|---|
| enable | Bool. Whether to get the number of human bodies tracked by the HumanProcessor algorithm module. If set to true, you can receive the fuHumanProcessorGetNumResults callback. |

fuHandDetectorGetResultNumHands

The value contains the following parameters:

| Value parameters | Description |
|---|---|
| enable | Bool. Whether to get the number of gestures tracked by the HandGesture algorithm module. If set to true, you can receive the fuHandDetectorGetResultNumHands callback. |

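The three keys above (fuIsTracking, fuHumanProcessorGetNumResults, and fuHandDetectorGetResultNumHands) all take a value with the same shape: a single boolean enable field. A minimal sketch of building it, using plain string concatenation for brevity; the commented call assumes an initialized mRtcEngine:

```java
// Sketch: build the shared {"enable": ...} value used by the
// tracking-count keys described above.
class EnableValueExample {
    static String enableValue(boolean enable) {
        return "{\"enable\":" + enable + "}";
    }

    public static void main(String[] args) {
        System.out.println(enableValue(true));
        // With an initialized engine, enabling the face-count callback is:
        // mRtcEngine.setExtensionProperty("FaceUnity", "Effect",
        //         "fuIsTracking", enableValue(true));
    }
}
```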
Callback key description

fuIsTracking

The value contains the following parameters:

| Value parameters | Description |
|---|---|
| faces | Int. The number of faces being tracked. |

fuHumanProcessorGetNumResults

The value contains the following parameters:

| Value parameters | Description |
|---|---|
| people | Int. The number of bodies being tracked. |

fuHandDetectorGetResultNumHands

The value contains the following parameters:

| Value parameters | Description |
|---|---|
| hands | Int. The number of gestures being tracked. |
