- Sceneform SDK for Android was open sourced and archived (github.com/google-ar/sceneform-android-sdk) with version 1.16.0.
- This site (developers.google.com/sceneform) serves as the documentation archive for the previous version, Sceneform SDK for Android 1.15.0.
- Do not use version 1.17.0 of the Sceneform Maven artifacts.
- The 1.17.1 Maven artifacts can be used. Other than the version, however, the 1.17.1 artifacts are identical to the 1.15.0 artifacts.
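For reference, a minimal Gradle sketch for pulling in the 1.17.1 artifacts (the module list is illustrative; include only the Sceneform modules your app actually uses):

dependencies {
    // Sceneform UX artifact at the 1.17.1 version mentioned above.
    implementation 'com.google.ar.sceneform.ux:sceneform-ux:1.17.1'
}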
Augmented Faces developer guide for Sceneform
Learn how to use the Augmented Faces feature in your own apps.
Build and run the sample app
To build and run the AugmentedFaces Java app:
1. Open Android Studio version 3.1 or greater. It is recommended to use a physical device (and not the Android Emulator) to work with Augmented Faces. The device should be connected to the development machine via USB. See the Android quickstart for detailed steps.

2. Import the AugmentedFaces Java sample into your project.

3. In Android Studio, click Run. Then, choose your device as the deployment target and click OK to launch the sample app on your device.

4. Click Approve to give the camera access to the sample app.

   The app should open the front camera and immediately track your face in the camera feed. It should place images of fox ears over both sides of your forehead, and place a fox nose over your own nose.
Using Augmented Faces in Sceneform

1. Import assets into Sceneform
2. Configure the ARCore session
3. Get access to the detected face
4. Render the effect on the detected face
Import assets into Sceneform
Make sure that assets you use for Augmented Faces are scaled and positioned correctly. For tips and practices, refer to Creating Assets for Augmented Faces.
To apply assets such as textures and 3D models to an augmented face mesh in Sceneform, first import the assets.

At runtime, use ModelRenderable.Builder to load the *.sfb models, and use the Texture.Builder to load a texture for the face.
// To ensure that the asset doesn't cast or receive shadows in the scene,
// ensure that setShadowCaster and setShadowReceiver are both set to false.
ModelRenderable.builder()
    .setSource(this, R.raw.fox_face)
    .build()
    .thenAccept(
        modelRenderable -> {
          faceRegionsRenderable = modelRenderable;
          modelRenderable.setShadowCaster(false);
          modelRenderable.setShadowReceiver(false);
        });

// Load the face mesh texture.
Texture.builder()
    .setSource(this, R.drawable.fox_face_mesh_texture)
    .build()
    .thenAccept(texture -> faceMeshTexture = texture);
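The two callbacks above assign into fields that the snippet assumes are declared elsewhere in the activity; a minimal sketch of those declarations (placement is illustrative):

// Loaded asynchronously by the builders above; null until loading completes.
private ModelRenderable faceRegionsRenderable;
private Texture faceMeshTexture;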
Face mesh orientation

Note the orientation of the face mesh for Sceneform.
Configure the ARCore session

Augmented Faces requires the ARCore session to be configured to use the front-facing (selfie) camera and enable face mesh support. To do this in Sceneform, extend the ArFragment class, and override the configuration:
@Override
protected Set<Session.Feature> getSessionFeatures() {
  return EnumSet.of(Session.Feature.FRONT_CAMERA);
}

@Override
protected Config getSessionConfiguration(Session session) {
  Config config = new Config(session);
  config.setAugmentedFaceMode(AugmentedFaceMode.MESH3D);
  return config;
}
Refer to this subclassed ArFragment class in your activity layout, as sketched below.
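For illustration, here is a minimal sketch of such a subclass combining the two overrides above, and the corresponding layout entry. The class name FaceArFragment, its package, and the layout IDs are hypothetical placeholders, not part of the Sceneform API:

import com.google.ar.core.Config;
import com.google.ar.core.Config.AugmentedFaceMode;
import com.google.ar.core.Session;
import com.google.ar.sceneform.ux.ArFragment;
import java.util.EnumSet;
import java.util.Set;

// Hypothetical ArFragment subclass carrying the session configuration.
public class FaceArFragment extends ArFragment {
  @Override
  protected Set<Session.Feature> getSessionFeatures() {
    return EnumSet.of(Session.Feature.FRONT_CAMERA);
  }

  @Override
  protected Config getSessionConfiguration(Session session) {
    Config config = new Config(session);
    config.setAugmentedFaceMode(AugmentedFaceMode.MESH3D);
    return config;
  }
}

In the activity layout, reference the subclass instead of ArFragment directly (file name and IDs are illustrative):

<!-- activity_main.xml (hypothetical): point the fragment at the subclass. -->
<fragment
    android:name="com.example.augmentedfaces.FaceArFragment"
    android:id="@+id/face_fragment"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />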
Get access to the detected face
The AugmentedFace class extends the Trackable class. In your app's activity, use AugmentedFace to get access to the detected face by calling it from the addOnUpdateListener() method.
// Get list of detected faces.
Collection<AugmentedFace> faceList = session.getAllTrackables(AugmentedFace.class);
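As a sketch of where that call can live: register an update listener on the scene, fetch the session from the ArSceneView each frame, and query the trackables there. The fragment lookup assumes the hypothetical FaceArFragment layout entry shown earlier and an AppCompatActivity host:

ArFragment arFragment =
    (ArFragment) getSupportFragmentManager().findFragmentById(R.id.face_fragment);
Scene scene = arFragment.getArSceneView().getScene();

// Runs once per frame while the scene is active.
scene.addOnUpdateListener(
    frameTime -> {
      Session session = arFragment.getArSceneView().getSession();
      if (session == null) {
        return; // The AR session may not be ready yet.
      }
      Collection<AugmentedFace> faceList =
          session.getAllTrackables(AugmentedFace.class);
      // Render the effect for each detected face (next section).
    });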
Render the effect for the face
Rendering the effect involves these steps:
for (AugmentedFace face : faceList) {
// Create a face node and add it to the scene.
AugmentedFaceNode faceNode = new AugmentedFaceNode(face);
faceNode.setParent(scene);
// Overlay the 3D assets on the face.
faceNode.setFaceRegionsRenderable(faceRegionsRenderable);
// Overlay a texture on the face.
faceNode.setFaceMeshTexture(faceMeshTexture);
…
}
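Note that if this loop runs inside the per-frame update listener, it would create a new AugmentedFaceNode for the same face on every frame. A minimal sketch of one way to guard against that, keeping one node per face and removing nodes once tracking stops (the faceNodeMap field and helper method are illustrative bookkeeping, not mandated by the API; requires java.util.Map, HashMap, Iterator, and com.google.ar.core.TrackingState):

// Illustrative field: one scene node per tracked face.
private final Map<AugmentedFace, AugmentedFaceNode> faceNodeMap = new HashMap<>();

private void updateFaceNodes(Collection<AugmentedFace> faceList, Scene scene) {
  // Create nodes only for faces we haven't seen before.
  for (AugmentedFace face : faceList) {
    if (!faceNodeMap.containsKey(face)) {
      AugmentedFaceNode faceNode = new AugmentedFaceNode(face);
      faceNode.setParent(scene);
      faceNode.setFaceRegionsRenderable(faceRegionsRenderable);
      faceNode.setFaceMeshTexture(faceMeshTexture);
      faceNodeMap.put(face, faceNode);
    }
  }

  // Drop nodes whose face is no longer tracked.
  Iterator<Map.Entry<AugmentedFace, AugmentedFaceNode>> iterator =
      faceNodeMap.entrySet().iterator();
  while (iterator.hasNext()) {
    Map.Entry<AugmentedFace, AugmentedFaceNode> entry = iterator.next();
    if (entry.getKey().getTrackingState() == TrackingState.STOPPED) {
      entry.getValue().setParent(null);
      iterator.remove();
    }
  }
}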
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.

Last updated 2024-06-26 UTC.