The Lighting Estimation API analyzes a given image for discrete visual cues and provides detailed information about the lighting in a given scene. You can then use this information when rendering virtual objects to light them under the same conditions as the scene they're placed in, making these objects feel more realistic and enhancing the immersive experience for users.
Lighting cues and concepts
Humans unconsciously perceive subtle cues regarding how objects or living things are lit in their environment. When a virtual object is missing a shadow or has a shiny material that doesn't reflect the surrounding space, users can sense the object doesn't quite fit into a particular scene even if they can't explain why. This is why rendering AR objects to match the lighting in a scene is crucial for immersive and more realistic experiences.
Lighting Estimation does most of the work for you by providing detailed data that lets you mimic various lighting cues when rendering virtual objects. These cues are shadows, ambient light, shading, specular highlights, and reflections.
We can describe these visual cues like this:
Ambient light. Ambient light is the overall diffuse light that comes in from around the environment, lighting everything.
Shadows. Shadows are often directional and tell viewers where light sources are coming from.
Shading. Shading is the intensity of the light in different areas of a given image. For example, different parts of the same object can have different levels of shading in the same scene, depending on the angle relative to the viewer and its proximity to a light source.
Specular highlights. These are the shiny bits of surfaces that reflect a light source directly. Highlights on an object change relative to the position of a viewer in a scene.
Reflection. Light bounces off of surfaces differently depending on whether the surface has specular (that is, highly reflective) or diffuse (not reflective) properties. For example, a metallic ball will be highly specular and reflect its environment, while another ball painted a dull matte grey will be diffuse. Most real-world objects have a combination of these properties -- think of a scuffed-up bowling ball or a well-used credit card.
Reflective surfaces also pick up colors from the ambient environment. The coloring of an object can be directly affected by the coloring of its environment. For example, a white ball in a blue room will take on a bluish hue.
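The effect described above, ambient color bleeding onto a diffuse surface, can be approximated by multiplying the surface's base color by the ambient light color, channel by channel. Here is a minimal sketch in plain Java (this is illustrative only, not part of the ARCore API; the RGB array layout is an assumption for the example):

```java
/** Illustrative only: tints a diffuse base color by the ambient light color. */
public final class AmbientTint {

    /**
     * Multiplies base color and ambient color channel-wise (RGB in [0, 1]).
     * A white ball (1, 1, 1) under bluish ambient light (0.7, 0.8, 1.0)
     * takes on that bluish hue.
     */
    public static float[] tint(float[] baseRgb, float[] ambientRgb) {
        float[] out = new float[3];
        for (int i = 0; i < 3; i++) {
            out[i] = baseRgb[i] * ambientRgb[i];
        }
        return out;
    }
}
```

With a pure white ball and a bluish room color, the result is simply the room color, which is why a white object appears tinted by its surroundings.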
Using Lighting Estimation modes to enhance realism
The Config.LightEstimationMode API has modes that estimate lighting in the environment with different degrees of granularity and realism.
Environmental HDR mode (ENVIRONMENTAL_HDR). This mode consists of an API that allows realistic lighting estimation for directional lighting, shadows, specular highlights, and reflections.
Ambient Intensity mode (AMBIENT_INTENSITY). This mode determines the average pixel intensity and the color of the lighting for a given image. It's a coarse setting designed for use cases in which precise lighting is not critical, such as objects that have baked-in lighting.
Disabled (DISABLED). Disable Config.LightEstimationMode if lighting to match a given environment is not relevant for a scene or an object.
Using ENVIRONMENTAL_HDR mode
ENVIRONMENTAL_HDR mode uses machine learning to analyze the input camera image and synthesize environmental lighting for rendering a virtual object.
This mode combines directional lighting, ambient spherical harmonics, and an HDR cubemap to make virtual objects feel like they're physically part of a given scene:
Directional lighting analyzes the apparent light source for a given image. This kind of lighting adds reasonably positioned specular highlights, and casts shadows in a direction consistent with other visible real objects.
Ambient spherical harmonics provide a realistic representation of the overall ambient light coming in from all directions in a scene. During rendering, this information is used to add subtle cues that bring out the definition of virtual objects.
An HDR cubemap captures the environmental lighting surrounding the virtual object. During rendering, this cubemap is used to create realistic reflections on materials with medium to high glossiness.
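The way these components combine during shading can be sketched with basic vector math. The following is a simplified, hypothetical shading model for illustration only, not the renderer Sceneform actually uses: the ambient term stands in for the light evaluated from the spherical harmonics, the Lambertian term models the directional light, and the reflection vector is what a renderer would use to sample the environment cubemap.

```java
/** Illustrative sketch of how HDR lighting components contribute to shading. */
public final class HdrShadingSketch {

    /** Dot product of two 3-vectors. */
    static float dot(float[] a, float[] b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

    /**
     * Diffuse shading: an ambient term (e.g. evaluated from ambient spherical
     * harmonics) plus a Lambertian term for the main directional light.
     */
    public static float[] diffuse(float[] albedo, float[] ambient,
                                  float[] lightColor, float[] normal, float[] toLight) {
        float nDotL = Math.max(0f, dot(normal, toLight));
        float[] out = new float[3];
        for (int i = 0; i < 3; i++) {
            out[i] = albedo[i] * (ambient[i] + lightColor[i] * nDotL);
        }
        return out;
    }

    /**
     * Reflection vector used to sample an environment cubemap for glossy
     * materials: r = i - 2 (n . i) n, where i is the incident view direction.
     */
    public static float[] reflect(float[] incident, float[] normal) {
        float k = 2f * dot(normal, incident);
        return new float[] {
            incident[0] - k * normal[0],
            incident[1] - k * normal[1],
            incident[2] - k * normal[2],
        };
    }
}
```

A surface facing the light head-on receives the full directional contribution on top of the ambient term; a surface facing away receives only the ambient term, which is what produces soft shading cues.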
The following image shows an example of a virtual object placed in a scene with ENVIRONMENTAL_HDR enabled.
Configure ENVIRONMENTAL_HDR mode for a Sceneform scene
To use ENVIRONMENTAL_HDR with a Sceneform scene, extend the ArFragment class and override the configuration as follows:
@Override
protected Config getSessionConfiguration(Session session) {
  Config config = new Config(session);
  config.setLightEstimationMode(Config.LightEstimationMode.ENVIRONMENTAL_HDR);
  return config;
}
To see an example of how this works, see the Solar System sample. (This sample implements ENVIRONMENTAL_HDR without using ArFragment.)
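At render time, the per-frame estimates are read from the frame's LightEstimate. The sketch below keeps the math in a pure helper so it is easy to test; the commented lines show where the values would come from in ARCore (method names taken from the LightEstimate API, so verify them against the version you target):

```java
/** Sketch: consuming Environmental HDR main-light estimates per frame. */
public final class MainLight {

    // In an ARCore frame-update callback, the values would come from the frame:
    //   LightEstimate estimate = frame.getLightEstimate();
    //   float[] direction = estimate.getEnvironmentalHdrMainLightDirection();
    //   float[] intensity = estimate.getEnvironmentalHdrMainLightIntensity();

    /** Normalizes the estimated main-light direction before use in shading. */
    public static float[] normalize(float[] v) {
        float len = (float) Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
        if (len == 0f) {
            return new float[] {0f, 1f, 0f}; // fallback: light from straight above
        }
        return new float[] {v[0] / len, v[1] / len, v[2] / len};
    }

    /** Relative luminance of an RGB light intensity (Rec. 709 weights). */
    public static float luminance(float[] rgb) {
        return 0.2126f * rgb[0] + 0.7152f * rgb[1] + 0.0722f * rgb[2];
    }
}
```

A renderer might use the normalized direction to orient shadows and the luminance to scale shadow strength, so that shadows fade as the estimated main light weakens.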
Using AMBIENT_INTENSITY mode
AMBIENT_INTENSITY mode determines the average pixel intensity and the color correction scalars for a given image. It's a coarse setting designed for use cases in which precise lighting is not critical, such as objects that have baked-in lighting.
Pixel intensity captures the average pixel intensity of the lighting in a scene, for applying to a whole virtual object.
Color correction scalars detect the white balance for each individual frame and allow you to color correct a virtual object so that it integrates more smoothly into the overall coloring of a scene.
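Applying these two estimates to a virtual object's base color can be sketched as a simple per-channel multiply. The four-component layout (RGB correction scalars followed by pixel intensity) mirrors the array filled by ARCore's LightEstimate.getColorCorrection, but treat that layout as an assumption and verify it against the API reference:

```java
/** Sketch: applying AMBIENT_INTENSITY estimates to a virtual object's color. */
public final class AmbientIntensityApply {

    /**
     * correction[0..2] are per-channel color correction scalars (white balance);
     * correction[3] is the average pixel intensity of the scene.
     * In ARCore these would come from
     *   frame.getLightEstimate().getColorCorrection(correction, 0);
     */
    public static float[] apply(float[] baseRgb, float[] correction) {
        float intensity = correction[3];
        float[] out = new float[3];
        for (int i = 0; i < 3; i++) {
            out[i] = baseRgb[i] * correction[i] * intensity;
        }
        return out;
    }
}
```

The intensity scalar dims or brightens the whole object, while the per-channel scalars shift its white balance toward the scene's.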
Configure AMBIENT_INTENSITY mode for a Sceneform scene
To use AMBIENT_INTENSITY with a Sceneform scene, extend the ArFragment class and override the configuration as follows:
@Override
protected Config getSessionConfiguration(Session session) {
  Config config = new Config(session);
  config.setLightEstimationMode(Config.LightEstimationMode.AMBIENT_INTENSITY);
  return config;
}