Another clear example came when a model lifted a surgical mask to her face: her skin darkens noticeably in the Pixel 8 Pro video, whereas the Pixel 9 Pro shows far less of a swing. This also helps in scenes with multiple people of different skin tones together, where Koenigsberger says there should be fewer exposure distortions.
Analyzing photos captured at the scene, I found it easy to discern the updates to the algorithm, especially with the luxury of having the models right in front of me. Even in normal lighting conditions, skin tones on the Pixel 9 Pro were, to my eye, a much closer match to the people in real life than on the Pixel 8 Pro. Koenigsberger says these improvements also stem from broad changes in Google’s HDR+ imaging pipeline (more on this later), which enable the system to produce more accurate shadows and mid-tones.
Also new is auto-white balance segmentation, a process that applies separate auto-white balance exposures to the people in a picture and to their background. Before, you may have noticed color from the setting seeping in, like blue skies casting a cooler tone on skin. This new system helps “people stay the way that they should look, separate from the background,” Koenigsberger says.
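Google doesn’t publish the details of this segmentation step, but the basic idea—white-balance the person and the background separately, then recombine them along a person mask—can be sketched in a few lines. This is a toy illustration, not Google’s implementation; the function names, gain values, and mask are all invented for the example:

```python
import numpy as np

def white_balance(region, gains):
    # Apply per-channel (R, G, B) gain multipliers to an image in [0, 1].
    return np.clip(region * np.asarray(gains), 0.0, 1.0)

def segmented_awb(image, person_mask, person_gains, background_gains):
    """Blend two separately white-balanced renditions of one image.

    image:       H x W x 3 float array in [0, 1]
    person_mask: H x W float array, 1.0 where a person is detected
    """
    person = white_balance(image, person_gains)
    background = white_balance(image, background_gains)
    mask = person_mask[..., None]  # broadcast mask over the color channels
    return person * mask + background * (1.0 - mask)

# Toy example: a 2x2 image under a cool (bluish) cast.
img = np.full((2, 2, 3), [0.4, 0.5, 0.7])
mask = np.array([[1.0, 0.0], [0.0, 0.0]])  # top-left pixel is the "person"
# Warm up the person region; leave the background gains neutral so the
# blue sky keeps its character.
out = segmented_awb(img, mask, person_gains=[1.2, 1.0, 0.8],
                    background_gains=[1.0, 1.0, 1.0])
```

The key design point is that the correction is local: the sky can stay blue while the skin under it is no longer tinted by that blue.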
Portrait taken on the Google Pixel 8
Portrait taken on the Google Pixel 9
This year’s Pixel 9 series is also the first time that Google’s skin tone classifier fully aligns with the Monk Skin Tone Scale, a publicly released 10-shade scale that represents a broad range of skin tones, meant to help with all kinds of uses from computational photography to health care. Koenigsberger says the change allows for much more fine-tuned color adjustments.
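At its simplest, a skin tone classifier aligned with a 10-shade scale maps a measured tone to the nearest of 10 reference shades. The sketch below shows that nearest-neighbor idea with placeholder anchors; the values are NOT the official Monk reference tones (those are published separately by Google), and real classifiers work on far richer signals than a single RGB sample:

```python
# Ten placeholder anchor tones from light to dark, stand-ins for the
# official Monk Skin Tone Scale reference values (MST-1 through MST-10).
MST_ANCHORS = [(246 - 20 * i, 237 - 21 * i, 228 - 20 * i) for i in range(10)]

def classify_mst(rgb):
    """Return the 1-based index of the nearest anchor tone (toy example)."""
    dists = [sum((c - a) ** 2 for c, a in zip(rgb, anchor))
             for anchor in MST_ANCHORS]
    return 1 + dists.index(min(dists))

light = classify_mst((240, 230, 220))  # near the lightest anchor -> 1
dark = classify_mst((70, 50, 50))      # near the darkest anchor -> 10
```

A finer-grained scale gives the pipeline more buckets, which is what enables the “more fine-tuned color adjustments” Koenigsberger describes.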
Arguably most important is the fact that Real Tone has, for the first time, been tested against all of Google’s “Hero” features across the Pixel 9 range. Koenigsberger says his team has been able to scale up testing to ensure new features like Add Me have been vetted for Real Tone before launch. That matters because Koenigsberger says his team isn’t always able to spend as much time testing on the A-series Pixel phones, which might be why I had some issues with Real Tone on the Pixel 8A. Scaling this process up will hopefully help, and Koenigsberger says it elevates Real Tone from a specific set of technologies into part of Google’s operating philosophy.
“Ultimately, this is going to be someone’s memory,” Koenigsberger says. “It’s going to be their experience with their family; it’s going to be that trip with their best friend—as close as we can get to recreating those experiences when we’re testing, I think the more reliably we’re going to get people something that they’re happy with.”
Artificial Memories
Memories are the underlying theme driving many of the new features from Google’s camera team. Earlier in the day, I sat down with Isaac Reynolds, the group product manager for the Pixel Camera, a team he has been part of since 2015 with the launch of the first Pixel phone. Nearing his 10th anniversary, Reynolds says he’s probably “more enthused than many others” about mobile photography, believing there’s still so much space to advance cameras. “I see the memories people can’t capture because of technical limitations.”
New camera features in Pixel phones increasingly focus on specific instances rather than broad-strokes changes to the general camera experience, though Reynolds says the HDR+ pipeline has been rebuilt in the Pixel 9 series. It retunes exposure, sharpening, contrast, and shadow merging—plus all of the updates to Real Tone—which helps create a more “authentic” and “natural” image, according to Reynolds. He suggests it’s what people prefer compared to the more processed, punchy, and heavily filtered images that were so popular a decade ago.
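Reynolds doesn’t detail the rebuilt pipeline, but the core of an HDR+-style merge—averaging an aligned burst of short exposures to cut noise, then tone-mapping to lift shadows—can be sketched as follows. This is a toy illustration under simplified assumptions (already-aligned frames, a plain gamma curve), not Google’s actual pipeline:

```python
import numpy as np

def merge_burst(frames):
    # Average an aligned burst of identically exposed frames; noise falls
    # roughly with the square root of the number of frames merged.
    return np.mean(np.stack(frames), axis=0)

def tone_map(linear, gamma=1 / 2.2):
    # Simple gamma curve: lifts shadows and compresses highlights, a crude
    # stand-in for the pipeline's retuned contrast and shadow handling.
    return np.clip(linear, 0.0, 1.0) ** gamma

rng = np.random.default_rng(0)
scene = np.full((4, 4), 0.1)  # a dark, underexposed scene
burst = [scene + rng.normal(0, 0.02, scene.shape) for _ in range(8)]
merged = tone_map(merge_burst(burst))
```

The trade-off this illustrates is why burst merging became the norm on phones: short exposures avoid motion blur and clipped highlights, and the merge recovers the clean shadows a single long exposure would have provided.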