Google Built the Pixel 6 Camera to Better Portray People With Darker Skin Tones. Does It?

Google’s artificial intelligence and machine-learning algorithms have in the past been criticized for how they deal with darker skin tones, including mistakenly tagging photos of Black people as gorillas. The company apologized and said it would fix its software. Now, it’s using AI to power what it calls “the world’s most inclusive camera.”

The goal of the Real Tone image processing in Google’s new Pixel 6 and Pixel 6 Pro smartphones is to “more accurately highlight the nuances of diverse skin tones,” especially darker complexions, according to the company’s website. The new phones, which begin shipping Thursday, are the first to come with Real Tone.

We used our Pixel 6 review unit, plus two other competing smartphones from Apple Inc. and Samsung Electronics Co., to test the feature with a group of camera-ready participants. The results were surprising.

The Pixels’ cameras, housed in the prominent bump on the back of each phone, include a suite of new updates. A new sensor lets in 150% more light. New editing tools simulate Photoshop expertise, including a Motion mode for action shots and Magic Eraser, which can delete people and objects from scenes. (We used the feature to remove some wedding guests’ heads in a photo of a bride walking down the aisle.)

The Pixel 6’s Motion mode uses on-device machine learning to identify the subject as well as the direction of the action, and adds a blur effect.

Photo: Nicole Nguyen/The Wall Street Journal

The Pixel has long been full of AI-driven tricks capable of photographic wizardry. This time, though, Google says it wants to fight what many describe as racial bias in camera technology.

“If I was taking iPhone photos out with friends, I would come out super dark, especially in groups,” said Mark Clennon, a photographer whose picture of a Black man standing in front of Trump Tower was one of the more memorable images of last year’s Black Lives Matter demonstrations. “If I’m the only Black person in the photo, I would largely go missing or get lost in the shadows.”

“When photography was invented and developed, it was almost entirely used in the West,” said John Edwin Mason, associate professor of African history and the history of photography at the University of Virginia. “The technology and the practical techniques that photographers developed were to capture white people’s skin tones.”

He added, “Film and digital technology both are still biased towards doing justice to white skin tones, with Black skin tones being an afterthought.”

Fine Tuning

Starting in early 2020, a team at Google began adding more images of people of color to the databases training the Pixel camera, including its face detection, said Florian Koenigsberger, who is leading Google’s camera and image diversity efforts. The company also enlisted photographers and other experts to provide feedback on optimal white-balance and exposure adjustments for darker skin, especially in challenging lighting conditions, in photos and videos.
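Google hasn’t published the internals of Real Tone, but the white-balance and exposure tuning described above can be sketched in spirit with a toy example: detect faces, measure how bright they actually are, and nudge the whole frame’s exposure so those faces land near a sensible target. Everything here is an illustrative assumption, not Google’s pipeline; the Haar-cascade face detector, the target brightness of 120, and the gamma lookup table are stand-ins chosen for a minimal, runnable Python/OpenCV sketch.

```python
# Toy sketch of face-aware exposure compensation (NOT Google's Real Tone
# pipeline): measure the brightness of detected faces and apply a gamma
# correction so backlit faces aren't left in shadow.
import cv2
import numpy as np

def face_aware_exposure(image_bgr, target_brightness=120.0):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)

    # Classic Haar-cascade face detector bundled with OpenCV; a production
    # camera would use a far more capable, learned detector.
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return image_bgr  # no faces found; leave the frame untouched

    # Average brightness over all detected face regions.
    face_means = [gray[y:y + h, x:x + w].mean() for (x, y, w, h) in faces]
    current = float(np.clip(np.mean(face_means), 1.0, 254.0))

    # Solve (current/255)^gamma = target/255 for gamma, then apply the
    # resulting tone curve to the whole frame via a lookup table.
    gamma = np.log(target_brightness / 255.0) / np.log(current / 255.0)
    lut = np.array([((i / 255.0) ** gamma) * 255.0 for i in range(256)],
                   dtype=np.uint8)
    return cv2.LUT(image_bgr, lut)

# Example: adjusted = face_aware_exposure(cv2.imread("backlit_portrait.jpg"))
```

A real pipeline would do far more (learned segmentation, local tone mapping, per-region white balance), but even this toy version shows why face detection trained on diverse skin tones matters: if the detector misses a face, nothing downstream corrects the exposure for it.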

Jasmine Hersey of Fremont, Calif., preferred the Pixel 6 photo over the iPhone 13 Mini’s image. ‘This depicts exactly what I look like in the mirror,’ she said.

Photo: Nicole Nguyen/The Wall Street Journal

We showed a set of pictures we took of Jasmine Hersey, an events operations director from Fremont, Calif., to photographers for their take on which settings Google’s AI was likely tweaking. The images were backlit, which tends to make subjects appear unnaturally dark and washed out.

“It looks like they’ve compensated for the shadows,” said photographer Davey Adesida, looking at a Pixel 6 photo of Ms. Hersey. “When you account for shadows, the picture brightens.” Mr. Adesida said the Pixel did a slightly better job than the iPhone in capturing Ms. Hersey’s skin. “I’m all up for these changes, but why didn’t they think about this until now?” he said.

In the Pixel 6 and Pixel 6 Pro, Real Tone is an always-on feature, not something you have to remember to turn on. It is also launching as an improvement to Google Photos’ auto-enhance tool in its iOS and Android apps, for any photo uploaded to the service, in the coming weeks.


“For the most part, smartphones do a great job of recognizing darker skin tones,” said Dario Calmese, the first Black photographer to shoot a cover for Vanity Fair magazine. “However, we are always trying to adjust, rig or hack these systems to make them work for us. There’s usually a couple of extra steps.”

Spokeswomen for Apple and Samsung say their companies, too, look at a variety of skin tones to develop phone camera systems.

In June, Adobe Inc. released preset filters for its photo apps to better handle images of people with deep skin tones. Snap Inc. is working on a similar project to optimize its camera for darker complexions, and has an integration with the Pixel 6 and Pixel 6 Pro that makes the Snapchat camera accessible from the phones’ lock screens.

Real Tone in Real Life

Google, a subsidiary of Alphabet Inc., says it aims to “represent all people and skin tones beautifully and accurately.” Curious about how people with a variety of skin tones would feel about the images, we asked 18 visitors to San Francisco’s Fisherman’s Wharf for permission to take their picture, then posed a question: Which photo of yourself do you like best?

We took their photos outside on a partly cloudy day, with the normal wide-angle lens on three phones—Google’s Pixel 6, Samsung’s Galaxy S21 and Apple’s iPhone 13 Mini. The goal was to capture each phone’s default image, the one you get when you hit the shutter button and do nothing else.

Each phone produced a distinct look. Most people surveyed agreed that the Pixel-shot photo most accurately represented their skin tones. However, all but three people preferred how they looked in the Samsung images.

Samsung, the leading smartphone maker world-wide, automatically adds skin-smoothing effects, and tends to produce brighter, more color-saturated photos. The iPhone and Pixel photos, while more detailed, tend to look more muted in comparison.

A Samsung spokeswoman said the company uses trends from survey data and research from a team of color scientists to develop its camera.

Jamaul Butts, right, and Summer Butts of Tampa, Fla., preferred the Samsung image. Mr. Butts said it looked the most ‘clean,’ but that his skin in the Google image was more natural. ‘In photos, I often look dim and have to brighten and edit everything.’

Photo: Nicole Nguyen/The Wall Street Journal

Arely Garcia of Davis, Calif., right, liked the Pixel photo more than the Samsung image, which she said looked ‘too edited.’ Reginea Jackson preferred the Samsung shot. She said, ‘The default for most people with darker skin tones is to make photos black and white so they don’t have to deal with getting the right skin tone. But I like color!’

Photo: Nicole Nguyen/The Wall Street Journal

An Apple spokeswoman said the company improved skin-tone rendering with the iPhone 13, especially for darker skin tones, and that its images of people with darker complexions look more natural across different lighting conditions. She said Apple is committed to making more improvements over time, and hopes the industry continues to improve as well.

Carlo Steven Catabay of Livermore, Calif., left, and Michelle Bayaua of Antioch, Calif., said the iPhone and Pixel 6 captured sharper images, but preferred Samsung’s ‘softer’ look. ‘On social media, people edit themselves anyway,’ said Ms. Bayaua.

Photo: Nicole Nguyen/The Wall Street Journal

Mohamed Sonko, left, and Susan Sonko of Atlanta both preferred the Pixel image, because it looked balanced ‘without any lighting or special effects.’ Mr. Sonko added, ‘The quality of the new cameras is getting better. But the other day, when the sun went down, we came out a lot darker than the light-skinned people in our group.’

Photo: Nicole Nguyen/The Wall Street Journal

Our survey can’t assess how people of every shade look in all lighting conditions. But it shows that each phone renders skin tones differently—and that there’s often a difference between accurate representation and preferred appearance.

Historically, Google’s Pixel phones have made up a fraction of smartphones sold—less than 2% in North America. The company says it’s finally serious about selling the phones, which have bigger, better screens than their predecessors plus a Google-designed chipset optimized for the Pixel’s software.

Google’s phones might not be big sellers, but they are influential: The Pixel line is a showcase for Android, which has more than 70% market share world-wide. Cameras play a big role in people’s phone choice, but the marketing of phone cameras traditionally focused on the specs. In launching Real Tone, Google shines a spotlight on an important way the largely white tech industry can try to better serve users of color.


Write to Nicole Nguyen and Dalvin Brown at The Wall Street Journal

Corrections & Amplifications
Nicole Nguyen of The Wall Street Journal is credited for the blur effect photo. An earlier version of this article incorrectly carried a credit for Google. (Corrected on Oct. 25)

Copyright ©2021 Dow Jones & Company, Inc. All Rights Reserved.
