
How 10 Skin Tones Will Reshape Google’s Approach to AI

source link: https://www.wired.com/story/google-monk-skin-tone-scale-computer-vision-bias/


For years, the tech industry has relied on a six-shade scale to classify skin tones. The search giant’s open source alternative could change that.
[Image: gradient of foundation shades. Photograph: VPanteon/Getty Images]

For years, tech companies have relied on something called the Fitzpatrick scale to classify skin tones for their computer vision algorithms. Originally designed for dermatologists in the 1970s, the system comprises only six skin tones, a possible contributor to AI’s well-documented failures in identifying people of color. Now Google is beginning to incorporate a 10-skin-tone standard, the Monk Skin Tone (MST) scale, across its products, from image search results to Google Photos and beyond. The development has the potential to reduce bias in data sets used to train AI in everything from health care to content moderation.

Google first signaled plans to go beyond the Fitzpatrick scale last year. Internally, the project dates back to a summer 2020 effort by four Black women at Google to make AI “work better for people of color,” according to a Twitter thread from Xango Eyeé, a responsible AI product manager at the company. At today’s Google I/O conference, the company detailed how wide an impact the new system could have across its many products. Google will also open source the MST, meaning it could replace Fitzpatrick as the industry standard for evaluating the fairness of cameras and computer vision systems.

“Think anywhere there are images of people’s faces being used where we need to test the algorithm for fairness,” says Eyeé.
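
As an illustration of what that kind of testing can look like, consider a disaggregated fairness check: instead of reporting one aggregate score, a model’s accuracy is computed separately for each MST bucket. The sketch below is minimal and hypothetical; the records and field names are invented for illustration and are not Google’s actual evaluation tooling.

```python
from collections import defaultdict

# Hypothetical evaluation records: each pairs a model's prediction with
# ground truth and a human-annotated Monk Skin Tone (MST) bucket, 1-10.
results = [
    {"mst": 2, "predicted": True,  "actual": True},
    {"mst": 2, "predicted": True,  "actual": False},
    {"mst": 9, "predicted": False, "actual": True},
    {"mst": 9, "predicted": True,  "actual": True},
    # ... in practice, many annotated examples per bucket
]

def accuracy_by_skin_tone(records):
    """Return accuracy broken out per MST bucket, not one aggregate number."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for r in records:
        total[r["mst"]] += 1
        correct[r["mst"]] += int(r["predicted"] == r["actual"])
    return {tone: correct[tone] / total[tone] for tone in sorted(total)}

for tone, acc in accuracy_by_skin_tone(results).items():
    print(f"MST {tone:>2}: accuracy {acc:.2f}")
```

The point of disaggregating is that a system can look good on average while failing badly on darker tones; per-bucket reporting makes that gap visible, and a finer-grained scale makes the buckets more meaningful.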

The Monk Skin Tone scale is named after Ellis Monk, a Harvard University sociologist who has spent decades researching colorism’s impact on the lives of Black people in the United States. Monk created the scale in 2019 and worked with Google engineers and researchers to incorporate it into the company’s product development.

“The reality is that life chances, opportunities, all these things are very much tied to your phenotypical makeup,” Monk said in prepared remarks in a video shown at I/O. “We can weed out these biases in our technology from a really early stage and make sure the technology we have works equally well across all skin tones. I think this is a huge step forward.”

An initial analysis by Monk and Google research scientists last year found that participants felt better represented by the MST than by the Fitzpatrick scale. In an FAQ published Wednesday, Google says that going beyond 10 skin tones can add complexity without extra value, unlike in industries such as makeup, where companies like Rihanna’s Fenty Beauty offer more than 40 shades. Google is continuing work to validate the Monk Skin Tone scale in places like Brazil, India, Mexico, and Nigeria, according to a source familiar with the matter. Further details are expected soon in an academic research article.

The company will now expand its use of the MST. Google Images will offer an option to sort makeup-related search results by skin tone based on the scale, and filters for people with more melanin are coming to Google Photos later this month. If Google adopts the 10-skin-tone scale across its product lines, that could have implications for the fair evaluation of algorithms used in Google search results, Pixel smartphones, YouTube classification algorithms, Waymo self-driving cars, and more.
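
Conceptually, a skin-tone filter like the one described above could work by matching an estimated skin color against reference swatches for the 10 MST tones; a minimal sketch follows. The hex values are approximate, illustrative stand-ins rather than the scale’s official swatches, and plain RGB distance is a crude proxy for the perceptual color matching a production system would use.

```python
# Approximate, illustrative swatches for the 10 MST tones, light to dark.
# These hex values are stand-ins, not the officially published scale.
MST_SWATCHES = [
    "#f6ede4", "#f3e7db", "#f7ead0", "#eadaba", "#d7bd96",
    "#a07e56", "#825c43", "#604134", "#3a312a", "#292420",
]

def hex_to_rgb(h):
    """Convert a '#rrggbb' string to an (r, g, b) tuple of ints."""
    h = h.lstrip("#")
    return tuple(int(h[i:i + 2], 16) for i in (0, 2, 4))

def nearest_mst_tone(rgb):
    """Return the 1-indexed MST bucket whose swatch is nearest in RGB space."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(range(len(MST_SWATCHES)),
               key=lambda i: dist(rgb, hex_to_rgb(MST_SWATCHES[i]))) + 1

# e.g. a skin pixel sampled from an image maps to bucket 7 of 10
print(nearest_mst_tone((130, 90, 60)))
```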

Colorism encoded into technology can lead to undignified outcomes for people with dark skin, such as Google Photos mislabeling pictures of Black people as gorillas, soap dispensers that fail to detect dark skin, and automatically generated stereotypical images. An algorithm that Google developed to identify skin lesions was trained on data that underrepresented people with dark skin. Autonomous driving systems have been found to detect people with dark skin less reliably than those with light skin. Most famously, a 2018 research paper coauthored by Timnit Gebru, former co-lead of Google’s Ethical AI team, concluded that facial recognition algorithms made by major companies performed worse on women with dark skin, work detailed in the documentary Coded Bias.

In the wake of Google firing Gebru in late 2020, the Black in AI and Queer in AI groups pledged to stop accepting funding from Google, and the company’s 2021 diversity report found that attrition rates were highest among Black and Native American women.


Eyeé says further study is needed to validate the results showing that people prefer the Monk scale to Fitzpatrick, and to determine whether the Monk approach leads to more equitable algorithms in fields like dermatology. But early results, especially for groups poorly represented in computer vision data sets, are promising.


