Camera technology has historically overlooked and excluded people of colour, producing unfair results like overbrightened or unnaturally desaturated skin. Our teams are on a mission to build products that work equitably for everyone, so that people from all racial and ethnic groups can enjoy beautiful, representative, and accurate photos and images.
To make Pixel a more inclusive and equitable camera, we partnered with a diverse range of renowned image makers who are celebrated for their depictions of communities of colour. Together, we significantly increased the number of portraits of people of colour in the image datasets that train our camera models. Their feedback, combined with these expanded datasets, helped us make the key improvements across our face detection, camera, and editing products that we call Real Tone.
Our virtual try-on tool in Google Search also reflects our commitment to making imaging technology more inclusive. It can take just one clothing image and accurately show how it would drape, fold, cling, stretch, and form wrinkles and shadows on a diverse set of real people in various poses. To train the generative AI model that powers this feature, we used images of a range of people representing different skin tones, body shapes, ethnicities, and hair types.