Google —
Challenging marketers to do better with their data.
ML Fairness / Interactive Microsite
the ask
Demystify data bias to show the importance of DE&I in tech.
Google is a champion for change, especially when it comes to lending their unique insights as a tech company to issues of DE&I. So they tackled the topic like only they could… with research. Five workshops and 44 interviews later, one thing was clear: algorithmic bias is a huge problem in marketing. And since answering a data problem with more data felt like a numbers turducken, Google needed an innovative way to visualize the findings for busy marketers.
the insight
Marketers aren’t thinking about data’s role in diversity.
Your campaign is looking good. The art direction feels right, the animation's dialed in, and the copy is really smart. You and your team have had powerful discussions about DE&I efforts in marketing: casting, direction, photography, ADA compliance. You've put in the work, and your content reflects that. This should be a smashing success, but there's just one problem… the data that guided strategy, targeting, and timing was biased.
the solution
AI-generated portraits that put bias on display.
We created a compelling interactive story that used real people's experiences to humanize key findings from Google's study, proving that algorithms are only as good as the data they're using. By leveraging an AI-powered illustration style, we rendered human portraits with code to show marketers how skewed data distorts the way they see their customers. Literally.

The research came to life through a variety of content types, including audio from DE&I leaders.

Our custom dev tool converted pics into pixel-perfect ASCII art.
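
For the curious: the dev tool itself isn't public, but the core image-to-ASCII technique is simple to sketch. Below is a minimal Python example assuming a Pillow-based pipeline; the character ramp, cell proportions, and function names are illustrative choices, not the actual tool's internals.

# Minimal sketch of the general image-to-ASCII technique.
# An illustrative reconstruction, not the production tool: the ramp,
# column count, and 2:1 character aspect ratio are all assumptions.
from PIL import Image

# Characters ordered from darkest to lightest (an assumed ramp).
RAMP = "@%#*+=-:. "

def image_to_ascii(path: str, cols: int = 80) -> str:
    img = Image.open(path).convert("L")   # load and convert to grayscale
    w, h = img.size
    cell_w = w / cols
    # Terminal characters are roughly twice as tall as they are wide,
    # so halve the vertical resolution to keep proportions intact.
    rows = max(1, int(h / (cell_w * 2)))
    img = img.resize((cols, rows))
    pixels = img.load()
    lines = []
    for y in range(rows):
        # Map each pixel's brightness (0-255) to a ramp character.
        line = "".join(
            RAMP[min(int(pixels[x, y] * len(RAMP) / 256), len(RAMP) - 1)]
            for x in range(cols)
        )
        lines.append(line)
    return "\n".join(lines)

if __name__ == "__main__":
    print(image_to_ascii("portrait.jpg", cols=100))

The same idea scales up to the portrait treatment: the denser the source pixels, the darker the glyph, so a skewed or degraded input visibly distorts the face it renders.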

the results
People were called to action.
Google gave it a spot on the front page of Think with Google, supported it with paid media (special treatment for a TwG article) to make sure marketers got the message, and shared the project widely throughout the company. We’re really proud of what we created for this one, and Google was too.