Hey, Mom! The Explanation.

Here's the permanent dedicated link to my first Hey, Mom! post and the explanation of the feature it contains.

Thursday, November 21, 2019

A Sense of Doubt blog post #1738 - Gorillas in the algorithm: Google's Big Mistake


Just outrageous!

The rest of the content is from the outstanding newsletter 20 Minutes into the Future by Daniel Harvey.

Happy Black History Month. Given recent news, I wanted to thread the needle on Google’s own history of mistreatment of black people. First up, 2015–2018: The Gorilla Incident.
Before I get into that detail, though, I wanted to pause to talk about Black History Month. Lonnie Bunch, head of The Smithsonian, says it was
envisioned as a way to counter the invisibility of black people and to challenge the negative imagery and stereotypes that were often the only manner black people were depicted in popular culture and in the media. By emphasising stories of black achievement and resilience, the month would focus a nation’s attention on the positive aspects of black life that was rarely visible.
I want you to hold that idea of invisibility and negative imagery and stereotypes in your head as we get into the details.
Which we’ll do right after I give a word of thanks. This letter and the series it’s in are being brought to you with help. Mike Wamungu and Jason Peart both offered invaluable feedback as I was writing this.
Ok. Let’s dive in. Are you buckled up?
We’ve all grown accustomed to photo software that “automagically” tags photos we take with likely categories. You take a photo of a car (for example) and, ta-da, without you lifting a finger it’s been tagged as such so you can find it easily later. Back in 2015, Google released a new Photos app that leveraged this image classification tech as a core feature.
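To make the mechanics concrete, here’s a rough sketch of that kind of auto-tagging using an off-the-shelf pretrained classifier in Python. The model choice and function name are illustrative assumptions, not Google’s actual pipeline:

    # Minimal photo auto-tagging sketch using a pretrained ImageNet
    # classifier (an illustrative assumption, not Google's pipeline).
    import torch
    from PIL import Image
    from torchvision import models
    from torchvision.models import ResNet50_Weights

    weights = ResNet50_Weights.DEFAULT      # pretrained weights + label metadata
    model = models.resnet50(weights=weights)
    model.eval()
    preprocess = weights.transforms()       # matching resize/normalise steps

    def tag_photo(path, top_k=3):
        """Return the top-k predicted labels for one image file."""
        image = Image.open(path).convert("RGB")
        batch = preprocess(image).unsqueeze(0)   # add a batch dimension
        with torch.no_grad():
            probs = model(batch).squeeze(0).softmax(0)
        top = probs.topk(top_k)
        return [weights.meta["categories"][i] for i in top.indices]

    # tag_photo("car.jpg") might return ["sports car", "convertible", "car wheel"]

The point is that the tags are just the model’s highest-scoring labels, so whatever biases live in the training data flow straight through to the user.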
Jacky Alcine, a black software developer, took to Twitter to show the world how biased and fucked up Google’s image recognition algorithms were. It was tagging photos of him and his friend as “gorillas.” This racist as fuck comparison has a long and brutal history that stretches back centuries and has no fucking place in products from “disruptive, liberally-minded” companies.
You could argue, as many inside and outside of Google have, that this is “unintentional.” Gray’s Law suggests that in cases like this, intentionality is beside the point. It goes: “Any sufficiently advanced incompetence is indistinguishable from malice.”
And this is a case of massive incompetence. This happened because there weren’t ample people of colour on the engineering and design teams, or in whatever user groups Google tested this feature with. This is the inevitable outcome of work done by non-inclusive teams. Racism gets baked in.
We live in a world dominated by technology. We’re never going to get the world we want — a world that celebrates all people and all cultures, a world with social and economic justice for all — until our tech overlords come to terms with their own unconscious bias and racism. If you haven’t already done so, I strongly encourage you to read “So You Want to Talk About Race” by Ijeoma Oluo.
What followed Alcine’s shocking revelations was the usual pattern when big tech companies behave badly on race:
  • A well-deserved tongue-lashing and brow-beating in the tech media
  • And lots of conversation about training data and inclusion
and then not much else.
The issue seemed to resolve fast enough in the product. No more black people getting tagged as gorillas. We all assumed (shame on us) that Google had used its renowned engineering skills to solve the problem.
In 2018 we found out we were all wrong. Instead of actually addressing the issue, Google had simply removed “gorilla” from the classification set. The illusion of moving fast and fixing things by way of censorship-as-a-service.
A Google spokesperson confirmed that “gorilla” was censored from searches and image tags after the 2015 incident, and that “chimp,” “chimpanzee,” and “monkey” are also blocked today. “Image labeling technology is still early and unfortunately it’s nowhere near perfect,” the spokesperson wrote in an email, highlighting a feature of Google Photos that allows users to report mistakes.
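In other words, the “fix” was a label blocklist. Here’s a minimal sketch of what that amounts to; the blocked terms come from the reporting above, but the function itself is hypothetical, not Google’s actual code:

    # Hypothetical sketch of censorship-as-a-service: suppress the label
    # rather than fix the model. Blocked terms are from the reporting above.
    BLOCKED_LABELS = {"gorilla", "chimp", "chimpanzee", "monkey"}

    def filter_tags(predicted_tags):
        """Drop blocklisted labels from a model's predicted tags."""
        return [tag for tag in predicted_tags if tag.lower() not in BLOCKED_LABELS]

    # filter_tags(["gorilla", "person"]) -> ["person"]

Note that nothing about the underlying classifier changes; the offensive prediction is still being made, it just never reaches the user.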
And of course the usual pattern followed:
  • A well-deserved tongue-lashing and brow-beating in the tech media
  • And lots of conversation about training data and inclusion
and then not much else. Those terms are still missing from the classification set. No real progress is visible to end users of Google’s products.
And that, by the way, is exactly the sort of first step you’d expect from an agile tech company. And if you’ve ever worked in an agile tech company, you’ll know that first steps are often the only steps actually taken. Why bother with a real fix when a “minimum viable fix” will do? Why bother with a thorough, rigorously tested solution when you can just hide the problem?
That is unless you have more money than God and can hire a raft of contractors to try and fix your mistakes for you…
File under: #biasedalgorithms #stupidAI #surveillance #inclusion
Next week: How Google contractors tried to fix their classification problem in, you guessed it, the most racist way possible.
20 Minutes into the Future is a critical look at how technology is shaping our lives today, and what actions we can take for a better tomorrow. If you found this newsletter worth your while, then please sign up for a free subscription.

Daniel Harvey writes 20 Minutes into the Future. He is a product designer and has written for Fast Company, Huffington Post, The Drum, & more. If you’re pissed about the current state of tech and want to see us do better, then you’ve found a kindred spirit.
You can email him at daniel.harvey@gmail.com or follow him on Twitter @dancharvey.
If you liked this post from 20 Minutes into the Future, why not share it?
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++


- Bloggery committed by chris tower - 1911.21 - 10:10

- Days ago = 1601

- New note - On 1807.06, I ceased daily transmission of my Hey Mom feature after three years of daily conversations. I plan to continue Hey Mom posts at least twice per week but will continue to post the days-since count ("Days Ago") on my blog each day. The blog entry numbering in the title has changed to reflect total Sense of Doubt posts since I began the blog on 0705.04, which include Hey Mom posts, Daily Bowie posts, and Sense of Doubt posts. Hey Mom posts will still be numbered sequentially. New Hey Mom posts will use the same format as all the other Hey Mom posts; all other posts will feature the format seen here.
