
HP and Nikon Diversity Fail

 

Within two months of each other, there have been two well-reported incidents of computer technologies failing to “recognize” People of Color.  The first involved an HP Pavilion laptop whose face-tracking software reportedly does not recognize some black people’s faces.  In a video uploaded to YouTube in early December, a black man and a white woman demonstrate how the smart technology in the computer “recognizes” the woman’s face and follows her movements as it’s supposed to, but immediately stops working when the black man switches places with her.  The second: Time magazine reported on 22 Jan ’10 that the Nikon Coolpix S630 digital camera had difficulty recognizing "Asian" eyes.  After Taiwanese-American blogger Joz Wang (aka “Jozjozjoz”) took a photo, the message “Did someone blink?” popped up on the screen.  Only after Wang’s brother opened his eyes as wide as they could go did the message disappear.  That Nikon, a Japanese company, wasn’t able to make a camera that would recognize “Asian” eyes is absolutely ridiculous.

At first glance, it’s easy to laugh at both these instances of tech fail.  The video “HP Computers are Racist” made me guffaw; it has already reached nearly 2 million views in the month and a half since it was uploaded.  I’ve seen Jozjozjoz’s blog post referenced on numerous websites, as well.  But once the laughter dies down, we have to ask ourselves if it’s really funny after all.  Both of these incidents indicate what some have called a "white preference" when it comes to making, selling, and branding technology.

It would seem relatively commonsensical (to make up a word) that one would test a product made for the masses on a wide range of people.  Multiple millions of dollars go into R&D each year, and yet it seems there was a lack of minorities involved in the process of testing these products, or perhaps some sort of consensus that minorities wouldn’t be the main purchasers of them.  It’s a glaring omission on either count, for it dismisses minorities and their interest in technology, and it alludes to stale (and particularly annoying) assumptions about purchasing power, access to technology, and the desire to own high-tech equipment.

Most often, the assumption is that because of my ethnic group, I won’t have the money to buy a new computer, an MP3 player, or any of the other cool gadgets out there.  I feel the after-effects of those assumptions every time I go into a well-staffed tech store and have to hunt down a salesperson to help me, when I opt to pay for something in cash instead of with credit, or when someone, attempting to be kind, offers to show me the “more affordable” options in the store despite the research I compiled on the (usually more expensive) product that I want.

It’s easy to dismiss people’s complaints about these kinds of incidents as hypersensitivity, but that blames the victim and doesn’t take the multinational companies to task for not doing their homework.  It’s like when Facebook published results on the ethnic makeup of its users based on an algorithm that used last names to determine ethnicity; the algorithm didn’t take into account mixed-race Americans, the impact slavery has had on surnames (slaves were made to take the last names of their white masters, so someone with the last name McGibbon can be black or white), or how some names found in both Asia and the US, such as “Lee,” are ethnically ambiguous.

Many will say what happened at HP and Nikon is just a glitch in the system, but for certain minority groups, who are often rendered invisible in marketing campaigns (such as in ads for makeup and hair products) or hypervisible (as in ads for alcohol and cigarettes), it speaks to the same ole, same ole, and it’s disappointing that supposedly cutting-edge technology companies are working by the same old rules.

 


2 comments

"Many will say what happened at HP and Nikon is just a glitch in the system..."

I think it's interesting that a similar sort of language is used to defend racism even outside explicitly technological realms.  Incidents of racial bias are not the result of individual choices or fundamental institutional inequities, this line of thought holds, but rather of the system just not working the way it was "supposed to."  It's a bug in the code, not a problem with the code itself.

It makes me wonder to what extent ideas about the objectivity and universality of technology and technological systems have bled into everyday discourse.  If we see machines as designed rationally and mathematically (not problematically or culturally), and we see our social world as a "machine" (well-oiled or otherwise), then it becomes all too easy to assume that problems must be the result of a slipped gear, a glitchy line of code, a jostled circuit board.  Doing so ignores the ways in which it is precisely the social machine (the material, imaginative, economic, and ideological infrastructure of society) that has had racism built into it.

The problem is often not a glitch in the system but the system itself.


 

Thanks so much for your comment! You bring up some really good points, and I think what’s underneath them is that, in reality, no one wants to believe they harbor prejudices or outdated ideas about others.  It’s easier to blame a machine than to really tackle problems.  Blaming a machine (a rational, unthinking object) allows us to say, “If a machine had a problem, then I am okay, because the machine made a bad decision and it has no feelings.”  I think everyone does this. We look to our horoscope to explain why we had a bad day and to full moons to explain why we had trouble communicating (although I will say I totally believe that full moons jack up the communication process. ^-^) So often we look outside of ourselves to explain why things don’t work and how things messed up, allowing ourselves to displace the blame. You’re right when you say the system is the problem, but since we made the system and are always already interpellated within it, it’s up to us to be really honest about our complicity in perpetuating it.

 
