Moments of Permanence - January 5th, 2010

Sometimes the racism is accidental: the problem is the system
02:10 pm
So, there's a kerfuffle.

HP's automatic face-tracking software tends to suck at tracking dark-skinned faces. There's a video: the camera tracks the white person, and breaks when the black person comes into frame.

Here's how I see it:

This is, ultimately, because of racism.

But it's only a rather indirect symptom, in this case.

See, the camera is controlled by the software that drives the face recognition algorithm.

The odds are, the software was tested by the team who were writing it. And they would almost certainly have tested it on themselves, because why bother getting other people in when you yourself have a face to practice on?

The first reason racism is to blame is that institutional racism makes it less likely the team included a dark-skinned person.

The reason this can show up so easily in this situation is that you are dealing with a camera - and a low-grade webcam, at that. The Angry Black Woman points out that adjusting the camera's settings can make the difference between successfully tracking dark-skinned faces and not tracking them at all, and that different shades of skin tone matter too.

For those of you who haven't spent much time playing with cameras or thinking about how they operate: cameras function by the detection of light. Dark-skinned faces reflect less light. That's why they're darker. This means they are more challenging from the perspective of automatic camera controls (and good photography in general).
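To make the "compensate in software" point concrete: HP's actual face-tracking code isn't public, so what follows is only a toy sketch of the kind of compensation I mean. It uses OpenCV's stock Haar-cascade face detector and a local contrast boost (CLAHE), both of which are my choices for illustration, not anything HP actually ships.

    # Toy sketch (not HP's code): boost local contrast before face detection
    # so faces rendered with less luminance still register with the detector.
    # Assumes the opencv-python package and a webcam at index 0.
    import cv2

    # Stock frontal-face detector shipped with OpenCV.
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    # CLAHE spreads out local luminance, so low-contrast regions
    # (darker or backlit faces) aren't lost before detection even starts.
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))

    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break

        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        equalised = clahe.apply(gray)  # the "compensation" step

        # Detect on the compensated image; run this on plain `gray` instead
        # and faces that reflect less light are far more likely to be missed.
        faces = detector.detectMultiScale(equalised, scaleFactor=1.1,
                                          minNeighbors=5)
        for (x, y, w, h) in faces:
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

        cv2.imshow("tracking", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()

The same compensation can also be pushed upstream by nudging the webcam's exposure or gain, which is roughly what the manual settings tweaks The Angry Black Woman describes do by hand. The point is that it's a solvable problem, if the people writing the software think to solve it.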

So here we come to the second reason racism is to blame.

Let's imagine, for a second, that we lived in an alternate universe where black and white roles in history were reversed. The modern world in this alternate universe would give black people systemic advantage, and white people would be second-class citizens.

Now, in this alternate universe, someone is writing the same face-tracking software. They come up against that same issue - darker-skinned faces are harder to track. They realise this instantly, and they adjust the program to compensate automatically. Because they factor dark skin into their plans from the start, instead of just not thinking about it and casually overlooking the substantial segment of the population for whom their program would be rubbish.

This HP webcam thing is not about HP computers being racist (the guy is mostly kidding, but still), or about HP programmers at any point being deliberately racist; this is about a systemic problem with overlooking the existence and importance of people whose skin is darker than beige.

(In the original video, it's also, to an extent, a problem with the guy being backlit, but the camera manages to compensate pretty well for that on the white woman, so it's not actually an excuse.)

HP are not the problem. The problem is in society. This is a symptom.

(Meanwhile, yesterday a Maori who is also a total world history buff told me to go see the movie Avatar. I said I wasn't sure I wanted to, because I'd heard it was hideously colonialist and offensive. He said not to think about that stuff, and just enjoy the movie.)