Bigotry in the Software

Programmers are always dealing with bugs in their software, but prejudices may be just as much of a concern.

In my browsing, I stumbled across this very interesting NPR story about computer programs and algorithms perpetuating sexism and racism. You can listen to it here:

https://www.npr.org/player/embed/470422089/470447092

Fascinated, I did more research and came across two almost identical articles, and I am certain there are countless more examples to be found.

The first was a Wired story from 2009 commenting on a YouTube video showing video-tracking software displaying racial bias by failing to recognize a black gentleman as a face.

Seven years later, we are still dealing with the same issues.

This Ford Foundation piece specifically targets the issues we run into with big-data algorithms. It outlines how and why these biases show up, with some quick videos detailing these ideas. More importantly, it presents some of the solutions experts have proposed for this problem. Throughout, it heavily references what I can only assume is the same Harvard study mentioned in the NPR article.


To be honest, though, it is easy to see where these biases start. I assembled this table using data from the Bureau of Labor Statistics. Even to someone on the inside, the results are painfully startling.

[Table: computer science workforce statistics, assembled from Bureau of Labor Statistics data]


What I find truly thought-provoking about both halves of this issue is that it is not the computers discriminating; it is the people behind them, and in front of them, if you will.

Of course it is not the digital components themselves that are bigoted. This should be obvious. As is often said in computing, technology is very stupid but very obedient. Computers can only, and will only, do what they are told. Even the big-data algorithms that “learn” do so only because that is what they have been programmed to do. They are no more or less racist or sexist than society. Yet the idea that computers cannot have a bias persists so thoroughly that it can be hard to recognize the truth, and that makes it dangerous. It becomes all too easy to hide behind “it is just a computer” or “it is a machine, it can’t show bias” and forget the human component.

The only way to fix the problem is by refusing to give these stereotypes validity. This isn’t done, however, by ignoring them or pretending the issues aren’t a concern, but rather by acknowledging the problem and admitting that we, personally, have biases. Only by bringing them to the forefront of our personal and societal consciousness can we look for solutions.
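To make that concrete, here is a minimal sketch of how a “learning” program obediently reproduces whatever skew exists in the data it is handed. The numbers and group names are entirely invented for illustration; none of this comes from the articles or study above.

```python
# A toy illustration (hypothetical data) of a model inheriting bias
# from its training data rather than from its own logic.
from collections import defaultdict

# Hypothetical historical hiring records: (applicant_group, was_hired).
# The skew between groups here is invented purely for illustration.
history = (
    [("group_a", True)] * 80 + [("group_a", False)] * 20
    + [("group_b", True)] * 30 + [("group_b", False)] * 70
)

# "Training": tally the historical hire rate for each group.
hired = defaultdict(int)
seen = defaultdict(int)
for group, was_hired in history:
    seen[group] += 1
    hired[group] += was_hired

def predict_hire(group):
    """Recommend hiring whenever the historical hire rate exceeds 50%.

    The model is doing exactly what it was told: mirror the past.
    """
    return hired[group] / seen[group] > 0.5

for group in ("group_a", "group_b"):
    rate = hired[group] / seen[group]
    print(f"{group}: historical hire rate {rate:.0%}, "
          f"model recommends hire: {predict_hire(group)}")
```

Notice that nothing in the code names any group as better or worse; the bias arrives entirely through the historical records the program was given, which is exactly the human component that is so easy to forget.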

