From the New York Times:
When Algorithms Discriminate
JULY 9, 2015, by Claire Cain Miller
The online world is shaped by forces beyond our control, determining the stories we read on Facebook, the people we meet on OkCupid and the search results we see on Google. Big data is used to make decisions about health care, employment, housing, education and policing.
But can computer programs be discriminatory?
There is a widespread belief that software and algorithms that rely on data are objective. But software is not free of human influence. Algorithms are written and maintained by people, and machine learning algorithms adjust what they do based on people’s behavior. As a result, say researchers in computer science, ethics and law, algorithms can reinforce human prejudices.
Google’s online advertising system, for instance, showed an ad for high-income jobs to men much more often than it showed the ad to women, a new study by Carnegie Mellon University researchers found.
Research from Harvard University found that ads for arrest records were significantly more likely to show up on searches for distinctively black names or a historically black fraternity. The Federal Trade Commission said advertisers are able to target people who live in low-income neighborhoods with high-interest loans.
Claire Cain Miller is a representative promoter/consumer of the conventional wisdom (e.g., she pushed Ellen Pao). So, it’s worth noting how she begins her piece by assuming that human frailty and evil must be behind the disparate outcomes of algorithms. Progressives assume they are on the side of Science and Rationality, which have proven that all people are identical, so when robots discover differences, it must be due to Wreckers.


You should give “Wreckers” a hyperlink for the benefit of readers not yet fluent in Sailerese.
Google’s neural net based image recognition software identified a black woman as a gorilla.
Deconstructed, “When Algorithms Discriminate” is about a female tech writer trying to define “algorithm,” and ultimately giving up:
“black box”
“hard to know why websites produce certain results”
etc.
Why don’t these computers advertise tampons equally to men and women?
Why don’t they advertise men’s clothing equally to men and women?
Why don’t they advertise Lexus/Audi/BMW to everyone, no matter what their income bracket is?
The whole purpose of the algorithm is to discriminate. Google only gets paid when people click on the links, so they have a vested interest in only showing links that a person is likely to click.
And correlation isn’t causation. These people aren’t necessarily getting ads for criminal records because they are black, but because they have previously searched for defense attorneys or for self-help legal advice, something that young black men are more likely to have done as a result of their outsized propensity toward violence and their outsized representation in the criminal justice system.
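That click-economics point can be made concrete. The sketch below is a purely illustrative toy, not Google's actual system (the ad topics, bids, and click-rate numbers are invented): it ranks ads by expected revenue per impression, i.e. bid times predicted click probability, and under that objective showing different ads to different users is simply revenue maximization.

```python
# Minimal illustrative sketch (assumed, not any real ad platform):
# rank ads by expected revenue per impression = bid * P(click).

from dataclasses import dataclass

@dataclass
class Ad:
    topic: str
    bid: float  # advertiser's cost-per-click bid, in dollars

def predicted_ctr(ad: Ad, past_searches: set) -> float:
    """Toy click-probability model: prior searches on the ad's topic
    raise the predicted click-through rate."""
    base = 0.01
    if ad.topic in past_searches:
        base += 0.05  # demonstrated interest is the strongest signal
    return base

def rank_ads(ads, past_searches):
    # The platform is paid per click, so it favors the ads this
    # specific user is likely to click -- it discriminates by design.
    return sorted(ads, key=lambda a: a.bid * predicted_ctr(a, past_searches),
                  reverse=True)

ads = [Ad("tampons", 1.50), Ad("defense attorney", 8.00), Ad("luxury car", 4.00)]
user = {"defense attorney", "self-help legal advice"}
for ad in rank_ads(ads, user):
    print(f"{ad.topic}: expected revenue {ad.bid * predicted_ctr(ad, user):.4f}")
```

On those invented numbers, it is the prior legal searches, not any demographic field, that pull the legal-services ad to the top.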
The Singularity is here.
uuuuuuuuueeeeeeeeeeeeeduhhhhhhhhhhhhhhhhh!!!!
What use are algorithms if they do NOT discriminate?
The whole purpose of algorithms IS to discriminate.
If you search for ‘vacation in Greek Isles’, the algorithm had better favor the Greek Isles and not Russian, French, Japanese, Italian, African, Chinese, or Filipino isles.
Without algorithms, all searches will be the same since the system will treat everything equally.
So ALL algorithms, whatever their use or purpose, WILL and MUST discriminate.
Geez.
PS. Even if you take race out of the equation, an elite organization will discriminate in favor of smart people over dumb people.
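To put that in code: here is a toy term-overlap ranker, written for this thread rather than taken from any real search engine, showing that scoring documents unequally against the query is the entire job. Remove the scoring and every result ties.

```python
# Toy relevance scoring (illustrative only): a search algorithm exists
# precisely to treat documents unequally with respect to the query.

def score(query: str, doc: str) -> int:
    """Count query terms that appear in the document."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

docs = [
    "vacation packages for the Greek Isles",
    "vacation packages for the Japanese Isles",
    "winter holidays on the Russian coast",
]
query = "vacation in Greek Isles"

# Without this discrimination every document scores the same and the
# ranking is arbitrary; with it, the Greek Isles page wins.
for doc in sorted(docs, key=lambda d: score(query, d), reverse=True):
    print(score(query, doc), doc)
```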
https://en.wikipedia.org/wiki/Wrecking_(Soviet_crime)
Claire Cain Miller is a see you next Tuesday kinda gal.
I doubt it. They surely know that people aren’t identical. They just want to skew the algorithms enough to comply with the social justice agenda, but no more.
Women are so funny about discrimination and mentioning differences between people. They seem to be entirely in favour of it or entirely against it, depending on who is sticking it to whom and why. They won’t hear a bad word said about some protected or admired group and won’t hear a good one about people who’ve been given the mark of Cain for whatever reason. Algorithms would be fine with Claire Cain Miller if they were sticking it to White male gun owners or people who hog parking spaces on her street. In that case she’d be calling for Skynet to be built to punish them all the way to Gitmo and back.
Women’s brains seem to have no middle setting between shunning and embracing and so they can’t understand how a computer could coolly and unemotionally send people different ads without it being a high tech way to be mean or nice to people. The computers are being rude to protected groups and that’s not nice.
No, Steve is right. Progressives like to assume that people are really only different in trivial or quirky ways but are identical in anything that really matters.
Of course they have science and rationality on their side. Haven’t you noticed that SJWs attach ‘phobia’ to the end of words thereby creating new sciency words for attitudes and viewpoints they don’t like? Seems to me you have conformancephobia or at the very least equalityphobia or perhaps just wontfollowyourbettersphobia.
@Steve, have you noticed that the pig in Animal Farm who is the unseen “wrecker” standing in for Trotsky, and who gets the blame every time things Don’t Go As They Should, is named… Snowball?
Does Snowball sound suspiciously like Whitey to you?
She’s not wrong: these processes are not neutral. The problem is that she is offended that algorithms are being modeled on what people actually want and respond to, not on what her political aspirations tell her must be foisted on society. Furthermore, the problem (for her) is that while humans have a social survival instinct to shut up about inconvenient facts, and to deny doing so if confronted, a computer can only do what it’s programmed or trained to do. That necessitates making all those unutterable rules explicit in the code, which makes the intent of the designer easily discernible.
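That last point about explicitness can be illustrated. A hypothetical sketch, assuming a designer who wants a model to ignore certain attributes: unlike a tactfully silent human, the program’s suppression rule has to sit in the source code, legible to anyone who reads it.

```python
# Hypothetical sketch: a person can quietly decline to notice a pattern;
# a program can only do so via an explicit, inspectable rule like this.

PROTECTED = {"sex", "race"}  # attributes the designer chose to exclude

def sanitize(features: dict) -> dict:
    """Drop protected attributes before the model ever sees them.
    The 'unutterable rule' is right here, discernible in the source."""
    return {k: v for k, v in features.items() if k not in PROTECTED}

print(sanitize({"sex": "F", "race": "white", "income": 90000,
                "recent_searches": ["executive jobs"]}))
```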
Maybe a bit off-topic, but here’s a great quote from 1984.
It was always the women, and above all the young ones, who were the most bigoted adherents of the Party, the swallowers of slogans, the amateur spies and nosers-out of unorthodoxy.
"black box"
"hard to know why websites produce certain results"
etc.Replies: @pyrrhus
This story is an example of the fact that the slightest comprehension of math or statistics disqualifies you from being a right-thinking SJW…
Anupam Datta finds no such flaw in the algorithms produced by Microsoft/Bing. I wonder why: “His group’s work with Microsoft Research produced the first automated privacy compliance analysis of the production code of an Internet-scale system — the big data analytics pipeline for Bing, Microsoft’s search engine.”
It was always the women, and above all the young ones, who were the most bigoted adherents of the Party, the swallowers of slogans, the amateur spies and nosers-out of unorthodoxy.
Wow – that’s a great quote – and totally on topic.
Pretty funny about those bigoted programmers who are making sure that their algorithms determine the sex of the user and deny females the good jobs. I’m sure they don’t have anything better to do.
Worse, progressives assume that people, whether or not they’re fundamentally the same or different in values, ethics, ability, cooperation, etc., are nonetheless totally malleable, at least in the departments which, coincidentally enough, match up with the progressive fetish catalog of correct views on education, sex, and government power: e.g., inside the Muslim wearing a dynamite vest could be a transgender fashion model struggling to get out, or at least a potential social worker dedicated to inner-city youth.
Algorithms are guilty of the crime of noticing.