Can Computer Algorithms Exhibit Bias?

Megalith

Computer scientists believe that machine-learning algorithms can actually learn bias as they mimic human decision-making. Such algorithms are frequently used in applications such as hiring and lending, so that's exactly the kind of thing we want to avoid.

The researchers, led by Suresh Venkatasubramanian, an associate professor in the University of Utah’s School of Computing, have discovered a technique to determine if such software programs discriminate unintentionally and violate the legal standards for fair access to employment, housing and other opportunities. The team also has determined a method to fix these potentially troubled algorithms.
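For context, the legal standard at issue is usually the EEOC's "four-fifths rule" for disparate impact: a protected group's selection rate shouldn't fall below 80% of the most-favored group's. Here's a minimal sketch of what such a check could look like (the data and function names are invented for illustration, not the researchers' actual code):

```python
# Minimal sketch of a disparate-impact check against the EEOC "four-fifths
# rule". The 0.8 threshold is the legal guideline; the data and function
# names below are invented for illustration.

def selection_rate(decisions):
    """Fraction of a group's applicants who got a positive outcome (1)."""
    return sum(decisions) / len(decisions)

# 1 = approved/hired, 0 = rejected
majority_group = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]   # 70% selected
protected_group = [1, 0, 0, 0, 1, 0, 0, 1, 0, 0]  # 30% selected

ratio = selection_rate(protected_group) / selection_rate(majority_group)
print(f"Disparate impact ratio: {ratio:.2f}")      # 0.43
if ratio < 0.8:
    print("Below 0.8 -- the algorithm may have disparate impact.")
```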
 
Wait a second. You're telling me that using a computer program to sift through a resume/cover letter and rank it is now considered biased and shouldn't be used? How about we all just apply for a job by being given a random number, and we can have a lottery to decide who's hired. Screw qualifications.

Of course the algorithms could be used to determine race and gender. I know the article said they aren't being used that way now, but the fear is that they could be in the future. I mean, data mining can already determine if a woman is pregnant just based on what she buys. Why is it any more surprising that people of different ethnicities and genders might have telltale patterns in their writing style? Of course we judge how the resume is written during the hiring process.
 
It doesn't have to be intentional. Training sets might not be representative of the data the system will evaluate in the near or distant future. The model it's based on could be flawed, or it could have been implemented incorrectly.
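To make that first point concrete, here's a toy sketch (all numbers invented) of a training sample whose demographics diverge from the population the model will actually score; any model fit to that sample inherits the skew before a single deliberately "biased" line of code is written:

```python
# Toy example (invented numbers): training data that doesn't match the
# live population the model will score later.

from collections import Counter

training_set = ["urban"] * 90 + ["rural"] * 10   # sample the model learns from
live_traffic = ["urban"] * 55 + ["rural"] * 45   # population it will score

def proportions(records):
    """Share of records belonging to each group."""
    counts = Counter(records)
    return {group: n / len(records) for group, n in counts.items()}

train_dist = proportions(training_set)
live_dist = proportions(live_traffic)

for group in train_dist:
    gap = abs(train_dist[group] - live_dist[group])
    print(f"{group}: trained on {train_dist[group]:.0%}, "
          f"will see {live_dist[group]:.0%} (gap {gap:.0%})")
```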

The problem is that flaws aren't always obvious, and what they're suggesting is essentially additional testing/validation, which either will or won't get done anyway. :p
 
There will always be bias; that's how a selection is made.
 
AI bias is an interesting concept, but these sorts of stories always seem to be about "equal outcomes" as opposed to "equal opportunity."

Especially since housing and finance were among the areas discussed; those areas have been rife with conscious and subconscious bias, and now taking the human element out is still not good enough.
 
I mean, data mining can already determine if a woman is pregnant just based on what she buys.

How?

No, I don't want your "loyalty card".
No, you can't have my phone number.
I pay in cash.
 
Fuck those racist computers for saying no to risky applicants. We can't have a repeat of 2008 fast enough, imho.
 
So what this really means is that even a computer program that analyzes statistics can end up biased. Political correctness is killing this world.
 
Exhibit, learn... I think the best way to say it is that an algorithm will often reinforce a bias that already exists, which presents a problem when trying to change the status quo.
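A toy way to picture that reinforcement (all numbers invented): a model retrained on its own past decisions drifts further from parity every round instead of washing the gap out.

```python
# Toy feedback loop (invented numbers): each retraining round amplifies
# each group's deviation from the mean approval rate, so a small initial
# gap widens over time.

rate_a, rate_b = 0.60, 0.50   # starting approval rates for two groups

for round_num in range(1, 6):
    mean = (rate_a + rate_b) / 2
    # In this toy model, retraining on skewed outcomes amplifies each
    # group's deviation from the mean by 50% per round.
    rate_a += 0.5 * (rate_a - mean)
    rate_b += 0.5 * (rate_b - mean)
    print(f"round {round_num}: group A {rate_a:.2f}, group B {rate_b:.2f}")
```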
 
How?

No, I don't want your "loyalty card".
No, you can't have my phone number.
I pay in cash.
You started getting gas at that place by the maternity store. Your IP address does searches comparing Pampers to Luvs. Your license plate was scanned by a cop roaming the Babies-R-Us parking lot. Your face was in the background of a Barnes and Noble selfie where you were holding a baby names book. There were unusual spikes in your electric bill from the garage door opener when you sent your husband out for pickles and Oreos at 3AM. Your health insurance provider just increased your rates.

There's a thousand different ways to figure it out. That's the thing about big data. It's BIG. And if AI learns that there's a correlation between something like quality of work and race/health/gender/etc... Major shitstorm. Example: You and I may not hire an Indian guy because he got his degree at the University of Crapville, New Delhi and we only take applicants from accredited universities. The computer, however, just sees that we didn't hire an Indian guy, and starts trashing all Indian resumes before a human ever sees them. HR doesn't catch on because they're generally clueless, and five years down the road a damned good candidate for the position affirmative-actions our company. He ends up owning it, and we end up in jail. It's a simple example, and one that I guarantee has been thought of and fixed, but what about the ones we haven't thought of yet?

AI bias is scary... And awesome.
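Here's a toy sketch of exactly that proxy effect (data invented): the model never sees ethnicity, but "university" correlates with it, so a rule learned from past decisions encodes the protected attribute anyway.

```python
# Toy sketch of proxy discrimination (all data invented). The model is
# only shown the university, never the ethnicity, yet its scores still
# split along ethnic lines through the correlated feature.

past_hires = [
    # (university, ethnicity, hired) -- ethnicity is hidden from the model
    ("crapville",  "indian", 0),
    ("crapville",  "indian", 0),
    ("accredited", "indian", 1),
    ("accredited", "other",  1),
    ("accredited", "other",  1),
    ("crapville",  "other",  0),
]

def hire_rate(university):
    """The only 'feature' this naive model learns: hire rate per school."""
    outcomes = [h for (u, _, h) in past_hires if u == university]
    return sum(outcomes) / len(outcomes)

def expected_score(ethnicity):
    """Average model score a group would receive -- the hidden correlation."""
    scores = [hire_rate(u) for (u, e, _) in past_hires if e == ethnicity]
    return sum(scores) / len(scores)

print("crapville hire rate:  ", hire_rate("crapville"))              # 0.0
print("accredited hire rate: ", hire_rate("accredited"))             # 1.0
print("indian applicants:    ", round(expected_score("indian"), 2))  # 0.33
print("other applicants:     ", round(expected_score("other"), 2))   # 0.67
```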
 
Example: A computer that sorts through applications rejects African American applicants at a much higher rate than those of other races.

Is that algorithm racist?

If you said yes, what if I added the fact that the algorithm was simply sorting people who applied for a new mortgage by credit score? It is an unfortunate fact that African Americans as a group have lower average credit scores. So using credit scores to reject mortgage applications will affect a disproportionate number of African Americans.

So the question remains, is the computer algorithm racist?
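To put toy numbers on that scenario (scores invented for illustration): the cutoff itself is colorblind, but unequal score distributions produce unequal rejection rates.

```python
# Toy numbers: a single colorblind credit-score cutoff applied to two
# groups with different score distributions. The rule never looks at
# race, yet the rejection rates come out unequal.

CUTOFF = 650

group_a_scores = [720, 690, 655, 710, 640, 700, 680, 630]  # higher-scoring group
group_b_scores = [640, 610, 700, 590, 660, 620, 600, 645]  # lower-scoring group

def rejection_rate(scores, cutoff=CUTOFF):
    """Fraction of applicants scoring below the cutoff."""
    rejected = sum(1 for s in scores if s < cutoff)
    return rejected / len(scores)

print(f"Group A rejected: {rejection_rate(group_a_scores):.0%}")  # 25%
print(f"Group B rejected: {rejection_rate(group_b_scores):.0%}")  # 75%
```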
 
Example: A computer that sorts through applications rejects African American applicants at a much higher rate than those of other races.

Is that algorithm racist?

If you said yes, what if I added the fact that the algorithm was simply sorting people who applied for a new mortgage by credit score? It is an unfortunate fact that African Americans as a group have lower average credit scores. So using credit scores to reject mortgage applications will affect a disproportionate number of African Americans.

So the question remains, is the computer algorithm racist?

No, but the people claiming so are both hypocritical and naive.
 
Example: A computer that sorts through applications rejects African American applicants at a much higher rate than those of other races.

Is that algorithm racist?

If you said yes, what if I added the fact that the algorithm was simply sorting people who applied for a new mortgage by credit score? It is an unfortunate fact that African Americans as a group have lower average credit scores. So using credit scores to reject mortgage applications will affect a disproportionate number of African Americans.

So the question remains, is the computer algorithm racist?

No, but that is also why we sometimes create different classes for employment and services ... if employment were always based on the best candidate (including experience) then you would almost never hire a new college graduate (since they would never be competitive with experienced workers) ... so companies create jobs that can only be filled by a new college graduate (is that discriminatory, or just a method to force the inclusion of a group the company needs to hire to be a more balanced company?) ;)
 
No, but that is also why we sometimes create different classes for employment and services ... if employment were always based on the best candidate (including experience) then you would almost never hire a new college graduate (since they would never be competitive with experienced workers) ... so companies create jobs that can only be filled by a new college graduate (is that discriminatory, or just a method to force the inclusion of a group the company needs to hire to be a more balanced company?) ;)

Because if you don't hire young, at some point your employees are all either dead, retired or senile. There is a statistically quantifiable future shortfall.

btw, employers would much rather snatch an employee with 2 to 5 years' experience from elsewhere than hire a grad. But you can't depend on those being available, whereas hiring from college is more or less turn-key.
 
Because if you don't hire young, at some point your employees are all either dead, retired or senile. There is a statistically quantifiable future shortfall.

That is not as true as it used to be, since both companies and employees now view their employment with one company as temporary ... I would say the number of employees who stay with a company 10 years or longer is greatly diminishing and will soon be the exception and not the rule (if it is not already).

But the argument is still there: if you can hire a less qualified young person for strategic reasons (and that is acceptable), then why can't you hire an equivalently qualified minority or woman for strategic reasons ... in for a penny, in for a pound ;)
 