Amazon Scraps Biased Recruiting AI

This is total nonsense! If the AI is basing its decision to hire men on the M/F box, then remove the M/F box; the AI would then be forced to pick the best qualified regardless of sex. But since that fix is trivial, and the AI obviously still picked more men than women, they MUST stop using it, because it only cares about who is best qualified and seems to be ignoring the SJW dogma.


Actually, after reading a couple of other posts I start coming to a different viewpoint on this.

First, I've never listed my "maleness" on any resume I have written. Nowhere do I type MALE, or FEMALE for that matter. But I do realize that "Mr. Price" and "his experience working with ....." would likely lead any HR person to assume that I am, in fact, male.

I had not considered that the goal may have been to use AI to strip indications of gender from the resumes and then, as the report suggests, select potential candidates for further consideration by hiring staff.

Or perhaps the point was for the AI to "parse" the resumes into database formats, not to edit out gender at all, and simply make its selections neutrally.
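If the goal really was to strip gender indications before screening, a toy sketch of that idea might look like this (the word list and placeholder are my own illustration, not anything from the report):

```python
import re

# Toy sketch: blank out obvious gender signals (honorifics, pronouns,
# explicit labels) from resume text before it reaches a screening model.
# The word list is illustrative, not exhaustive.
GENDERED = {
    "mr", "mrs", "ms", "miss", "he", "him", "his",
    "she", "her", "hers", "male", "female",
}

def redact_gender(text: str) -> str:
    # Replace gendered tokens with a neutral placeholder,
    # leaving the rest of the resume intact.
    def sub(match: re.Match) -> str:
        word = match.group(0)
        return "[REDACTED]" if word.lower() in GENDERED else word
    return re.sub(r"[A-Za-z]+", sub, text)

print(redact_gender("Mr. Price: his experience working with AWS"))
# prints: [REDACTED]. Price: [REDACTED] experience working with AWS
```

Of course, as the posts below point out, word choice and style can still leak gender even after a pass like this.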
 

The thing to remember about deep learning AI is that it's trained, not told what to do.

For deep learning, a reward/score is defined by the programmers, and that's what the AI uses as it learns how to rank candidates.

If the AI is fed mostly male applications, then it will start to "believe" that all good applications are written by males; there are differences in how males and females tend to write resumes, like word choice and how they present themselves (neither being "the better way", mind you, just different).
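A toy illustration of that point (the resumes and word choices here are invented, and this is a crude frequency scorer, nothing like Amazon's actual system): a model trained on a skewed pool learns that words common in the majority group's resumes look "good", even though the words say nothing about skill.

```python
from collections import Counter

# Training pool: four "good" resumes, three written in a style
# (hypothetical word choices) over-represented among one group.
good_resumes = [
    "led team executed aggressive delivery",
    "executed competitive project targets",
    "drove aggressive growth results",
    "collaborated supported mentored colleagues",
]

# "Training": count how often each word appears in good resumes.
word_counts = Counter(w for r in good_resumes for w in r.split())

def score(resume: str) -> int:
    # Higher score = more overlap with words the model saw in "good"
    # resumes. Missing words count as zero.
    return sum(word_counts[w] for w in resume.split())

# Two equally qualified candidates, different word choices:
print(score("executed aggressive delivery"))      # prints 5
print(score("collaborated supported colleagues")) # prints 3
```

The second candidate scores lower purely because their style was under-represented in the training pool, which is essentially the failure mode being described.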
 


I'm certain you are right; it certainly sounds right.

But I can't imagine that no goals were set for the outcome: that there isn't an end point, a capability, a capacity, or a dollar figure the effort is intended to reach.

As such, I can see where such targets could intrude upon the learning process.
 

I have never included the word "Mr." (and I am male) or any other indication of gender on my resume besides my name. If this was done the right way, they should've gone through and stripped out the name and any gendered pronouns. What would be left behind would be a genderless resume full of what Amazon wanted.

Without knowing their process, it's very difficult to say where it went wrong, or if it even went wrong at all. It could very well be that men were simply more qualified for those positions among that pool of applicants. I have always argued that a TRUE fair hiring system doesn't mandate quotas for women or minorities, because hiring someone because they are a woman or a minority isn't fair to anyone, least of all them. If they are qualified, by all means give them the job, but don't discriminate in their favor...
 


Then we pretty much agree on this. Still, as I said before, we don't actually know what their goal for this AI was, do we?
 