Are hiring algorithms fair? They're too opaque to tell, study finds

Nov 21, 2019, 1:33pm UTC
Anonymous

https://www.sciencedaily.com/releases/2019/11/191120175616.htm

Hiring decisions are rife with human bias, leading some organizations to hand off at least part of their employee searches to outside tech companies that screen applicants with machine learning algorithms. If humans have such a hard time finding the best fit for their companies, the thinking goes, maybe a machine can do it better and more efficiently.

But new research from a team of Computing and Information Science scholars at Cornell University raises questions about those algorithms and the tech companies that develop and use them: How unbiased is the automated screening process? How are the algorithms built? And by whom, toward what end, and with what data?
