

The human bias in robo-recruitment systems

Source: Kekenet (可可英语) | Editor: alice

Advances in artificial intelligence and the use of big data are changing the way many large companies recruit for entry level and junior management positions. These days, graduates’ CVs may well have to impress an algorithm rather than an HR executive.

“There’s been a dramatic increase in the use of automation in [high] volume selection processes over the past two years,” says Sophie Meaney, managing director, client solutions and strategic development at Amberjack, which provides and advises on automated recruitment processes.
While algorithms supposedly treat each application equally, experts are divided about whether so-called robo-recruitment promises an end to human bias in the selection process — or whether it may in fact reinforce it.
“AI systems are not all equal,” says Loren Larsen, chief technology officer for HireVue, which has developed an automated video interview analysis system. It has been used by companies including Unilever, the consumer goods group, Vodafone, the telecoms company, and Urban Outfitters, the retailer. “I think you have to look [at] the science team behind the work,” says Mr Larsen.
The problem, experts say, is that to find the best candidates an algorithm has first to be told what “good” looks like in any given organisation. Even if it is not fed criteria that seem discriminatory, an efficient machine-learning system will quickly be able to replicate the characteristics of existing workers. If an organisation has favoured white male graduates from prestigious universities, the algorithm will learn to select more of the same.
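The dynamic described above can be made concrete with a deliberately tiny sketch (entirely hypothetical data, not any vendor's code): a scoring model "trained" on a company's past hires will rank new candidates by how much they resemble those hires, so any skew in the historical data reappears in the ranking.

```python
# Minimal sketch with hypothetical data: a model that learns what "good"
# looks like from past hires reproduces whatever skew those hires contain.
from collections import Counter

# Historical hires, dominated by one background ("prestige" universities,
# a proxy that can correlate with protected characteristics).
past_hires = ["prestige"] * 90 + ["other"] * 10

# "Training": estimate how often each background appears among past hires.
background_counts = Counter(past_hires)
total = sum(background_counts.values())

def score(candidate_background: str) -> float:
    """Score a candidate by resemblance to the historical hiring pattern."""
    return background_counts[candidate_background] / total

# An equally qualified pool, split evenly, is ranked unequally:
pool = ["prestige", "other"]
ranked = sorted(pool, key=score, reverse=True)
print(ranked)                              # ['prestige', 'other']
print(score("prestige"), score("other"))   # 0.9 0.1
```

Nothing discriminatory was fed in explicitly; the skew comes entirely from the training data, which is the point the experts make.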
The growing reliance on automation to judge suitability for everything from a loan to a job or even to probation in the criminal justice system, worries Yuriy Brun, an associate professor specialising in software engineering at the University of Massachusetts.
“A lot of the time a company will put out software but they don’t know if it is discriminatory,” he says. He points to the Compas tool in use in several US states to help assess a person’s likelihood to reoffend, which was reported to have discriminated against African Americans.
Prof Brun explains that, given the use of big data, algorithms will inevitably learn to discriminate. “People see that this is a really important problem. There’s a real danger of making things worse than they already are,” he says. His concern led him to co-develop a tool that tests systems for signs of bias.
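One common audit check that tools of this kind perform is demographic parity: comparing selection rates across groups. The sketch below illustrates the idea with hypothetical screening outcomes; it is an assumption about the general approach, not Prof Brun's actual tool.

```python
# Hedged sketch of a demographic-parity audit (not Prof Brun's tool):
# compare the rate at which each group is selected by the system.

def selection_rate(decisions):
    """Fraction of candidates in a group who were selected (1 = selected)."""
    return sum(decisions) / len(decisions)

def demographic_parity_gap(decisions_by_group):
    """Difference between the highest and lowest group selection rates."""
    rates = [selection_rate(d) for d in decisions_by_group.values()]
    return max(rates) - min(rates)

# Hypothetical screening outcomes (1 = advanced to interview):
outcomes = {
    "group_a": [1, 1, 1, 0, 1, 1, 0, 1],  # 75% advance
    "group_b": [1, 0, 0, 0, 1, 0, 0, 0],  # 25% advance
}

gap = demographic_parity_gap(outcomes)
print(f"parity gap: {gap:.2f}")  # 0.50 -- a large gap flags possible bias
```

A gap near zero does not prove fairness, but a large one is exactly the kind of signal an automated audit surfaces for human review.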
Many of those working with robo-recruiters are more optimistic. Kate Glazebrook, chief executive of Applied, a hiring platform, says her mission is to encourage hiring managers to move away from what she calls “proxies for quality” — indicators such as schools or universities — and move to more evidence-based methods.
“In general, the more you can make the hiring process relevant, the more likely that you will get the right person for the job,” she says.
Applied anonymises tests that candidates complete online and feeds them, question by question, to human assessors. Every stage of the process has been designed to strip out bias.
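The blind, question-by-question review described above can be sketched in a few lines (an assumption about the general approach, not Applied's implementation): identities are replaced with opaque IDs, and answers are regrouped so assessors see all responses to one question at a time.

```python
# Sketch of blind, question-by-question review (hypothetical, not
# Applied's code): strip names, regroup answers per question.

candidates = [
    {"name": "Alice", "answers": ["A1", "A2"]},
    {"name": "Bob",   "answers": ["B1", "B2"]},
]

# Replace names with opaque IDs; the mapping is hidden from assessors.
id_map = {f"cand-{i}": c for i, c in enumerate(candidates)}

# Regroup: assessors score every answer to question 0, then question 1,
# never seeing a whole application or a name at once.
by_question = {}
for cid, cand in id_map.items():
    for q_index, answer in enumerate(cand["answers"]):
        by_question.setdefault(q_index, []).append((cid, answer))

print(by_question[0])  # [('cand-0', 'A1'), ('cand-1', 'B1')]
```

Scoring one question across all candidates, rather than one candidate across all questions, is what keeps halo effects from a strong early answer out of later judgments.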
With the same aim, Unilever decided in 2016 to switch to a more automated process for its graduate-level entry programme, which has about 300,000 applicants a year for 800 positions.
Unilever worked with Amberjack, HireVue and Pymetrics, another high-volume recruitment company, which developed a game-based test in which candidates are scored on their ability to take risks and learn from mistakes, as well as on emotional intelligence.
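To make "scored on risk-taking and learning from mistakes" concrete, here is a hypothetical sketch in the style of a pump-the-balloon game (a common behavioural task; this is an illustration, not Pymetrics' scoring code): each pump raises the payout, a pop forfeits the round, and play after a pop reveals whether the candidate adjusts.

```python
# Hypothetical balloon-game scoring sketch (not Pymetrics' code):
# more pumps before banking = more risk taken; a pop loses the round.

rounds = [
    {"pumps": 5,  "popped": False},   # banked early
    {"pumps": 12, "popped": True},    # pushed too far, balloon popped
    {"pumps": 8,  "popped": False},   # banked after pulling back a little
]

banked = [r for r in rounds if not r["popped"]]
risk_score = sum(r["pumps"] for r in banked) / len(banked)

# Did the candidate pump less after experiencing a pop?
learned_from_mistake = rounds[-1]["pumps"] < rounds[1]["pumps"]

print(risk_score, learned_from_mistake)  # 6.5 True
```

Behavioural signals like these, rather than CV line items, are what the game-based test feeds into the selection model.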


Unilever says the process has increased the ethnic diversity of its shortlisted candidates and has been more successful at selecting candidates who will eventually be hired.

“The things that we can do right now are stunning, but not as stunning as we’re going to be able to do next year or the year after,” says Mr Larsen.
Still, robo-recruiters must be regularly tested in case bias has crept in, says Frida Polli, chief executive of Pymetrics. “The majority of algorithmic tools are most likely perpetuating bias. The good ones should have auditing.”

Key vocabulary

- diversity [dai'və:siti] — n. difference, diversity, variety
- bias ['baiəs] — n. bias, prejudice; (of fabric) the bias; vt. to bias, to prejudice
- associate [ə'səuʃieit] — n. companion, partner, associate; holder of an associate degree
- majority [mə'dʒɔriti] — n. the majority, most; majority party
- application [.æpli'keiʃən] — n. application; (job) application; diligence; application software
- dramatic [drə'mætik] — adj. dramatic, striking, impressive
- inevitably [in'evitəbli] — adv. inevitably
- efficient [i'fiʃənt] — adj. efficient, competent
- loan [ləun] — n. loan; v. to loan, to lend
- analysis [ə'næləsis] — n. analysis
