
PBS Interview: Racial Bias Found in a Medical Algorithm

Source: Kekenet   Editor: Wendy

Hari Sreenivasan: A recent study published in the journal Science found significant racial bias in an algorithm used by hospitals across the nation to determine who needs follow-up care and who does not. Megan Thompson recently spoke with STAT's Shraddha Chakradhar, who explained what the researchers found.

Megan Thompson: Where exactly was this bias coming from?

Shraddha Chakradhar: There are two ways that we can identify how sick a person is. One is how many dollars are spent on that person. You know, the assumption being the more health care they come in for, the more treatment they get, the more dollars they spend, and presumably the sicker they are if they're getting all that treatment. And the other way is that, you know, we can measure actual biophysical things, you know, from lab tests, what kind of conditions or diseases they might have. So it seems like this algorithm was relying on the cost prediction definition. In other words, the more dollars a patient was projected to cost an insurance company or a hospital, the sicker the algorithm took them to be. And that seems to be where the bias emerged.
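The label-choice problem Chakradhar describes can be illustrated with a small, purely hypothetical simulation (the group names, spending rates, and patient counts below are invented for illustration and are not from the study): if one group incurs less cost at the same level of sickness, ranking patients by a cost proxy under-selects that group for follow-up care compared with ranking by actual health burden.

```python
import random

random.seed(0)

# Hypothetical setup: each patient has a true number of chronic conditions;
# observed spending is only a proxy for that burden. Group "B" generates
# less cost per condition (e.g. due to unequal access to care), which is
# the mechanism by which a cost-based label becomes biased.
def make_patients(n, group, spend_per_condition):
    patients = []
    for _ in range(n):
        conditions = random.randint(0, 10)       # true health burden
        cost = conditions * spend_per_condition  # dollars spent (proxy label)
        patients.append({"group": group, "conditions": conditions, "cost": cost})
    return patients

patients = make_patients(500, "A", 1000) + make_patients(500, "B", 600)

k = 200  # capacity of the follow-up care program

def flagged_by(key):
    # Flag the top-k patients when ranked by the given field.
    return sorted(patients, key=lambda p: p[key], reverse=True)[:k]

# Share of group B among flagged patients under each labeling choice.
share_b_cost = sum(p["group"] == "B" for p in flagged_by("cost")) / k
share_b_health = sum(p["group"] == "B" for p in flagged_by("conditions")) / k

print(f"group B share, cost label:   {share_b_cost:.2f}")
print(f"group B share, health label: {share_b_health:.2f}")
```

Under the cost label, equally sick group B patients look cheaper and so fall below the flagging cutoff; switching the ranking to the true condition count restores their share, which mirrors the fix the researchers tested.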

Megan Thompson: I understand that the researchers then re-ran the algorithm using a different type of data. Can you just tell us a little bit more about that? What did they use?

Shraddha Chakradhar: Yeah. So instead of relying on just costs to predict which patients are going to need follow-up care, they actually used biometric data, physical, biophysical, physiological data, and they saw a dramatic difference. You know, in the previous model, the algorithm missed some 48,000 extra chronic conditions that African-American patients had. But when they rejiggered the algorithm to look more at actual biological data, they brought that down to about 7,700. So it was about an 84 percent reduction in bias.
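The 84 percent figure follows directly from the two numbers quoted, as a quick check shows:

```python
missed_before = 48_000  # chronic conditions missed under the cost-based proxy
missed_after = 7_700    # missed after re-ranking on actual biophysical data
reduction = (missed_before - missed_after) / missed_before
print(f"{reduction:.0%}")  # prints 84%
```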

Megan Thompson: Do we know anything about how the use of this biased algorithm actually affected patient care?

Shraddha Chakradhar: We don't actually know that. But as I mentioned, the algorithm is used by hospitals to help them flag patients who might need extra care in the coming year, whether it's, you know, an at-home nurse or making sure that they come in for regularly scheduled doctor's appointments. So we can only presume that if sicker black patients weren't being flagged accurately, they also missed out on this follow-up care.

Megan Thompson: Are there any consequences for the company, Optum, that was behind this algorithm?

Shraddha Chakradhar: Yes. So the day after the study came out, actually, New York regulators, the Department of Financial Services and the Department of Health, sent a letter to the company saying they were investigating this algorithm and that the company had to show that the way the algorithm worked wasn't in violation of anti-discrimination laws in New York. So that investigation is pending. One encouraging thing is that when the researchers did the study, they actually reached back out to Optum and let them know about the discrepancy in the data. And the company was glad to be told about it. And I'm told that they're working on a fix. And the other encouraging thing is that the researchers have actually now launched an initiative to help other companies who may be behind similar algorithms fix any biases in their programs. So they've launched a program based out of the University of Chicago's Booth School to do this work on a pro bono basis, so that they can sort of catch these things in other algorithms that might be used across the country.

Megan Thompson: All right, Shraddha Chakradhar of STAT, thank you so much for being with us.

Shraddha Chakradhar: Thank you for having me.

Key Vocabulary

reduction [ri'dʌkʃən]  n. reduction, decrease; (chemistry) reduction; (mathematics) simplification of a fraction
deception [di'sepʃən]  n. deception, trick, fraud
biased ['baiəst]  adj. prejudiced; (statistics) biased
bias ['baiəs]  n. prejudice; diagonal weave  vt. to bias, to prejudice
initiative [i'niʃətiv]  adj. initial, introductory, spontaneous  n. first step, initiative
determine [di'tə:min]  v. to decide, to resolve, to determine, to measure
previous ['pri:vjəs]  adj. previous, prior, earlier
affected [ə'fektid]  adj. influenced, moved, afflicted by disease
projected [prə'dʒektid]  adj. projected  v. past participle of project
definition [.defi'niʃən]  n. definition, explanation, clarity
