
Is That AI Racist?

16:47
 
Content provided by the Petrie-Flom Center and Glenn Cohen. All podcast content (including episodes, graphics, and podcast descriptions) is uploaded and provided directly by the Petrie-Flom Center and Glenn Cohen or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined at https://zh.player.fm/legal.

How AI Bias Is Affecting Health Care—And What We Can Do About It

People are biased, and people build AI, so AI is biased, too. When AI is used in hospitals to treat patients, that bias follows it into health care.

For example, a 2019 paper in Science found that a commercial risk-prediction tool was less likely to refer Black patients than equally sick white patients for extra care resources. In fact, only 17.7% of the patients the algorithm assigned to receive extra care were Black; if the algorithm were unbiased, that figure would have been much higher, at 46.5%.

This episode will look at how the racial disparities baked into the health care system also make their way into the AI that the health care system uses, creating a vicious cycle. Nic Terry (an expert in the intersection of health, law, and technology) and Ravi Parikh (a practicing oncologist and bioethicist) will discuss legal and ethical concerns. Michael Abramoff (an ophthalmologist, AI pioneer, and entrepreneur) will share how he's trying to build a fairer AI.

Created with support from the Gordon and Betty Moore Foundation and the Cammann Fund at Harvard University.



