
Content is provided by iHeartPodcasts. All podcast content (including episodes, graphics, and podcast descriptions) is uploaded and provided directly by iHeartPodcasts or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://zh.player.fm/legal

Arsenic in My Muffins (with Kasia Chmielinski)

37:07
 

Baratunde knows what is and isn't healthy to eat, thanks to the required nutrition labels on our food. But how do we know the ingredients in the algorithms and AI we depend on are safe to use? Baratunde speaks with Kasia Chmielinski about the Data Nutrition Project, which helps data scientists, developers, and product managers assess the viability, health, and quality of the data that feeds their algorithms and influences our decisions daily.
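
The Data Nutrition Project's label is a report about a dataset rather than a code library, and the episode doesn't walk through any code; still, a small sketch can illustrate the kind of dataset "health" checks such a label summarizes. Everything below is hypothetical: the `toy_dataset_summary` helper, the column names, and the example data are not from the project or the episode, and the sketch assumes pandas is available.

```python
# Illustrative only: a toy "dataset nutrition label"-style summary.
# The real Data Nutrition Project label covers much more (provenance,
# intended use, known gaps and biases); this just shows the flavor of
# automated dataset-health checks.
import pandas as pd


def toy_dataset_summary(df: pd.DataFrame, label_column: str) -> dict:
    """Return a few basic 'ingredient' stats for a tabular dataset."""
    return {
        "rows": len(df),
        "columns": list(df.columns),
        # Share of missing values per column -- gaps can skew a model.
        "missing_fraction": df.isna().mean().round(3).to_dict(),
        # Class balance of the target -- heavy skew is a red flag.
        "label_balance": df[label_column].value_counts(normalize=True).round(3).to_dict(),
        # Duplicate rows inflate the apparent sample size.
        "duplicate_rows": int(df.duplicated().sum()),
    }


if __name__ == "__main__":
    # Hypothetical example data, not drawn from the episode.
    df = pd.DataFrame({
        "age": [34, 51, None, 29, 29],
        "zip": ["02139", "10001", "02139", None, "94110"],
        "approved": [1, 1, 1, 0, 1],
    })
    print(toy_dataset_summary(df, label_column="approved"))
```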

Guest: Kasia Chmielinski

Bio: Co-Founder of the Data Nutrition Project, an affiliate at Harvard’s Berkman Klein Center for Internet & Society, senior research advisor at the Partnership on AI

Online: The Data Nutrition Project website; Kasia on Twitter @kaschm

Show Notes + Links

Go to howtocitizen.com to sign up for show news, AND (coming soon!) to start your How to Citizen Practice.

Please show your support for the show in the form of a review and rating. It makes a huge difference with the algorithmic overlords!

We are grateful to Kasia Chmielinski for joining us! Follow them at @kaschm on Twitter, or find more of their work at datanutrition.org.

ACTIONS

- PERSONALLY REFLECT

Like people, machines are shaped by the context in which they were created. So if we think of machines and algorithmic systems as children who are learning from us, their parents, what kind of parents do we want to be? How do we want to raise our machines to be considerate and fair, and to build a better world than the one we are in today?

- BECOME INFORMED

Watch: Coded Bias

Listen: Radical AI Podcast

Read: Race After Technology, Weapons of Math Destruction, Data Feminism

Make Choices: *privacy not included (consumer guide for buying technologies)

- PUBLICLY PARTICIPATE

Donate to these groups on the front lines ensuring the future of AI is human and just: Algorithmic Justice League, ACLU, Electronic Frontier Foundation

Discuss: Host a book club! The books above are great starting points for gathering folks who want to learn from the literature and each other.

Attend a lecture or event: Data & Society, AI Now

Learn more about your ad-choices at https://www.iheartpodcastnetwork.com

See omnystudio.com/listener for privacy information.

