Content provided by Center for Humane Technology, Tristan Harris, Aza Raskin, and The Center for Humane Technology. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Center for Humane Technology, Tristan Harris, Aza Raskin, and The Center for Humane Technology or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined at https://zh.player.fm/legal
When the "Person" Abusing Your Child is a Chatbot: The Tragic Story of Sewell Setzer

49:10

Content Warning: This episode contains references to suicide, self-harm, and sexual abuse.

Megan Garcia lost her son Sewell to suicide after he was abused and manipulated by AI chatbots for months. Now, she’s suing the company that made those chatbots. On today’s episode of Your Undivided Attention, Aza sits down with journalist Laurie Segall, who's been following this case for months. Plus, Laurie’s full interview with Megan on her new show, Dear Tomorrow.

Aza and Laurie discuss the profound implications of Sewell’s story for the rollout of AI. Social media began the race to the bottom of the brain stem and left our society addicted, distracted, and polarized. Generative AI is set to supercharge that race, exploiting the human need for intimacy and connection amid a widespread loneliness epidemic. Unless we put guardrails on this technology now, Sewell’s story may be a tragic sign of things to come. But it also presents an opportunity to prevent further harm moving forward.

If you or someone you know is struggling with mental health, you can reach out to the 988 Suicide and Crisis Lifeline by calling or texting 988; this connects you to trained crisis counselors 24/7 who can provide support and referrals to further assistance.

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
RECOMMENDED MEDIA

The CHT Framework for Incentivizing Responsible AI Development

Further reading on Sewell’s case

Character.ai’s “About Us” page

Further reading on the addictive properties of AI
RECOMMENDED YUA EPISODES

AI Is Moving Fast. We Need Laws that Will Too.

This Moment in AI: How We Got Here and Where We’re Going

Jonathan Haidt On How to Solve the Teen Mental Health Crisis

The AI Dilemma
