
Content provided by Upol Ehsan and Shea Brown. All podcast content (including episodes, graphics, and podcast descriptions) is uploaded and provided directly by Upol Ehsan and Shea Brown or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://zh.player.fm/legal

🔥 The Taylor Swift Factor: Deep fakes & Responsible AI | irResponsible AI EP3S01

22:59
 

Got questions or comments or topics you want us to cover? Text us!

As they say, don't mess with Swifties. This episode of irResponsible AI is about the Taylor Swift Factor in Responsible AI:
✅ Taylor Swift's deepfake scandal and what it did for RAI
✅ Do famous people need to be harmed before we do anything about it?
✅ How to address the deepfake problem at the systemic and symptomatic levels
What can you do?
🎯 Two simple things: like and subscribe. You have no idea how much it will annoy the wrong people if this series gains traction.
🎙️Who are your hosts and why should you even bother to listen?
Upol Ehsan makes AI systems explainable and responsible so that people who aren’t at the table don’t end up on the menu. He is currently at Georgia Tech and had past lives at {Google, IBM, Microsoft} Research. His work pioneered the field of Human-centered Explainable AI.
Shea Brown is an astrophysicist turned AI auditor, working to ensure companies protect ordinary people from the dangers of AI. He’s the Founder and CEO of BABL AI, an AI auditing firm.
All opinions expressed here are strictly the hosts’ personal opinions and do not represent their employers' perspectives.
Follow us for more Responsible AI and the occasional sh*tposting:
Upol: https://twitter.com/UpolEhsan
Shea: https://www.linkedin.com/in/shea-brown-26050465/
CHAPTERS:
0:00 - Introduction
01:20 - Taylor Swift Deepfakes: what happened
02:43 - Does disaster need to strike famous people for us to move the needle?
06:31 - What role can RAI play to address this deepfake problem?
07:19 - Disagreement! Deepfakes have both systemic and symptomatic causes
09:28 - Deepfakes, elections, the EU AI Act, and US state legislation
11:45 - The post-truth era powered by AI
15:40 - Watermarking AI generated content and the difficulty
19:26 - The enshittification of the internet
22:00 - Three actionable takeaways
#ResponsibleAI #ExplainableAI #podcasts #aiethics #taylorswift

Support the show

What can you do?
🎯 You have no idea how much it will annoy the wrong people if this series goes viral. So help the algorithm do the work for you!
Follow us for more Responsible AI:
Upol: https://twitter.com/UpolEhsan
Shea: https://www.linkedin.com/in/shea-brown-26050465/



