
FBL91: Connor Leahy - The Existential Risk of AI Alignment

53:40
 
Content provided by Singularity University. All podcast content (including episodes, graphics, and podcast descriptions) is uploaded and provided directly by Singularity University or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://zh.player.fm/legal

This week our guest is AI researcher and founder of Conjecture, Connor Leahy, who is dedicated to studying AI alignment. Alignment research focuses on understanding how to build advanced AI systems that pursue the goals they were designed for instead of engaging in undesired behavior. Sometimes this simply means ensuring they share the values and ethics we have as humans, so that our machines don't cause serious harm to humanity.

In this episode, Connor provides candid insights into the current state of the field, including the concerning lack of funding and human resources currently going into alignment research. Amongst many other things, we discuss how the research is conducted, the lessons we can learn from animals, and the kinds of policies and processes humans need to put in place if we are to prevent what Connor currently sees as a highly plausible existential threat.

Find out more about Conjecture at conjecture.dev, or follow Connor and his work at twitter.com/NPCollapse.

**

Apply to register for our exclusive South by Southwest event on March 14th @ www.su.org/basecamp-sxsw

Apply for an Executive Program Scholarship at su.org/executive-program/ep-scholarship

Learn more about Singularity: su.org

Host: Steven Parton - LinkedIn / Twitter

Music by: Amine el Filali
