
Ep 8 - Getting started in AI safety & alignment w/ Jamie Bernardi (AI Safety Lead, BlueDot Impact)

Duration: 1:07:23
 

We speak with Jamie Bernardi, co-founder & AI Safety Lead at the not-for-profit BlueDot Impact, which hosts the biggest and most up-to-date courses on AI safety & alignment at AI Safety Fundamentals (https://aisafetyfundamentals.com/). Jamie completed his Bachelor's (Physical Natural Sciences) and Master's (Physics) at the University of Cambridge and worked as an ML engineer before co-founding BlueDot Impact.
The free courses they offer are created in collaboration with people at the cutting edge of AI safety, like Richard Ngo at OpenAI and Prof David Krueger at the University of Cambridge. These courses have been one of the most powerful ways for newcomers to enter the field of AI safety. I myself (Soroush) have taken AGI Safety Fundamentals 101, an exceptional course that was crucial to my understanding of the field and one I can highly recommend. Jamie shares why he got into AI safety, some recent history of the field, an overview of its current state, and how listeners can get involved and start contributing to ensure a safe & positive world with advanced AI and AGI.
Hosted by Soroush Pour. Follow me for more AGI content:
Twitter: https://twitter.com/soroushjp
LinkedIn: https://www.linkedin.com/in/soroushjp/
== Show links ==
-- About Jamie --
* Website: https://jamiebernardi.com/
* Twitter: https://twitter.com/The_JBernardi
* BlueDot Impact: https://www.bluedotimpact.org/
-- Further resources --
* AI Safety Fundamentals courses: https://aisafetyfundamentals.com/
* Donate to LTFF to support AI safety initiatives: https://funds.effectivealtruism.org/funds/far-future
* Jobs + opportunities in AI safety:
  * https://aisafetyfundamentals.com/opportunities
  * https://jobs.80000hours.org
* Horizon Fellowship for policy training in AI safety: https://www.horizonpublicservice.org/fellowship
Recorded Sep 7, 2023
