Content provided by Lukas Biewald. All podcast content (including episodes, graphics, and podcast descriptions) is uploaded and provided directly by Lukas Biewald or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://zh.player.fm/legal

Launching the Fastest AI Inference Solution with Cerebras Systems CEO Andrew Feldman

53:14
 

Manage episode 436503391 series 2777250

In this episode of Gradient Dissent, Andrew Feldman, CEO of Cerebras Systems, joins host Lukas Biewald to discuss the latest advancements in AI inference technology. They explore Cerebras Systems' groundbreaking new AI inference product, examining how its wafer-scale chips are setting new benchmarks in speed, accuracy, and cost efficiency. Andrew shares insights into the architectural innovations that make this possible and discusses the broader implications for AI workloads in production. This episode provides a comprehensive look at the cutting edge of AI hardware and its impact on the future of machine learning.

✅ *Subscribe to Weights & Biases* → https://bit.ly/45BCkYz

🎙 Get our podcasts on these platforms:

Apple Podcasts: http://wandb.me/apple-podcasts

Spotify: http://wandb.me/spotify

Google: http://wandb.me/gd_google

YouTube: http://wandb.me/youtube

Connect with Andrew Feldman:

https://www.linkedin.com/in/andrewdfeldman/

Follow Weights & Biases:

https://twitter.com/weights_biases

https://www.linkedin.com/company/wandb

Join the Weights & Biases Discord Server:

https://discord.gg/CkZKRNnaf3

Paper Andrew referenced, by economic historian Paul David:

https://www.jstor.org/stable/2006600


112 episodes

