
#175 Where DeepL Beats ChatGPT with Graham Neubig

38:53
 
Content provided by Slator. All podcast content (including episodes, graphics, and podcast descriptions) is uploaded and provided directly by Slator or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://zh.player.fm/legal

In this week’s SlatorPod, we are joined by Graham Neubig, Associate Professor of Computer Science at Carnegie Mellon University, to discuss his research on multilingual natural language processing (NLP) and machine translation (MT).
Graham discusses the research at NeuLab, which focuses on various areas of NLP, including incorporating broad knowledge bases into NLP models and code generation.
Graham expands on his Zeno GPT-MT Report, which compares large language models (LLMs) with special-purpose machine translation systems such as Google Translate, Microsoft Translator, and DeepL. He reveals that GPT-4 is competitive when translating from English into other languages but struggles with very long sentences.
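
For readers curious what sentence-level LLM translation looks like in practice, here is a minimal sketch of prompting a GPT model to translate, using the OpenAI Python SDK. It is illustrative only and not the Zeno GPT-MT evaluation setup; the model name, prompt wording, and helper function are assumptions made for the example.

```python
# Minimal sketch: prompting an LLM for sentence-level translation.
# Illustrative only; the Zeno GPT-MT Report uses its own prompts,
# test sets, and evaluation pipeline.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def translate(sentence: str, source_lang: str, target_lang: str, model: str = "gpt-4") -> str:
    """Ask the model to translate a single sentence and return the raw output."""
    response = client.chat.completions.create(
        model=model,
        messages=[
            {
                "role": "user",
                "content": (
                    f"Translate the following {source_lang} sentence into {target_lang}. "
                    f"Return only the translation.\n\n{sentence}"
                ),
            }
        ],
        temperature=0,  # keep the output as deterministic as possible for evaluation
    )
    return response.choices[0].message.content.strip()

print(translate("Machine translation keeps getting better.", "English", "German"))
```
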
When it comes to cost comparison, Graham highlights that GPT-3.5 Turbo (the model behind the free version of ChatGPT) is significantly cheaper than Google Translate and Microsoft Translator, but GPT-4 (available via OpenAI’s subscription) is more expensive.
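
To make the cost gap concrete, a back-of-the-envelope calculation like the one below is enough. The per-character and per-token prices are illustrative list prices from around the time of the episode (mid-2023), and the characters-per-token ratio is a rough assumption; none of these figures are quoted in the episode.

```python
# Back-of-the-envelope cost of translating 1M characters of English text.
# All prices below are illustrative mid-2023 list prices (assumptions),
# not figures quoted in the episode.

CHARS = 1_000_000
CHARS_PER_TOKEN = 4.0              # rough average for English text (assumption)

# Dedicated MT APIs (Google Translate, Microsoft Translator) bill per character,
# on the order of $20 per million characters.
mt_api_cost = (CHARS / 1_000_000) * 20.0

# GPT-3.5 Turbo bills per token, with separate input and output rates
# (roughly $0.0015 / 1K input tokens and $0.002 / 1K output tokens at the time).
input_tokens = CHARS / CHARS_PER_TOKEN
output_tokens = input_tokens       # assume the translation is about the same length
gpt35_cost = (input_tokens / 1000) * 0.0015 + (output_tokens / 1000) * 0.002

print(f"Dedicated MT API: ~${mt_api_cost:.2f} per 1M characters")
print(f"GPT-3.5 Turbo:    ~${gpt35_cost:.2f} per 1M characters")
# Under these assumptions the LLM route is roughly 20x cheaper; GPT-4's much
# higher per-token rates push its cost back above the dedicated MT APIs.
```
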
Graham predicts that companies will likely move towards using general-purpose LLMs and fine-tuning them for specific tasks like translation. The discussion also covers the recent flurry of speech-to-speech machine translation system releases.
Graham talks about his startup, Inspired Cognition, which aims to provide tools for building and improving AI systems, particularly in text and code generation. He concludes the pod with advice for new graduates in the NLP field and his plans for Zeno and the Zeno Report.


Chapters

1. Intro and Agenda (00:00:00)

2. Professional Background and Interest in Language (00:01:05)

3. Research at NeuLab (00:03:56)

4. Impact of ChatGPT on NLP (00:05:05)

5. Context in Machine Translation and LLMs (00:07:20)

6. How GPT Handles Machine Translation (00:12:43)

7. GPT Cost Comparison for Machine Translation (00:19:13)

8. How LLMs Will Evolve (00:23:07)

9. Why so Many Speech Translation Releases? (00:24:57)

10. LLMs and Low-Resource Languages (00:29:45)

11. Launching Inspired Cognition (00:32:17)

12. Advice to Graduate Students (00:35:29)

13. Plans for 2023 and Beyond (00:37:15)
