Learning to Retrieve Passages without Supervision: finally unsupervised Neural IR?

Duration: 59:10
 

In this third episode of the Neural Information Retrieval Talks podcast, Andrew Yates and Sergi Castella discuss the paper "Learning to Retrieve Passages without Supervision" by Ori Ram et al.

Despite the massive advances in Neural Information Retrieval in the past few years, statistical models still outperform neural models when no annotations are available at all. This paper proposes a new self-supervised pretraining task for Dense Information Retrieval that manages to beat BM25 on some benchmarks without using any labels.
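
As a concrete illustration of that pretraining idea: the paper derives pseudo query-passage pairs from spans that recur across passages of the same document, then trains the retriever contrastively on those pairs. Below is a minimal, hypothetical sketch of such pair construction; the function names, the brute-force span search, and the choice to drop the span from the query side are illustrative assumptions, not the authors' implementation.

    # Hypothetical sketch: building self-supervised (query, positive) pairs
    # from recurring spans, in the spirit of Ram et al. (arXiv:2112.07708).
    from itertools import combinations

    def longest_recurring_span(passage_a, passage_b, min_len=2):
        """Return the longest word n-gram occurring in both passages, or None."""
        tokens_a, tokens_b = passage_a.split(), passage_b.split()
        ngrams_b = {
            tuple(tokens_b[i:i + n])
            for n in range(min_len, len(tokens_b) + 1)
            for i in range(len(tokens_b) - n + 1)
        }
        for n in range(len(tokens_a), min_len - 1, -1):  # longest first
            for i in range(len(tokens_a) - n + 1):
                span = tuple(tokens_a[i:i + n])
                if span in ngrams_b:
                    return span
        return None

    def build_pseudo_pairs(document_passages):
        """Turn the passages of one document into (query, positive) pairs.

        A passage containing a recurring span becomes the pseudo-query
        (here with the span removed, so the model must rely on context);
        the other passage sharing that span is the positive. Passages
        from other documents would act as negatives during training.
        """
        pairs = []
        for p_a, p_b in combinations(document_passages, 2):
            span = longest_recurring_span(p_a, p_b)
            if span is None:
                continue
            query = p_a.replace(" ".join(span), " ", 1)
            pairs.append((query, p_b))
        return pairs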

Paper: https://arxiv.org/abs/2112.07708

Timestamps:

00:00 Introduction

00:36 "Learning to Retrieve Passages Without Supervision"

02:20 Open Domain Question Answering

05:05 Related work: Families of Retrieval Models

08:30 Contrastive Learning

11:18 Siamese Networks, Bi-Encoders and Dual-Encoders

13:33 Choosing Negative Samples

17:46 Self-supervision: how to train IR models without labels

21:31 The modern recipe for SOTA Retrieval Models

23:50 Methodology: a newly proposed self-supervision task

26:40 Datasets, metrics and baselines

33:50 Results: Zero-shot performance

43:07 Results: Few-shot performance

47:15 In practice, is not needing labels relevant after all?

51:37 How would you "break" the Spider model?

53:23 How long until Neural IR models outperform BM25 out-of-the-box robustly?

54:50 Models as a service: OpenAI's text embeddings API
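
Several of the chapters above (contrastive learning at 08:30, bi-encoders at 11:18, negative sampling at 13:33) revolve around one training recipe: encode queries and passages independently and optimize a contrastive loss in which the other passages in the batch act as negatives. Here is a minimal PyTorch sketch of that in-batch objective; the encoders are abstracted away, and the temperature value is an illustrative assumption rather than a setting from the paper.

    # Minimal sketch of in-batch-negative contrastive training for a
    # bi-encoder retriever; not the paper's training code.
    import torch
    import torch.nn.functional as F

    def in_batch_contrastive_loss(query_vecs, passage_vecs, temperature=0.05):
        """query_vecs, passage_vecs: (batch, dim); row i of each is a positive pair."""
        # Entry (i, j) scores query i against passage j; off-diagonal
        # passages are the in-batch negatives for query i.
        scores = query_vecs @ passage_vecs.T / temperature
        targets = torch.arange(scores.size(0), device=scores.device)
        return F.cross_entropy(scores, targets)

    # Toy usage: random unit vectors stand in for encoder outputs.
    q = F.normalize(torch.randn(8, 128), dim=-1)
    p = F.normalize(torch.randn(8, 128), dim=-1)
    print(in_batch_contrastive_loss(q, p))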

Contact: castella@zeta-alpha.com
