Content provided by Zeta Alpha. All podcast content (including episodes, graphics, and podcast descriptions) is uploaded and provided directly by Zeta Alpha or its podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://zh.player.fm/legal

Task-aware Retrieval with Instructions

Duration: 1:11:13
Manage episode 355037182 series 3446693

Andrew Yates (Assistant Professor at the University of Amsterdam) and Sergi Castella (Analyst at Zeta Alpha) discuss the paper "Task-aware Retrieval with Instructions" by Akari Asai et al. The paper proposes augmenting a collection of existing retrieval and NLP datasets with natural language instructions (BERRI, Bank of Explicit RetRieval Instructions) and using it to train TART (a Multi-task Instructed Retriever).
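The core idea discussed in the episode — conditioning the retriever on a natural-language task instruction by prepending it to the query — can be sketched as follows. The separator string and the bag-of-words scorer are illustrative assumptions standing in for a learned bi-encoder; this is not the paper's exact implementation:

```python
# Sketch of instruction-conditioned retrieval (illustrative only).
# The instruction describes the task intent, so the same query can
# retrieve different documents under different instructions.

def build_query(instruction: str, query: str, sep: str = " [SEP] ") -> str:
    """Concatenate a task instruction with the user query."""
    return instruction + sep + query

def score(q: str, doc: str) -> int:
    # Toy bag-of-words overlap; a real system would embed q and doc
    # with a trained encoder and rank by vector similarity.
    return len(set(q.lower().split()) & set(doc.lower().split()))

def retrieve(instruction: str, query: str, docs: list[str]) -> str:
    """Return the document that best matches the instructed query."""
    q = build_query(instruction, query)
    return max(docs, key=lambda d: score(q, d))
```

In the TART setting, `build_query`'s output would be fed to a dense retriever trained on BERRI so that the instruction actually steers which documents score highly, rather than relying on token overlap as this toy scorer does.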

📄 Paper: https://arxiv.org/abs/2211.09260

🍻 BEIR benchmark: https://arxiv.org/abs/2104.08663

📈 LOTTE (Long-Tail Topic-stratified Evaluation, introduced in ColBERT v2): https://arxiv.org/abs/2112.01488

Timestamps:

00:00 Intro: "Task-aware Retrieval with Instructions"

02:20 BERRI, TART, X^2 evaluation

04:00 Background: recent works in domain adaptation

06:50 Instruction Tuning

08:50 Retrieval with descriptions

11:30 Retrieval with instructions

17:28 BERRI, Bank of Explicit RetRieval Instructions

21:48 Repurposing NLP tasks as retrieval tasks

23:53 Negative document selection

27:47 TART, Multi-task Instructed Retriever

31:50 Evaluation: Zero-shot and X^2 evaluation

39:20 Results on Table 3 (BEIR, LOTTE)

50:30 Results on Table 4 (X^2-Retrieval)

55:50 Ablations

57:17 Discussion: user modeling, future work, scale
