
ReadTwice: Reading Very Large Documents with Memories

About

Knowledge-intensive tasks such as question answering often require assimilating information from different sections of large inputs such as books or article collections. We propose ReadTwice, a simple and effective technique that combines several strengths of prior approaches to model long-range dependencies with Transformers. The main idea is to read text in small segments, in parallel, summarizing each segment into a memory table to be used in a second read of the text. We show that the method outperforms models of comparable size on several question answering (QA) datasets and sets a new state of the art on the challenging NarrativeQA task, with questions about entire books. Source code and pre-trained checkpoints for ReadTwice can be found at https://goo.gle/research-readtwice.
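The core idea above — encode segments independently, pool each into a memory entry, then let a second read attend over the shared memory table — can be sketched in a toy form. This is a minimal illustration, not the paper's implementation: random arrays stand in for Transformer hidden states, mean pooling stands in for the learned summarization, and the dimensions are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions; ReadTwice's real encoder is a Transformer.
num_segments, seg_len, d_model = 4, 8, 16

# First read: each segment is encoded independently, in parallel (here,
# random "hidden states" stand in for Transformer outputs), and summarized
# into a single memory vector per segment.
segments = rng.normal(size=(num_segments, seg_len, d_model))
memory_table = segments.mean(axis=1)  # shape: (num_segments, d_model)

def attend(queries, keys, values):
    """Plain scaled dot-product attention with a stable softmax."""
    scores = queries @ keys.T / np.sqrt(keys.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ values

# Second read: tokens in each segment attend over the global memory table,
# pulling in information summarized from every other segment.
second_read = np.stack([
    attend(seg, memory_table, memory_table) for seg in segments
])
print(second_read.shape)  # (4, 8, 16)
```

The key property this toy preserves is that the first read is embarrassingly parallel across segments, while cross-segment information flows only through the compact memory table during the second read.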

Yury Zemlyanskiy, Joshua Ainslie, Michiel de Jong, Philip Pham, Ilya Eckstein, Fei Sha • 2021

Related benchmarks

Task: Question Answering
Dataset: TriviaQA (Wikipedia domain)
Result: EM 76.86
Rank: 6
