
Skim-Aware Contrastive Learning for Efficient Document Representation

About

Although transformer-based models have shown strong performance on word- and sentence-level tasks, effectively representing long documents, especially in fields like law and medicine, remains difficult. Sparse attention mechanisms can handle longer inputs but are resource-intensive and often fail to capture full-document context. Hierarchical transformer models offer better efficiency but do not explicitly model how different sections of a document relate to one another. In contrast, humans often skim texts, focusing on important sections to understand the overall message. Drawing on this human strategy, we introduce a new self-supervised contrastive learning framework that enhances long document representation. Our method randomly masks a section of the document and uses a natural language inference (NLI)-based contrastive objective to align it with relevant parts while distancing it from unrelated ones. This mimics how humans synthesize information, resulting in representations that are both richer and more computationally efficient. Experiments on legal and biomedical texts confirm significant gains in both accuracy and efficiency.

Waheed Ahmed Abro, Zied Bouraoui • 2025
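The contrastive objective described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the function name, the cosine-similarity scoring, and the InfoNCE-style formulation are assumptions made for illustration. The idea shown is that the embedding of a masked-out section (the anchor) is pulled toward embeddings of related sections (positives, e.g. those an NLI model judges as entailed) and pushed away from unrelated ones (negatives).

```python
import math

def _cosine(u, v):
    """Cosine similarity between two embedding vectors (plain lists of floats)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def skim_contrastive_loss(anchor, positives, negatives, temperature=0.07):
    """InfoNCE-style contrastive loss, averaged over positives.

    anchor:    embedding of the masked-out section
    positives: embeddings of sections deemed related (e.g. NLI entailment)
    negatives: embeddings of unrelated sections (e.g. NLI contradiction)
    """
    neg_sims = [_cosine(anchor, n) / temperature for n in negatives]
    loss = 0.0
    for p in positives:
        pos_sim = _cosine(anchor, p) / temperature
        # Softmax cross-entropy with the positive as the "correct" class:
        # low loss when the anchor is far more similar to the positive
        # than to any negative.
        denom = math.exp(pos_sim) + sum(math.exp(s) for s in neg_sims)
        loss += -math.log(math.exp(pos_sim) / denom)
    return loss / len(positives)
```

A real system would replace the toy vectors with transformer section embeddings; the loss itself is agnostic to how the embeddings are produced.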

Related benchmarks

| Task | Dataset | Metric | Score | Rank |
|---|---|---|---|---|
| Document Classification | ECHR | Macro F1 | 57.74 | 16 |
| Document Classification | SCOTUS | Macro F1 | 59.45 | 16 |
| Document Classification | EURLEX | Macro F1 | 43.24 | 16 |
| Document Classification | MIMIC | Macro F1 | 63.89 | 16 |
| Single-label multi-class topic classification | SCOTUS (test) | Micro F1 | 77.5 | 12 |
| Document Classification | BioASQ | Macro F1 | 71.28 | 8 |
| Document Classification | ECHR (test) | Micro F1 | 72.6 | 5 |
