
Hyper-Connections

About

We present hyper-connections, a simple yet effective method that can serve as an alternative to residual connections. This approach specifically addresses common drawbacks observed in residual connection variants, such as the seesaw effect between gradient vanishing and representation collapse. Theoretically, hyper-connections allow the network to adjust the strength of connections between features at different depths and dynamically rearrange layers. We conduct experiments focusing on the pre-training of large language models, including dense and sparse models, where hyper-connections show significant performance improvements over residual connections. Additional experiments conducted on vision tasks also demonstrate similar improvements. We anticipate that this method will be broadly applicable and beneficial across a wide range of AI problems.
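To make the idea concrete, here is a minimal numpy sketch of a hyper-connection wrapper, written from the abstract's description only: the network keeps several copies of the hidden state and learns weights that mix the copies into each layer's input and redistribute the layer's output back across them. The class and parameter names (`HyperConnection`, `alpha`, `beta`, `A`) and the exact update rule are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def linear_layer(h, W, b):
    """Stand-in for a transformer block: any function mapping d -> d."""
    return np.tanh(h @ W + b)

class HyperConnection:
    """Sketch of a static hyper-connection with expansion rate n.

    Instead of a single residual stream x, the network keeps n copies
    x_1..x_n. Learnable weights mix the copies into the layer input
    ("depth" connections) and redistribute the layer output and the
    copies themselves ("width" connections). All names are assumed.
    """
    def __init__(self, n):
        self.alpha = np.full(n, 1.0 / n)  # mix copies into the layer input
        self.beta = np.ones(n)            # scatter the layer output to copies
        self.A = np.eye(n)                # learnable copy-to-copy connections

    def forward(self, xs, layer):
        # xs: (n, d) stack of hidden-state copies
        h = self.alpha @ xs               # depth connection: build layer input
        out = layer(h)                    # apply the wrapped layer
        # width connections: re-mix copies and add the scaled layer output
        return self.A @ xs + np.outer(self.beta, out)
```

With `n = 1`, `alpha = [1]`, `beta = [1]`, and `A = I`, the update collapses to `x + layer(x)`, the ordinary residual connection; larger `n` and learned weights give the network the freedom to reweight and rearrange connections across depth that the abstract describes.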

Defa Zhu, Hongzhi Huang, Zihao Huang, Yutao Zeng, Yunyao Mao, Banggu Wu, Qiyang Min, Xun Zhou • 2024

Related benchmarks

Task | Dataset | Metric | Result | Rank
Commonsense Reasoning | HellaSwag | Accuracy | 74.3 | 1460
Multi-task Language Understanding | MMLU | Accuracy | 63 | 842
Commonsense Reasoning | PIQA | Accuracy | 79.9 | 647
Mathematical Reasoning | GSM8K | EM | 53.2 | 115
Logical Reasoning | BBH | Accuracy | 48.9 | 93
Reading Comprehension | DROP | F1 Score | 51.6 | 55
Language Modeling | Language Modeling Corpus (val) | Average Perplexity | 9.57 | 19
Zero-shot Downstream Reasoning and Knowledge Tasks | Downstream Reasoning Task Suite (ARC-E, ARC-C, HS, OBQA, PIQA, WG, Arith.), zero-shot | ARC-E | 76.3 | 19
