
Resurfacing Paralinguistic Awareness in Large Audio Language Models

About

Large Audio Language Models (LALMs) have extended human–machine interaction to the speech modality, which offers great interactive potential because paralinguistic cues implicitly indicate the user's context. However, under the current content-centred paradigm, LALMs typically neglect these paralinguistic cues and respond solely based on query content. In this work, to resurface paralinguistic awareness in LALMs, we introduce five diverse layer-wise analyses that jointly identify paralinguistic layers and semantic-understanding layers. Based on these insights, we propose a paralinguistic-enhanced fine-tuning (PE-FT) protocol to equip LALMs with paralinguistic-aware capabilities, comprising (1) selective-layer fine-tuning and (2) an auxiliary dual-level classification head. Our experiments demonstrate that the PE-FT protocol efficiently and effectively resurfaces paralinguistic awareness, even surpassing the performance of all-layer fine-tuning.
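The two ingredients of PE-FT can be sketched in PyTorch. This is a minimal illustration, not the paper's implementation: the toy backbone, the layer indices chosen for selective fine-tuning, the pooling, the two label granularities of the "dual-level" head (here, a coarse and a fine paralinguistic class), and the loss weighting are all assumptions.

```python
import torch
import torch.nn as nn

class ToyLALM(nn.Module):
    """Stand-in for an LALM backbone; returns logits and per-layer hidden states."""
    def __init__(self, d_model=32, n_layers=6, vocab=100):
        super().__init__()
        self.layers = nn.ModuleList(
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
            for _ in range(n_layers)
        )
        self.lm_head = nn.Linear(d_model, vocab)

    def forward(self, x):
        hidden = []
        for layer in self.layers:
            x = layer(x)
            hidden.append(x)
        return self.lm_head(x), hidden

class DualLevelHead(nn.Module):
    """Auxiliary dual-level classifier: coarse vs. fine paralinguistic labels
    (the exact granularities are an assumption, not taken from the paper)."""
    def __init__(self, d_model=32, n_coarse=3, n_fine=7):
        super().__init__()
        self.coarse = nn.Linear(d_model, n_coarse)
        self.fine = nn.Linear(d_model, n_fine)

    def forward(self, h):
        pooled = h.mean(dim=1)  # utterance-level pooling over frames
        return self.coarse(pooled), self.fine(pooled)

model, head = ToyLALM(), DualLevelHead()

# Selective-layer fine-tuning: freeze everything, then unfreeze only the
# layers identified as paralinguistic (indices here are purely illustrative).
PARALINGUISTIC_LAYERS = {1, 2}
for p in model.parameters():
    p.requires_grad = False
for i in PARALINGUISTIC_LAYERS:
    for p in model.layers[i].parameters():
        p.requires_grad = True

trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.AdamW(trainable + list(head.parameters()), lr=1e-4)

# One toy training step on random data (batch=2, 10 audio frames, d_model=32).
x = torch.randn(2, 10, 32)
logits, hidden = model(x)
coarse_logits, fine_logits = head(hidden[max(PARALINGUISTIC_LAYERS)])

loss = (
    nn.functional.cross_entropy(logits.flatten(0, 1), torch.randint(100, (20,)))
    + nn.functional.cross_entropy(coarse_logits, torch.randint(3, (2,)))
    + nn.functional.cross_entropy(fine_logits, torch.randint(7, (2,)))
)
loss.backward()
optimizer.step()
```

After `backward()`, only the selected layers and the auxiliary head accumulate gradients; the frozen layers receive none, which is what makes the protocol cheaper than all-layer fine-tuning.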

Hao Yang, Minghan Wang, Tongtong Wu, Lizhen Qu, Ehsan Shareghi, Gholamreza Haffari• 2026

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Paralinguistic-aware response generation | Paralinguistic-aware evaluation set (Age) | PA-score: 0.96 | 9 |
| Paralinguistic-aware response generation | Paralinguistic-aware evaluation set (Gender) | PA-score: 97 | 9 |
| Paralinguistic-aware response generation | Paralinguistic-aware evaluation set (Emotion) | PA-score: 0.625 | 9 |
| General Capability Evaluation | VoiceBench | HS Score: 76.91 | 8 |
