
Correct Normalization Matters: Understanding the Effect of Normalization On Deep Neural Network Models For Click-Through Rate Prediction

About

Normalization has become one of the most fundamental components of many deep neural networks for machine learning tasks, and deep neural networks have also been widely used in the CTR estimation field. Among the deep neural network models proposed for CTR prediction, however, few utilize normalization approaches. Although some works, such as Deep & Cross Network (DCN) and Neural Factorization Machine (NFM), use Batch Normalization in the MLP part of their structures, no work has thoroughly explored the effect of normalization on DNN ranking systems. In this paper, we conduct a systematic study of the effect of widely used normalization schemes by applying various normalization approaches to both the feature embedding and the MLP part of a DNN model. Extensive experiments are conducted on three real-world datasets, and the results demonstrate that the correct normalization significantly enhances a model's performance. We also propose a new and effective normalization approach based on LayerNorm, named variance-only LayerNorm (VO-LN). Building on these observations, we further propose a normalization-enhanced DNN model named NormDNN. As for why normalization works for DNN models in CTR estimation, we find that the variance term of normalization plays the main role and provide an explanation in this work.
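The abstract contrasts standard LayerNorm with the proposed variance-only LayerNorm (VO-LN). A minimal sketch of the idea, assuming VO-LN keeps LayerNorm's variance-based rescaling but drops the mean-centering step (the exact formulation is defined in the paper; the function names here are illustrative):

```python
import numpy as np

def layer_norm(x, gamma, beta, eps=1e-5):
    # Standard LayerNorm: center by the mean, scale by the standard
    # deviation over the last axis, then apply the affine transform.
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return gamma * (x - mean) / np.sqrt(var + eps) + beta

def variance_only_layer_norm(x, gamma, beta, eps=1e-5):
    # Hypothetical sketch of VO-LN: rescale by the standard deviation
    # only, skipping the mean subtraction of standard LayerNorm.
    var = x.var(axis=-1, keepdims=True)
    return gamma * x / np.sqrt(var + eps) + beta

x = np.array([[1.0, 2.0, 3.0, 4.0]])
gamma = np.ones(4)
beta = np.zeros(4)
print(variance_only_layer_norm(x, gamma, beta))
```

Note that, unlike standard LayerNorm, the output of this variant is not zero-mean: only the scale of the activations is normalized, which matches the paper's finding that the variance term is what matters.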

Zhiqiang Wang, Qingyun She, PengTao Zhang, Junlin Zhang• 2020

Related benchmarks

Task                          | Dataset        | Metric | Result | Rank
Click-Through Rate Prediction | Avazu (test)   | AUC    | 0.7869 | 191
Click-Through Rate Prediction | Criteo (test)  | AUC    | 0.8107 | 47
Click-Through Rate Prediction | Malware (test) | AUC    | 0.7402 | 4
