
Low-Rank Tensor Completion via Novel Sparsity-Inducing Regularizers

About

To alleviate the bias generated by the ℓ1-norm in the low-rank tensor completion problem, nonconvex surrogates/regularizers have been suggested to replace the tensor nuclear norm, as both can achieve sparsity. However, the thresholding functions of these nonconvex regularizers may not have closed-form expressions, so inner iterations are needed, which increases the computational load. To address this issue, we devise a framework to generate sparsity-inducing regularizers with closed-form thresholding functions. These regularizers are applied to low-tubal-rank tensor completion, and efficient algorithms based on the alternating direction method of multipliers (ADMM) are developed. Furthermore, the convergence of our methods is analyzed: we prove that the generated sequences are bounded and that any limit point is a stationary point. Experimental results on synthetic and real-world datasets show that the proposed algorithms outperform state-of-the-art methods in terms of restoration performance.
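To make the role of closed-form thresholding concrete, the sketch below shows the standard tensor singular-value thresholding step used in low-tubal-rank methods: an FFT along the third mode, a per-frontal-slice SVD, and a thresholding rule on the singular values. This is a generic illustration with plain soft thresholding, not the paper's regularizers; the framework described above would replace the `np.maximum(s - tau, 0.0)` line with other closed-form rules derived from its regularizers. The function name `t_svt` is our own.

```python
import numpy as np

def t_svt(X, tau):
    """Tensor singular-value thresholding in the t-SVD framework.

    X   : real 3-way array of shape (n1, n2, n3)
    tau : threshold applied to the singular values of each
          frontal slice in the Fourier domain.

    Soft thresholding is used here for illustration; a nonconvex
    regularizer with a closed-form thresholding function would
    swap in a different elementwise rule at the marked line.
    """
    n1, n2, n3 = X.shape
    Xf = np.fft.fft(X, axis=2)            # FFT along the third (tube) mode
    Yf = np.zeros_like(Xf)
    for k in range(n3):
        U, s, Vh = np.linalg.svd(Xf[:, :, k], full_matrices=False)
        s = np.maximum(s - tau, 0.0)      # soft thresholding (placeholder rule)
        Yf[:, :, k] = (U * s) @ Vh        # rebuild the thresholded slice
    return np.real(np.fft.ifft(Yf, axis=2))
```

Because the thresholding rule is evaluated in closed form, this step costs one FFT pair plus one SVD per frontal slice; a regularizer whose proximal operator needs its own inner iterations would multiply that cost, which is the motivation given in the abstract.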

Zhi-Yong Wang, Hing Cheung So, Abdelhak M. Zoubir • 2023

Related benchmarks

Task                        Dataset                        Result        Rank
Low-Rank Tensor Completion  MRI, 0.3% sampling rate        MPSNR 23.88   15
Tensor Completion           Face datasets, 0.1% SR         MPSNR 18.46   15
Low-Rank Tensor Completion  MRI, 0.1% sampling rate        MPSNR 21.1    15
Low-Rank Tensor Completion  MRI, 0.5% sampling rate        MPSNR 24.9    15
Low-Rank Tensor Completion  MRI, 1% sampling rate          MPSNR 27.35   15
Tensor Completion           Face datasets, 0.3% SR         MPSNR 21.42   15
Low-Rank Tensor Completion  MRSIs, SR = 0.5% (test)        MPSNR 18.32   15
Low-Rank Tensor Completion  MRSIs, SR = 1% (test)          MPSNR 19.75   15
Tensor Completion           Face datasets, 0.5% SR         MPSNR 22.75   15
Low-Rank Tensor Completion  MRSIs, SR = 5% (test)          MPSNR 23.75   15

(Showing 10 of 13 rows)
