Robust tensor recovery via a nonconvex approach with ket augmentation and auto-weighted strategy
Wenhui Xie
School of Mathematical Sciences, Soochow University, Suzhou, China
Department of Mathematics, Hangzhou Dianzi University, Hangzhou, China

Chen Ling
Department of Mathematics, Hangzhou Dianzi University, Hangzhou, China

Hongjin He (Corresponding Author)
School of Mathematics and Statistics, Ningbo University, Ningbo, China
Correspondence: Hongjin He, School of Mathematics and Statistics, Ningbo University, Ningbo 315211, China. Email: [email protected]

Lei-Hong Zhang
School of Mathematical Sciences, Soochow University, Suzhou, China
Abstract
In this article, we introduce a nonconvex tensor recovery approach that employs the powerful ket augmentation technique to expand a low-order tensor into a high-order one, thereby exploiting the strength of the tensor train (TT) decomposition, which is tailored to high-order tensors. Moreover, we define a new nonconvex surrogate function to approximate the tensor rank, together with an auto-weighted mechanism that adaptively adjusts the weights assigned to the TT ranks of the augmented tensor. To make our approach robust, we add two mode-unfolding regularization terms that exploit the spatio-temporal continuity and self-similarity of the underlying tensors. We also propose an easily implementable algorithm for the resulting optimization model, in which every subproblem admits a closed-form solution. A series of numerical experiments demonstrates that our approach performs well in recovering color images and videos.
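To make the ket augmentation (KA) step concrete, below is a minimal NumPy sketch of how a 2^n × 2^n × 3 color image can be expanded into an order-(n+1) tensor with n modes of size 4 plus one color mode: each spatial index is split into binary digits, and the row and column digits at the same scale are merged into a single size-4 block index. The function name and the coarse-to-fine digit ordering are illustrative assumptions, not necessarily the exact convention used in this article.

```python
import numpy as np

def ket_augmentation(img: np.ndarray, n: int) -> np.ndarray:
    """Sketch of ket augmentation: (2**n, 2**n, 3) image -> 4 x ... x 4 x 3 tensor.

    Each of the n size-4 modes addresses one 2x2 spatial block at a given
    scale, so a TT decomposition of the result can exploit multi-scale
    structure in the image.
    """
    assert img.shape == (2**n, 2**n, 3), "expects a 2**n x 2**n color image"
    # Split the row and column indices into n binary digits each
    # (C order: the first digit of each index is the most significant).
    t = img.reshape((2,) * n + (2,) * n + (3,))
    # Interleave row digit k with column digit k, keeping color last,
    # so each (row digit, column digit) pair addresses one 2x2 block.
    order = [d for k in range(n) for d in (k, n + k)] + [2 * n]
    t = t.transpose(order)
    # Merge every (row digit, column digit) pair into a single size-4 mode.
    return t.reshape((4,) * n + (3,))

# Example: a 256 x 256 color image becomes an order-9 tensor of shape
# (4, 4, 4, 4, 4, 4, 4, 4, 3), ready for a TT-based recovery model.
image = np.random.rand(256, 256, 3)
tensor = ket_augmentation(image, 8)
assert tensor.shape == (4,) * 8 + (3,)
```

Reversing the digit order in the interleaving step yields the fine-to-coarse convention adopted in some KA implementations; either way, the map is a lossless reshaping, so the original image can be recovered by inverting the same permutation.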
CONFLICT OF INTEREST STATEMENT
The authors have no conflicts of interest to disclose.
DATA AVAILABILITY STATEMENT
Data sharing is not applicable to this article as no new data were created or analyzed in this study.