
Limited Memory BFGS Method for Least Squares Semidefinite Programming with Banded Structure

XUE Wenjuan1, SHEN Chungen2, YU Zhensheng2   

  1. School of Mathematics and Physics, Shanghai University of Electric Power, Shanghai 200090, China;
  2. University of Shanghai for Science and Technology, Shanghai 200093, China
  • Received: 2020-02-18  Revised: 2021-02-10  Online: 2022-08-25  Published: 2022-08-02
  • Supported by:
    This research was supported by the National Natural Science Foundation of China under Grant No. 11601318.

XUE Wenjuan, SHEN Chungen, YU Zhensheng. Limited Memory BFGS Method for Least Squares Semidefinite Programming with Banded Structure[J]. Journal of Systems Science and Complexity, 2022, 35(4): 1500-1519.

This work addresses the least squares semidefinite program with a banded structure. A limited memory BFGS method is presented to solve this structured program in high dimensions. In the algorithm, inverse power iteration and orthogonal iteration are employed to compute a few required eigenvectors, avoiding a full eigendecomposition of n×n matrices. A key feature of the algorithm is that it is proved to be globally convergent under inexact gradient information. Preliminary numerical results indicate that the proposed algorithm is comparable with the inexact smoothing Newton method on some large instances of the structured problem.
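The computational idea highlighted in the abstract, replacing a full n×n eigendecomposition with an iterative method that recovers only the few eigenvectors actually needed, can be illustrated with a minimal sketch of orthogonal iteration. This is not the authors' code; the matrix construction and all names below are illustrative.

```python
import numpy as np

def orthogonal_iteration(A, k, iters=200, seed=0):
    """Approximate the k eigenpairs of symmetric A with largest |eigenvalue|.

    Repeatedly multiplies a random n x k block by A and re-orthogonalizes
    via QR; the columns of Q converge to the dominant invariant subspace.
    """
    rng = np.random.default_rng(seed)
    Q, _ = np.linalg.qr(rng.standard_normal((A.shape[0], k)))
    for _ in range(iters):
        Q, _ = np.linalg.qr(A @ Q)  # power step + re-orthogonalization
    lam = np.diag(Q.T @ A @ Q)      # Rayleigh-quotient eigenvalue estimates
    return lam, Q

# Test matrix with a known dominant spectrum {10, 5, 2} and small remainder,
# so the dominant subspace is well separated and the iteration converges fast.
n, k = 200, 3
rng = np.random.default_rng(1)
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
d = np.concatenate([[10.0, 5.0, 2.0], rng.uniform(-0.5, 0.5, n - 3)])
A = (U * d) @ U.T

lam, Q = orthogonal_iteration(A, k)
```

Only O(nk) storage and matrix-vector products with A are needed per step, which is the point of using such iterations inside a large-scale method: when A is banded, each product costs O(n) rather than O(n²).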