Nonmonotone Spectral Gradient Method for l_1-regularized Least Squares

  • Wanyou Cheng, Dongguan University of Technology
  • Qingjie Hu
Keywords: l_1 minimization, compressive sensing, projection gradient

Abstract

In this paper, we investigate a linearly constrained optimization reformulation of a more general form of the l_1 regularization problem and establish several useful properties of it. We first show the equivalence between the linearly constrained problem and the l_1 regularization problem. Second, since the constraints are linear, a KKT point of the linearly constrained problem always exists; we show that half of the constraints must be active at any KKT point. In addition, we show that the KKT points of the linearly constrained problem coincide with the stationary points of the l_1 regularization problem. Based on the linearly constrained reformulation, we propose a nonmonotone spectral gradient method and establish its global convergence. Numerical experiments on compressive sensing problems show that our approach is competitive with several known methods for the standard l_2-l_1 problem.
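
The flavor of the reformulation can be illustrated on the standard l_2-l_1 problem through the usual splitting of x into its positive and negative parts, the device used, for example, in the gradient projection approach of Figueiredo, Nowak and Wright cited below; the sketch assumes this standard splitting and may differ in detail from the more general reformulation analyzed in the paper.

```latex
% Illustrative sketch: splitting the l_2-l_1 problem into a bound-constrained
% quadratic program (the paper treats a more general form).
\[
  \min_{x\in\mathbb{R}^n}\ \tfrac12\|Ax-b\|_2^2+\mu\|x\|_1
  \quad\Longleftrightarrow\quad
  \min_{u\ge 0,\; v\ge 0}\ \tfrac12\|A(u-v)-b\|_2^2+\mu\,\mathbf{1}^{\top}(u+v),
  \qquad x=u-v.
\]
```

For μ > 0, any KKT (hence optimal) point of the split problem satisfies u_i v_i = 0 for every i, since otherwise decreasing u_i and v_i by the same amount lowers the objective; so at least n of the 2n nonnegativity constraints are active, which is the kind of structure the abstract refers to.

For the algorithmic ingredient, the following Python sketch shows a generic nonmonotone spectral projected gradient iteration in the spirit of Birgin, Martínez and Raydan, combined with the Grippo-Lampariello-Lucidi nonmonotone line search, applied to the split problem above. It is a minimal illustration under these assumptions, not the authors' exact method; the function name spg_l1_ls and all parameter defaults are hypothetical.

```python
# A minimal sketch of a nonmonotone spectral projected gradient (SPG) iteration
# applied to the split problem  min 0.5*||A(u-v)-b||^2 + mu*1'(u+v),  u, v >= 0.
# Generic SPG/BB machinery with a GLL nonmonotone line search; NOT the authors'
# exact algorithm.
import numpy as np

def spg_l1_ls(A, b, mu, max_iter=500, M=10, tol=1e-6,
              alpha_min=1e-10, alpha_max=1e10, gamma=1e-4):
    n = A.shape[1]
    z = np.zeros(2 * n)                          # z = [u; v] >= 0

    def f_and_grad(z):
        u, v = z[:n], z[n:]
        r = A @ (u - v) - b                      # residual A x - b with x = u - v
        f = 0.5 * r @ r + mu * z.sum()
        g_x = A.T @ r                            # gradient with respect to x
        return f, np.concatenate([g_x + mu, -g_x + mu])

    f, g = f_and_grad(z)
    f_hist = [f]
    alpha = 1.0                                  # spectral (Barzilai-Borwein) step
    for _ in range(max_iter):
        d = np.maximum(z - alpha * g, 0.0) - z   # projected gradient direction
        if np.linalg.norm(d, np.inf) < tol:
            break
        f_ref = max(f_hist[-M:])                 # nonmonotone reference value (GLL)
        lam = 1.0
        for _ in range(30):                      # nonmonotone Armijo backtracking
            z_new = z + lam * d                  # stays feasible: convex combination
            f_new, g_new = f_and_grad(z_new)
            if f_new <= f_ref + gamma * lam * (g @ d):
                break
            lam *= 0.5
        s, y = z_new - z, g_new - g
        sy = s @ y
        alpha = np.clip(s @ s / sy, alpha_min, alpha_max) if sy > 0 else alpha_max
        z, f, g = z_new, f_new, g_new
        f_hist.append(f)
    u, v = z[:n], z[n:]
    return u - v                                 # recover x

# Example use on a small random compressive-sensing instance:
# rng = np.random.default_rng(0)
# A = rng.standard_normal((64, 256))
# x_true = np.zeros(256); x_true[rng.choice(256, 8, replace=False)] = 1.0
# x_hat = spg_l1_ls(A, A @ x_true, mu=0.01)
```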

Author Biography

Qingjie Hu
Guilin University of Electronic Technology

References

M. Afonso, J. Bioucas-Dias and M. Figueiredo, Fast image recovery using variable splitting and constrained optimization, IEEE Trans. Image Process., 19 (2010), pp. 2345-2356.

S. Aybat and G. Iyengar, A first-order smoothed penalty method for compressed sensing, SIAM J. Optim., 21 (2011), pp. 287-313.

J. Barzilai and J.M. Borwein, Two-point step size gradient methods, IMA J. Numer. Anal., 8 (1988), pp. 141-148.

A. Beck and M. Teboulle, A fast iterative shrinkage-thresholding algorithm for linear inverse problems, SIAM J. Imaging Sci., 2 (2009), pp. 183-202.

A. Beck and Y. C. Eldar, Sparsity constrained nonlinear optimization: optimality conditions and algorithms, SIAM J. Optim., 23 (2013), pp. 1480-1509.

J.M. Bioucas-Dias and M.A.T. Figueiredo, A new TwIST: Two-step iterative shrinkage/thresholding algorithms for image restoration, IEEE Trans. Image Process., 16 (2007), pp. 2992-3004.

E.G. Birgin, J.M. Martínez and M. Raydan, Nonmonotone spectral projected gradient methods on convex sets, SIAM J. Optim., 10 (2000), pp. 1196-1211.

D. Boley, Local linear convergence of the alternating direction method of multipliers on quadratic or linear programs, SIAM J. Optim., 23 (2013), pp. 2183-2207.

A. M. Bruckstein, D. L. Donoho and M. Elad, From sparse solutions of systems of equations to sparse modeling of signals and images, SIAM Rev., 51 (2009), pp. 34-81.

J.V. Burke and J.J. Moré, On the identification of active constraints, SIAM J. Numer. Anal., 25 (1988), pp. 1197-1211.

E. Candès and J. Romberg, ℓ1-magic: A Collection of MATLAB Routines for Solving the Convex Optimization Programs Central to Compressive Sampling, 2006 [Online]. Available: www.acm.caltech.edu/l1magic/

W.Y. Cheng and Z.X. Chen, Nonmonotone spectral method for large-scale symmetric nonlinear equations, Numer. Alg., 62 (2013), pp. 149-162.

I. Daubechies, M. Defrise and C. De Mol, An iterative thresholding algorithm for linear inverse problems with a sparsity constraint, Comm. Pure Appl. Math., 57 (2004), pp. 1413-1457.

Y.H. Dai and R. Fletcher, Projected Barzilai-Borwein methods for large-scale box-constrained quadratic programming. Numer. Math., 100 (2005), pp. 21-47.

Y.H. Dai, W.W. Hager, K. Schittkowski and H. Zhang, The cyclic Barzilai-Borwein method for unconstrained optimization, IMA J. Numer. Anal., 26 (2006), pp. 604-627.

G. Davis, S. Mallat and M. Avellaneda, Adaptive greedy approximations, Constr. Approx., 13 (1997), pp. 57-98.

D. Donoho, Compressed sensing, IEEE Trans. Inform. Theory, 52 (2006), pp. 1289-1306.

D. Donoho, M. Elad and V. Temlyakov, Stable recovery of sparse overcomplete representations in the presence of noise, IEEE Trans. Inform. Theory, 52 (2006), pp. 6-18.

E.D. Dolan and J.J. Moré, Benchmarking optimization software with performance profiles, Math. Program., 91 (2002), pp. 201-213.

M. Elad, B. Matalon and M. Zibulevsky, Subspace optimization methods for linear least squares with non-quadratic regularization, Appl. Comput. Harmon. Anal., 23 (2007), pp. 346-367.

M.A. T. Figueiredo and R.D. Nowak, An EM algorithm for wavelet-based image restoration, IEEE Trans. Image Process., 12 (2003), pp. 906-916.

M.A. T. Figueiredo, R.D. Nowak and S.J. Wright, Gradient projection for sparse reconstruction: Application to compressed sensing and other inverse problems. IEEE J. Sel. Top. Signal Process., 1 (2007), pp. 586-597.

L. Grippo, F. Lampariello and S. Lucidi, A nonmonotone line search technique for Newton's method, SIAM J. Numer. Anal., 23 (1986), pp. 707-716.

E.T. Hale, W. Yin and Y. Zhang, Fixed-point continuation for ℓ1 minimization: methodology and convergence, SIAM J. Optim., 19 (2008), pp. 1107-1130.

W.W. Hager, D.T. Phan and H. Zhang, Gradient-based methods for sparse recovery. SIAM J. Imaging Sci., 4 (2011), pp. 146-165.

S.J. Kim, K. Koh, M. Lustig, S. Boyd and D. Gorinevsky, An interior-point method for large-scale ℓ1-regularized least squares, IEEE J. Sel. Top. Signal Process., 1 (2007), pp. 606-617.

Y. Nesterov, Gradient methods for minimizing composite objective function, CORE Discussion Paper 2007/76, 2007 [Online]. Available: http://www.optimization-online.org/DB_HTML/2007/09/1784.html

R.T. Rockafellar, Convex Analysis, Princeton University Press, Princeton, NJ, 1970.

S.M. Robinson, Linear convergence of ϵ-subgradient descent methods for a class of convex functions. Math. Program. 86 (1999), pp. 41-50.

M. Saunders, PDCO: Primal-Dual Interior Method for Convex Objectives, 2002 [Online]. Available: http://www.stanford.edu/group/SOL/software/pdco.html

P. Tseng and S.W. Yun, A coordinate gradient descent method for nonsmooth separable minimization. Math. Program., 117 (2009), pp. 387-423.

J. Tropp, Greed is good: Algorithmic results for sparse approximation, IEEE Trans. Inform. Theory, 50 (2004), pp. 2231-2242.

Z.W. Wen, W.T. Yin, D. Goldfarb and Y. Zhang, A fast algorithm for sparse reconstruction based on shrinkage, subspace optimization, and continuation. SIAM J. Sci. Comput., 32 (2010), pp. 1832-1857.

Z.W. Wen, W.T. Yin, H. Zhang and D. Goldfarb, On the convergence of an active-set method for ℓ1 minimization. Optim. Methods Softw., 27 (2012), pp. 1127-1146.

S.J. Wright, R.D. Nowak and M.A.T. Figueiredo, Sparse reconstruction by separable approximation, IEEE Trans. Signal Process., 57 (2009), pp. 2479-2493.

W. Yin, S. Osher, D. Goldfarb and J. Darbon, Bregman iterative algorithms for ℓ1-minimization with applications to compressed sensing, SIAM J. Imaging Sci., 1 (2008), pp. 143-168.

Published
2016-08-30
How to Cite
Cheng, W., & Hu, Q. (2016). Nonmonotone Spectral Gradient Method for l_1-regularized Least Squares. Statistics, Optimization & Information Computing, 4(3), 265-277. https://doi.org/10.19139/soic.v4i3.230
Section
Research Articles