LPNN-based approach for LASSO problem via a sequence of regularized minimizations
Authors: Anis Zeglaoui, Anouar Houmia, Maher Mejai, Radhouane Aloui
Affiliation: 1. MaPSFA-ESST Hammam Sousse, ENISo, University of Sousse, Sousse, Tunisia; 2. Department of Mathematics, College of Science, King Khalid University, Abha, Saudi Arabia; 3. Department of Mechanics, High Institute for Technological Studies, ISET, Beja, Tunisia
Abstract: In compressive sampling theory, the least absolute shrinkage and selection operator (LASSO) is a representative problem. Nevertheless, its non-differentiable constraint impedes the direct use of Lagrange programming neural networks (LPNNs). In this article we present a novel LPNN-based algorithm that tackles the LASSO minimization, together with its supporting theory. First, we design a sequence of smooth constrained optimization problems by introducing a convenient differentiable approximation of the non-differentiable ℓ1-norm constraint. Next, we prove that the optimal solutions of the regularized intermediate problems converge to the optimal sparse signal of the LASSO. Then, for every regularized problem in the sequence, the corresponding LPNN dynamic model is derived and the asymptotic stability of its equilibrium state is established. Finally, numerical simulations compare the performance of the proposed LPNN algorithm with both the LASSO-LPNN model and a standard digital method.
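
The smoothing-plus-LPNN idea summarized above can be illustrated numerically. The following is a minimal sketch, not the paper's exact model: it assumes the constrained formulation minimize 0.5·||Ax − b||² subject to ||x||₁ ≤ η, replaces the ℓ1-norm by the differentiable surrogate Σ√(xᵢ² + ε), and integrates the resulting Lagrangian dynamics with a plain Euler scheme. The problem sizes, smoothing parameter ε, constraint level η, and step size are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of an LPNN-style dynamic for a smoothed LASSO problem.
# Assumptions (not from the paper): the constrained form
#   minimize 0.5*||A x - b||^2   subject to   ||x||_1 <= eta,
# the smooth surrogate sum(sqrt(x_i^2 + eps)) for the l1-norm, and plain
# Euler integration of the Lagrangian dynamics.

rng = np.random.default_rng(0)
m, n, k = 40, 100, 5                         # measurements, dimension, sparsity
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
b = A @ x_true

eps = 1e-3                                   # smoothing parameter of the l1 surrogate
eta = np.sum(np.sqrt(x_true**2 + eps))       # constraint level (oracle value, for illustration)
dt = 1e-2                                    # Euler step size
x, lam = np.zeros(n), 0.0                    # variable neurons and Lagrange-multiplier neuron

def smooth_l1(v, eps):
    """Differentiable approximation of ||v||_1."""
    return np.sum(np.sqrt(v**2 + eps))

for _ in range(20000):
    grad_f = A.T @ (A @ x - b)               # gradient of the data-fit term
    grad_h = x / np.sqrt(x**2 + eps)         # gradient of the smoothed l1 constraint
    h = smooth_l1(x, eps) - eta              # constraint residual
    x = x - dt * (grad_f + lam * grad_h)     # variable neurons: descend the Lagrangian
    lam = lam + dt * h                       # multiplier neuron: ascend the Lagrangian

print("relative recovery error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```

The descent/ascent split mirrors the usual LPNN construction: the variable neurons follow the negative gradient of the Lagrangian while the multiplier neuron follows its positive gradient, and the smooth surrogate is what makes both gradients well defined everywhere.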
Keywords: convergence; LASSO; LPNN; Lyapunov stability; regularized minimization problems; sparse coding