# Regularized gradient-projection methods for finding the minimum-norm solution of the constrained convex minimization problem

Journal of Inequalities and Applications, Jan 2017

Let H be a real Hilbert space and C a nonempty closed convex subset of H. Assume that g is a real-valued convex function whose gradient ∇g is (1/L)-ism (inverse strongly monotone) with L > 0. Let 0 < λ < 2/(L + 2) and 0 < β_n < 1. We prove that the sequence {x_n} generated by the iterative algorithm x_{n+1} = P_C(I − λ(∇g + β_n I))x_n, ∀n ≥ 0, converges strongly to q ∈ U, where q = P_U(0) is the minimum-norm solution of the constrained convex minimization problem, which also solves the variational inequality 〈−q, p − q〉 ≤ 0, ∀p ∈ U. Under suitable conditions, we obtain some strong convergence theorems. As an application, we apply our algorithm to the split feasibility problem in Hilbert spaces. MSC: 58E35, 47H09, 65J15.
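The iteration in the abstract can be sketched numerically. The following is a minimal illustration on a toy problem of my own choosing (not from the paper): minimize g(x) = ½(x₁ + x₂ − 1)² over the box C = [0, 1]², whose solution set U is the segment {x₁ + x₂ = 1} ∩ C and whose minimum-norm solution P_U(0) is (0.5, 0.5). The step size λ and the regularization sequence β_n = 1/√(n+1) are assumptions chosen to satisfy 0 < λ < 2/(L + 2), β_n → 0, and Σβ_n = ∞; the paper's exact conditions on β_n may differ.

```python
import numpy as np

# Regularized gradient-projection iteration from the abstract:
#   x_{n+1} = P_C( x_n - lam * (grad_g(x_n) + beta_n * x_n) )
# Toy problem (illustrative assumption, not the paper's example):
#   minimize g(x) = 0.5 * (x1 + x2 - 1)^2  over  C = [0, 1]^2.

def grad_g(x):
    # Gradient of g; Lipschitz with constant L = 2, so grad_g is (1/2)-ism.
    return (x[0] + x[1] - 1.0) * np.ones(2)

def project_C(x):
    # Metric projection onto the box [0, 1]^2.
    return np.clip(x, 0.0, 1.0)

L = 2.0
lam = 0.4                    # must satisfy 0 < lam < 2/(L + 2) = 0.5
x = np.array([1.0, 0.0])     # start at a corner of C

for n in range(100_000):
    beta_n = 1.0 / np.sqrt(n + 1.0)  # beta_n -> 0, sum beta_n = infinity
    x = project_C(x - lam * (grad_g(x) + beta_n * x))

print(x)  # approaches the minimum-norm solution (0.5, 0.5)
```

Without the vanishing regularization term β_n·x, plain gradient projection would converge to some point of U depending on the starting point; the β_n I term is what selects the minimum-norm element P_U(0).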

This is a preview of a remote PDF: http://www.journalofinequalitiesandapplications.com/content/pdf/s13660-016-1289-4.pdf

Ming Tian, Hui-Fang Zhang. Regularized gradient-projection methods for finding the minimum-norm solution of the constrained convex minimization problem. Journal of Inequalities and Applications, 2017, Article 13. DOI: 10.1186/s13660-016-1289-4