Abstract
Minimization problems in which the objective function is a sum of separable functions and the constraint is linear have received increasing attention in recent years, and many efficient numerical algorithms have been proposed. While there is a rich body of convergence analysis for the convex case, the convergence of these algorithms in the nonconvex case remains largely open, and research on this case is in its infancy. Most recently, some progress has been made, in which the Kurdyka-Łojasiewicz inequality plays a key role. In this talk, we review the Kurdyka-Łojasiewicz inequality and introduce its applications to the convergence analysis of various types of algorithms. Finally, we report our result on the convergence of the classic alternating direction method of multipliers (ADMM) for nonconvex separable optimization problems. Specifically, we prove that any cluster point of the iterative sequence generated by ADMM is a solution point, provided that the penalty parameter is greater than 2L, where L is the Lipschitz constant of the gradient of one of the involved functions. Under some further conditions on the problem data, we also analyze the rate of convergence of the algorithm.
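To fix ideas, the following is a minimal sketch of the standard two-block separable model and the classic ADMM iteration that results of this type concern; the notation $f$, $g$, $A$, $B$, $b$, $\beta$, and $\lambda$ is our own choice for illustration and is not taken from the talk itself.

\begin{align*}
  &\min_{x,\,y}\ f(x) + g(y) \quad \text{subject to}\quad Ax + By = b,\\
  &x^{k+1} \in \operatorname*{arg\,min}_{x}\ \mathcal{L}_\beta(x, y^k, \lambda^k),\\
  &y^{k+1} \in \operatorname*{arg\,min}_{y}\ \mathcal{L}_\beta(x^{k+1}, y, \lambda^k),\\
  &\lambda^{k+1} = \lambda^k - \beta\,(Ax^{k+1} + By^{k+1} - b),
\end{align*}

where $\mathcal{L}_\beta(x,y,\lambda) = f(x) + g(y) - \langle \lambda,\, Ax + By - b\rangle + \frac{\beta}{2}\,\|Ax + By - b\|^2$ is the augmented Lagrangian, and $\beta > 0$ is the penalty parameter appearing in the condition $\beta > 2L$ above.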