F. J. Aragón Artacho, P. T. Vuong
The Boosted Difference of Convex functions Algorithm (BDCA) was recently proposed for minimizing smooth difference of convex (DC) functions. BDCA accelerates the convergence of the classical Difference of Convex functions Algorithm (DCA) thanks to an additional line search step. The scheme can be generalized and successfully applied to functions that can be expressed as the difference of a smooth function and a possibly nonsmooth one. Furthermore, there is complete freedom in the choice of the trial step size for the line search, which can improve the performance of the method. We show that any limit point of the sequence generated by BDCA is a critical point and that the objective values decrease monotonically and converge. Global convergence and the convergence rate of the iterates are obtained under the Kurdyka-Łojasiewicz property. Our numerical experiments on the Minimum Sum-of-Squares Clustering and the Multidimensional Scaling problems clearly demonstrate that BDCA outperforms DCA.
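The abstract describes BDCA as a DCA step followed by a line search along the direction it produces. A minimal sketch of that idea on a toy smooth DC problem is given below; the decomposition, parameter names (`lam_bar`, `alpha`, `beta`), and the backtracking rule are illustrative assumptions, not taken from the paper itself.

```python
import numpy as np

def bdca(g_grad_solve, h_grad, phi, x0, lam_bar=2.0, alpha=0.1, beta=0.5,
         tol=1e-8, max_iter=200):
    """Sketch of BDCA for phi(x) = g(x) - h(x), g and h smooth convex.

    Each iteration does a classical DCA step, then boosts it with a
    backtracking line search along d_k = y_k - x_k (an assumed rule:
    accept lam when phi(y + lam*d) <= phi(y) - alpha*lam^2*||d||^2).
    """
    x = x0
    for _ in range(max_iter):
        # DCA step: solve grad g(y) = grad h(x) for y.
        y = g_grad_solve(h_grad(x))
        d = y - x
        if np.linalg.norm(d) < tol:
            break
        lam = lam_bar  # the trial step size can be chosen freely
        while lam > 1e-12 and phi(y + lam * d) > phi(y) - alpha * lam**2 * (d @ d):
            lam *= beta  # backtrack
        x = y + lam * d
    return x

# Toy DC decomposition: g(x) = 0.5 x^T A x, h(x) = 0.5 ||x||^2 with
# A = diag(3, 2), so phi(x) = x1^2 + 0.5 x2^2 has its minimizer at 0.
A = np.diag([3.0, 2.0])
x_star = bdca(g_grad_solve=lambda v: np.linalg.solve(A, v),
              h_grad=lambda x: x,
              phi=lambda x: 0.5 * x @ A @ x - 0.5 * x @ x,
              x0=np.array([4.0, -3.0]))
```

The DCA step alone would set `y = A^{-1} x`; the boost moves further along `y - x` whenever the sufficient-decrease test allows, which is the source of the acceleration reported in the abstract.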
Keywords: Difference of convex functions, Boosted Difference of Convex functions Algorithm, Kurdyka-Łojasiewicz property, Clustering problem, Multidimensional Scaling problem
Scheduled
GT11-2 MA-2 Continuous Optimization. Tribute to Marco Antonio López
September 5, 2019 4:05 PM
I2L7. Georgina Blanes building