F. J. Aragón Artacho, P. T. Vuong
The Boosted Difference of Convex functions Algorithm (BDCA) was recently proposed for minimizing smooth difference of convex (DC) functions. BDCA accelerates the convergence of the classical Difference of Convex functions Algorithm (DCA) thanks to an additional line search step. The scheme can be generalized and successfully applied to functions that can be expressed as the difference of a smooth function and a possibly nonsmooth one. Furthermore, there is complete freedom in the choice of the trial step size for the line search, which can improve its performance. We show that any limit point of the sequence generated by BDCA is a critical point and that the objective values are monotonically decreasing and convergent. Global convergence of the iterates and their convergence rate are obtained under the Kurdyka-Łojasiewicz property. Our numerical experiments on the Minimum Sum-of-Squares Clustering and Multidimensional Scaling problems clearly demonstrate that BDCA outperforms DCA.
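Since the abstract does not include an implementation, the following is a minimal Python sketch of a BDCA-style iteration (DCA step followed by a backtracking line search) under illustrative assumptions: the toy DC decomposition g(x) = ||x||^2 + sum_i log(1 + e^{x_i}) and h(x) = 0.5*||x - a||^2, the convex subproblem solved numerically with scipy, and the parameters lam_bar, alpha, beta are hypothetical choices, not taken from the talk.

import numpy as np
from scipy.optimize import minimize

# Illustrative smooth DC decomposition phi = g - h (not from the abstract)
a = np.array([1.0, -2.0, 0.5])

def g(x):                       # smooth, strongly convex part
    return x @ x + np.sum(np.logaddexp(0.0, x))

def grad_h(x):                  # gradient of h(x) = 0.5*||x - a||^2
    return x - a

def phi(x):                     # objective phi = g - h
    return g(x) - 0.5 * (x - a) @ (x - a)

def bdca(x, lam_bar=2.0, alpha=0.1, beta=0.5, tol=1e-8, max_iter=200):
    for _ in range(max_iter):
        # DCA step: minimize the convex model g(.) - <grad_h(x), .>
        gx = grad_h(x)
        y = minimize(lambda z: g(z) - gx @ z, x, method="BFGS").x
        d = y - x                             # descent direction for phi at y
        if np.linalg.norm(d) < tol:
            return y
        # Boosting step: backtrack from the trial step lam_bar until
        # phi(y + lam*d) <= phi(y) - alpha * lam^2 * ||d||^2
        lam = lam_bar
        while phi(y + lam * d) > phi(y) - alpha * lam**2 * (d @ d):
            lam *= beta
            if lam < 1e-12:                   # fall back to the plain DCA point
                lam = 0.0
                break
        x = y + lam * d
    return x

print(bdca(np.zeros(3)))

In self-adaptive variants, lam_bar can be enlarged whenever the previous trial step was accepted, exploiting the freedom in the choice of the trial step size mentioned in the abstract.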
Keywords: Difference of convex functions, Boosted Difference of Convex functions Algorithm, Kurdyka-Łojasiewicz property, Clustering problem, Multidimensional Scaling problem
Scheduled
GT11-2 MA-2 Continuous Optimization. Tribute to Marco Antonio López
September 5, 2019, 16:05
I2L7. Georgina Blanes Building