F. J. Aragón Artacho, P. T. Vuong

The Boosted Difference of Convex functions Algorithm (BDCA) was recently proposed for minimizing smooth difference of convex (DC) functions. BDCA accelerates the convergence of the classical Difference of Convex functions Algorithm (DCA) thanks to an additional line search step. The scheme can be generalized and successfully applied to functions that can be expressed as the difference of a smooth function and a possibly nonsmooth one. Furthermore, there is complete freedom in the choice of the trial step size for the line search, which can further improve performance. We show that any limit point of the sequence generated by BDCA is a critical point and that the objective values are monotonically decreasing and convergent. Global convergence and the convergence rate of the iterates are obtained under the Kurdyka-Łojasiewicz property. Our numerical experiments on the Minimum Sum-of-Squares Clustering and Multidimensional Scaling problems clearly demonstrate that BDCA outperforms DCA.

Keywords: Difference of convex functions, Boosted Difference of Convex functions Algorithm, Kurdyka-Łojasiewicz property, Clustering problem, Multidimensional Scaling problem
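
The abstract describes BDCA as DCA plus a line search along the direction joining consecutive iterates. The following Python sketch illustrates this idea under simplifying assumptions that are not taken from the talk: the smooth convex part is fixed to g(x) = 0.5*||x||^2 so the DCA subproblem has a closed form, and the line search is Armijo-style backtracking with illustrative parameters lam_bar, alpha, beta. It is a minimal sketch of the scheme, not the authors' implementation.

import numpy as np

def bdca(grad_h, phi, x0, lam_bar=1.0, alpha=0.5, beta=0.5,
         max_iter=500, tol=1e-8):
    """Illustrative BDCA for phi(x) = g(x) - h(x) with g(x) = 0.5*||x||^2.

    With this choice of g, the convex DCA subproblem
        y_k = argmin_x g(x) - <grad_h(x_k), x>
    has the closed form y_k = grad_h(x_k).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        y = grad_h(x)                # DCA step (closed form for this g)
        d = y - x                    # direction used by the line search
        if np.linalg.norm(d) < tol:
            break                    # x is (approximately) critical
        # Backtracking from the trial step lam_bar; the freedom in choosing
        # this trial step is the "complete freedom" noted in the abstract.
        lam = lam_bar
        while lam > tol and phi(y + lam * d) > phi(y) - alpha * lam**2 * (d @ d):
            lam *= beta
        x = y + lam * d if lam > tol else y   # fall back to the plain DCA step
    return x

# Toy DC example (hypothetical): phi(x) = 0.5*||x||^2 - sqrt(1 + ||x||^2),
# with convex h(x) = sqrt(1 + ||x||^2); the unique minimizer is x = 0.
phi = lambda x: 0.5 * (x @ x) - np.sqrt(1.0 + x @ x)
grad_h = lambda x: x / np.sqrt(1.0 + x @ x)
print(bdca(grad_h, phi, x0=np.array([3.0, -2.0])))  # -> approximately [0, 0]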

Scheduled

GT11-2 MA-2 Continuous Optimization. Tribute to Marco Antonio López
September 5, 2019  4:05 PM
I2L7. Georgina Blanes building

