Optimal Regression Trees
R. Blanquero, E. Carrizosa, C. Molero-Río, D. Romero Morales
Classic regression trees are defined by a set of orthogonal cuts, i.e., branching rules of the form "variable X is not lower than threshold c". Oblique cuts, involving at least two variables, have also been proposed, generally leading to more accurate and smaller trees. The variables and thresholds are usually selected by a greedy procedure, which keeps the computational cost low but may lead to myopic decisions. The latest advances in optimization techniques have motivated further research on procedures to build optimal regression trees. In this talk, we propose a continuous optimization approach to this problem. This is achieved by means of a cumulative distribution function that indicates the path to be followed inside the tree, yielding a randomized implementation of the cuts. In contrast to classic regression trees, our formulation is flexible: sparsity requirements or performance guarantees on a subsample can easily be included.
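The abstract does not give the exact formulation, but the idea of replacing a hard cut with a cumulative distribution function that governs the branching probability can be sketched as follows. This is a minimal illustration, not the authors' method: the depth-2 tree, the logistic CDF, and all parameter names (`a1`, `mu1`, `leaf_values`, etc.) are assumptions made for the example.

```python
import numpy as np

def logistic_cdf(z):
    # A smooth CDF that turns a hard cut a.x >= mu into a
    # probability of branching left, making the tree differentiable.
    return 1.0 / (1.0 + np.exp(-z))

def soft_tree_predict(X, a1, mu1, a2, mu2, a3, mu3, leaf_values):
    """Depth-2 soft regression tree with oblique cuts a.x - mu.

    Each sample follows every root-to-leaf path with a probability
    given by the product of the CDF values along the path; the
    prediction is the probability-weighted mean of the leaf values.
    """
    p1 = logistic_cdf(X @ a1 - mu1)   # P(go left at the root)
    p2 = logistic_cdf(X @ a2 - mu2)   # P(go left at the left child)
    p3 = logistic_cdf(X @ a3 - mu3)   # P(go left at the right child)
    # Path probabilities of the four leaves: LL, LR, RL, RR.
    paths = np.stack(
        [p1 * p2, p1 * (1 - p2), (1 - p1) * p3, (1 - p1) * (1 - p3)],
        axis=1,
    )
    return paths @ leaf_values
```

Because every quantity is a smooth function of the cut parameters, the whole tree can be fitted with continuous (nonlinear programming) solvers rather than a greedy split search.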
Keywords: Classification and Regression Trees; Optimal Decision Trees; Nonlinear Programming
Scheduled
AM-2 Multivariate Analysis
4 September 2019, 14:45
I3L8. Edificio Georgina Blanes