| Date: |
2021 |
| Abstract: |
In regression, the quality of estimators is known to be very sensitive to the presence of spurious variables and outliers. Unfortunately, this is a frequent situation when dealing with real data. To handle outlier proneness and achieve variable selection, we propose a robust method performing the outright rejection of discordant observations together with the selection of relevant variables. A natural way to define the corresponding optimization problem is to use the ℓ0 norm and recast it as a mixed integer optimization problem. To retrieve the global solution more efficiently, we suggest the use of additional constraints as well as a clever initialization. To this end, an efficient and scalable non-convex proximal alternating algorithm is introduced. An empirical comparison between the ℓ0 norm approach and its ℓ1 relaxation is presented as well. Results on both synthetic and real data sets show that the mixed integer programming approach and its discrete first-order warm start provide high-quality solutions. |
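To make the ℓ0 objective concrete, the following toy sketch (not the paper's mixed integer formulation or its proximal algorithm; the function name, data, and subset sizes are illustrative assumptions) enumerates every choice of at most k predictors and at most m discarded observations, fits least squares on each remaining subset, and keeps the best fit. This brute-force search is only feasible for tiny problems, which is precisely why the paper recasts it as a mixed integer program:

```python
# Toy illustration (hypothetical, brute force): jointly select k predictors
# and reject m observations by minimizing the residual sum of squares over
# all subsets. Feasible only for very small n and p.
import itertools
import numpy as np

def l0_robust_fit(X, y, k, m):
    """Best least-squares fit using exactly k columns of X after
    dropping exactly m observations (exhaustive enumeration)."""
    n, p = X.shape
    best = (np.inf, None, None)  # (RSS, selected variables, kept rows)
    for vars_ in itertools.combinations(range(p), k):
        for out in itertools.combinations(range(n), m):
            keep = [i for i in range(n) if i not in out]
            Xs = X[np.ix_(keep, list(vars_))]
            beta, *_ = np.linalg.lstsq(Xs, y[keep], rcond=None)
            r = y[keep] - Xs @ beta
            rss = float(r @ r)
            if rss < best[0]:
                best = (rss, vars_, keep)
    return best

# Tiny synthetic example: true support {0, 2}, one gross outlier in row 0.
rng = np.random.default_rng(0)
n, p = 20, 4
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, 0.0, -1.5, 0.0])
y = X @ beta_true + 0.1 * rng.normal(size=n)
y[0] += 15.0  # plant the outlier
rss, vars_, keep = l0_robust_fit(X, y, k=2, m=1)
print(vars_, 0 in keep)
```

On this example the enumeration recovers the true support and flags the planted outlier, illustrating the discrete nature of the search space that the mixed integer approach explores exactly.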
| Rights: |
This document is subject to a Creative Commons license. Total or partial reproduction, distribution, and public communication of the work are permitted, provided it is not for commercial purposes and the authorship of the original work is acknowledged. The creation of derivative works is not permitted.
| Language: |
English
| Document: |
Article ; research ; Published version
| Subject: |
Robust optimization ;
Statistical learning ;
Linear regression ;
Variable selection ;
Outlier detection ;
Mixed integer programming |
| Published in: |
SORT : statistics and operations research transactions, Vol. 45 Núm. 1 (January-June 2021) , p. 47-66 (Articles) , ISSN 2013-8830 |