We propose a generalization of the linear panel quantile regression model that accommodates both sparse and dense parts: sparse means that, while the number of available covariates is large, potentially only a much smaller number of them have a nonzero impact on each conditional quantile of the response variable; the dense part is represented by a low-rank matrix that can be approximated by latent factors and their loadings. Such a structure poses problems for traditional sparse estimators, such as $\ell_1$-penalized quantile regression, and for traditional latent factor estimators such as PCA. We propose a new estimation procedure, based on the ADMM algorithm, that combines the quantile loss function with $\ell_1$ and nuclear norm regularization. We show, under general conditions, that our estimator consistently estimates both the nonzero coefficients of the covariates and the latent low-rank matrix. This is done in a challenging setting that allows for temporal dependence, heavy-tailed distributions, and the presence of latent factors. Our proposed model admits a "Characteristics + Latent Factors" quantile asset pricing interpretation: applying our model and estimator to a large-dimensional panel of financial data, we find that (i) characteristics have sparser predictive power once latent factors are controlled for, and (ii) the factors and coefficients at the upper and lower quantiles differ from those at the median.
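The objective described above combines three standard ingredients: the quantile (check) loss, an $\ell_1$ penalty on the covariate coefficients, and a nuclear norm penalty on the low-rank matrix. As a minimal illustrative sketch (not the paper's actual ADMM implementation), the proximal operators of the two penalties, which an ADMM scheme would alternate over, can be written in NumPy; the function names here are my own:

```python
import numpy as np

def check_loss(u, tau):
    """Average quantile (pinball) loss rho_tau(u) = u * (tau - 1{u < 0})."""
    return np.mean(u * (tau - (u < 0)))

def soft_threshold(x, lam):
    """Proximal operator of lam * ||x||_1: elementwise soft-thresholding."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def svt(M, lam):
    """Singular-value thresholding: proximal operator of lam * ||M||_*.

    Soft-thresholds the singular values of M, which shrinks toward a
    low-rank matrix -- the step that recovers the dense (latent factor) part.
    """
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(soft_threshold(s, lam)) @ Vt
```

An ADMM iteration for the full estimator would cycle between a quantile-loss update, a soft-thresholding step for the sparse coefficients, and a singular-value thresholding step for the low-rank component; the sketch above only exhibits those building blocks.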
About Oscar Madrid Padilla
Oscar Madrid Padilla is a tenure-track Assistant Professor in the Department of Statistics at the University of California, Los Angeles. Previously, from July 2017 to June 2019, he was a Neyman Visiting Assistant Professor in the Department of Statistics at the University of California, Berkeley. Before that, he earned a Ph.D. in statistics at The University of Texas at Austin in 2017 under the supervision of Prof. James Scott. His undergraduate degree was a B.S. in Mathematics completed at CIMAT (in Mexico) in 2013. His research interests include high-dimensional statistics, nonparametric statistics, change point detection, causal inference, quantile regression, Bayesian and empirical Bayes methodology, and graphical models.