In this paper, we propose new Bayesian hierarchical representations of lasso, adaptive lasso and elastic net quantile regression models. We develop these representations by observing that the lasso penalty function corresponds to a scale mixture of truncated normal distributions (with exponential mixing densities). We consider fully Bayesian treatments that lead to new Gibbs sampling methods with tractable full conditional posteriors.
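To give a feel for this style of hierarchical Gibbs sampling, the sketch below implements a *standard* Bayesian lasso quantile regression sampler using the well-known asymmetric-Laplace location-scale mixture for the quantile loss and the usual normal scale mixture (exponential mixing) form of the Laplace prior on the coefficients. This is not the truncated-normal mixture representation proposed in the paper; it is an illustrative baseline only. The error scale is fixed at 1 and the lasso parameter `lam` is treated as a fixed hyperparameter, both simplifying assumptions, and the helper names (`gig_rvs`, `bayes_lasso_qr`) are hypothetical.

```
# Minimal sketch: Gibbs sampler for Bayesian lasso quantile regression using
# the asymmetric-Laplace location-scale mixture (Kozumi-Kobayashi style) and
# a normal/exponential scale-mixture (Laplace) prior on beta.  This is NOT the
# truncated-normal mixture representation of the paper; it is an assumed
# illustrative baseline with sigma fixed at 1 and lambda fixed by the user.
import numpy as np
from scipy.stats import geninvgauss


def gig_rvs(lam, a, b, rng):
    """Draw from GIG(lam, a, b) with density proportional to
    x^(lam-1) * exp(-(a*x + b/x)/2)."""
    # scipy's geninvgauss(p, w) has density ~ x^(p-1) exp(-w*(x + 1/x)/2);
    # rescaling by sqrt(b/a) maps it to the two-parameter GIG above.
    return np.sqrt(b / a) * geninvgauss.rvs(lam, np.sqrt(a * b), random_state=rng)


def bayes_lasso_qr(y, X, p=0.5, lam=1.0, n_iter=2000, seed=0):
    rng = np.random.default_rng(seed)
    n, k = X.shape
    theta = (1.0 - 2.0 * p) / (p * (1.0 - p))   # ALD mixture location weight
    tau2 = 2.0 / (p * (1.0 - p))                # ALD mixture scale weight

    beta = np.zeros(k)
    v = np.ones(n)        # latent exponential mixing variables for the ALD errors
    s = np.ones(k)        # latent exponential mixing variances for the lasso prior
    draws = np.empty((n_iter, k))

    for it in range(n_iter):
        # beta | rest: multivariate normal (conjugate, given the latent scales)
        w = 1.0 / (tau2 * v)                          # observation precisions
        prec = X.T @ (X * w[:, None]) + np.diag(1.0 / s)
        cov = np.linalg.inv(prec)
        mean = cov @ (X.T @ ((y - theta * v) * w))
        beta = rng.multivariate_normal(mean, cov)

        # v_i | rest: generalized inverse Gaussian, GIG(1/2, theta^2/tau2 + 2, e_i^2/tau2)
        resid = y - X @ beta
        a_v = theta**2 / tau2 + 2.0
        for i in range(n):
            b_v = max(resid[i] ** 2 / tau2, 1e-10)
            v[i] = gig_rvs(0.5, a_v, b_v, rng)

        # s_j | rest: generalized inverse Gaussian, GIG(1/2, lam^2, beta_j^2)
        for j in range(k):
            b_s = max(beta[j] ** 2, 1e-10)
            s[j] = gig_rvs(0.5, lam**2, b_s, rng)

        draws[it] = beta

    return draws


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 5))
    true_beta = np.array([1.5, 0.0, -2.0, 0.0, 0.5])
    y = X @ true_beta + rng.standard_t(df=3, size=200)
    draws = bayes_lasso_qr(y, X, p=0.5, lam=2.0, n_iter=1000)
    print("posterior means:", draws[500:].mean(axis=0).round(2))
```

Because every full conditional (normal for the coefficients, generalized inverse Gaussian for the latent scales) can be sampled directly, the sampler needs no Metropolis steps; the paper's truncated-normal mixture representations are designed to preserve this same tractability for the lasso, adaptive lasso and elastic net penalties.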