The growing influence of high-dimensional regression modeling has led to many remarkable advances in Bayesian variable selection and shrinkage estimation. Owing to their computational convenience and theoretical relevance, Gaussian scale mixture priors have become standard practice in high-dimensional Bayesian regression settings. The conditional conjugacy of Gaussian scale mixtures enables posterior inference via Gibbs sampling. However, when the number of regression coefficients is very large, the computational cost of Gibbs sampling becomes prohibitively expensive, as each posterior sampling step requires the computation of a large matrix inverse. To address this scalability issue, we propose a scalable Bayesian inference procedure based on a new representation of Gaussian scale mixture distributions. The main merit of the proposed method is that fast posterior sampling is possible via a partially collapsed Gibbs sampling scheme that does not require iterative inverse matrix computation. As an illustration, we present results from simulation studies and a real data analysis.
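To make the computational bottleneck concrete, the sketch below shows the standard conditional update of the regression coefficients under a Gaussian scale mixture prior, which is what a conventional Gibbs sampler repeats at every iteration. This is a minimal illustration of the *standard* sampler (not the proposed partially collapsed scheme); the data sizes, variable names, and prior scales are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: n observations, p coefficients.
n, p = 100, 500
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:5] = 3.0
y = X @ beta_true + rng.standard_normal(n)


def gibbs_beta_step(X, y, lam2, sigma2):
    """One conditional draw of beta under the prior
    beta_j | lam_j ~ N(0, sigma2 * lam_j^2).

    By conditional conjugacy, beta | rest ~ N(A^{-1} X^T y, sigma2 * A^{-1})
    with A = X^T X + diag(1 / lam2): a p x p linear system whose direct
    solution costs O(p^3) per Gibbs iteration -- the scalability bottleneck
    the abstract refers to when p is very large.
    """
    p = X.shape[1]
    A = X.T @ X + np.diag(1.0 / lam2)
    # Cholesky factorization A = L L^T; we solve rather than form A^{-1}
    # explicitly, but the cost is still cubic in p.
    L = np.linalg.cholesky(A)
    mean = np.linalg.solve(A, X.T @ y)
    # If u = L^{-T} z with z ~ N(0, I), then Cov(u) = A^{-1}.
    z = rng.standard_normal(p)
    u = np.linalg.solve(L.T, z)
    return mean + np.sqrt(sigma2) * u


lam2 = np.ones(p)   # local scales (themselves updated in a full Gibbs cycle)
sigma2 = 1.0
beta_draw = gibbs_beta_step(X, y, lam2, sigma2)
print(beta_draw.shape)
```

In a full sampler the local scales `lam2` would also be redrawn each iteration from their mixing distribution; the point here is only that the beta-update alone already requires solving a p x p system every sweep.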