THESIS
2002
viii, 45 leaves : ill. ; 30 cm
Abstract
Covariance selection models are useful in multivariate data analysis. They reduce the number of parameters in the inverse covariance matrix of Gaussian data by setting some of its entries to zero. Decomposable covariance selection models are a special case whose properties allow the probability density of the covariance matrix, the Hyper Inverse Wishart (HIW) distribution, to be factorized. Giudici (1996) uses a Bayesian model and expressions for the marginal likelihood to calculate the posterior probability of decomposable graphs. Giudici and Green (1999) give a Markov chain Monte Carlo (MCMC) approach for decomposable models that generates the covariance matrix. This thesis considers Bayesian models similar to those of Giudici (1996) and Giudici and Green (1999), where the corresponding graphs are decomposable. We implement an efficient sampler by integrating the covariance matrix out of all conditional distributions. We show that the reduced conditional sampler is efficient, in that it converges quickly and the generated estimates of the inverse covariance matrix have low autocorrelation.
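The following is not code from the thesis, only a minimal numpy sketch of the idea behind covariance selection: zero entries in the inverse covariance (precision) matrix of a Gaussian encode conditional independence between the corresponding variables. The example graph and all variable names are illustrative assumptions.

```python
# Illustrative sketch: a covariance selection model zeroes entries of the
# precision matrix Omega = Sigma^{-1}. Here the decomposable chain graph
# X1 -- X2 -- X3 forces Omega[0, 2] = 0, i.e. X1 and X3 are conditionally
# independent given X2.
import numpy as np

# Precision matrix consistent with the chain graph: the (1, 3) entry is zero.
omega = np.array([
    [ 2.0, -0.8,  0.0],
    [-0.8,  2.0, -0.8],
    [ 0.0, -0.8,  2.0],
])

sigma = np.linalg.inv(omega)  # implied covariance matrix (generally dense)

# The partial correlation of X1 and X3 given X2 is determined by Omega[0, 2],
# so it is exactly zero under this model.
partial_corr_13 = -omega[0, 2] / np.sqrt(omega[0, 0] * omega[2, 2])

print("covariance matrix:\n", sigma)
print("partial correlation of X1, X3 given X2:", partial_corr_13)
```

Note that the covariance matrix itself stays dense; the parameter reduction described in the abstract comes entirely from the zero pattern imposed on its inverse.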