MULTIVARIATE BAYESIAN STATISTICS: Models for Source Separation and Signal Unmixing
Daniel B. Rowe
 
Contents
List of Figures
 List of Tables
1 Introduction
 1.1 The Cocktail Party
 1.2 The Source Separation Model
 I Fundamentals
2 Statistical Distributions
 2.1 Scalar Distributions
 2.1.1 Binomial
 2.1.2 Beta
 2.1.3 Normal
 2.1.4 Gamma and Scalar Wishart
 2.1.5 Inverted Gamma and Scalar Inverted Wishart
 2.1.6 Student t
 2.1.7 F-Distribution
 2.2 Vector Distributions
 2.2.1 Multivariate Normal
 2.2.2 Multivariate Student t
 2.3 Matrix Distributions
2.3.1 Matrix Normal
 2.3.2 Wishart
 2.3.3 Inverted Wishart
2.3.4 Matrix T
3 Introductory Bayesian Statistics
 3.1 Discrete Scalar Variables
 3.1.1 Bayes' Rule and Two Simple Events
 3.1.2 Bayes' Rule and the Law of Total Probability
 3.2 Continuous Scalar Variables
 3.3 Continuous Vector Variables
 3.4 Continuous Matrix Variables
4 Prior Distributions
 4.1 Vague Priors
 4.1.1 Scalar Variates
 4.1.2 Vector Variates
4.1.3 Matrix Variates
 4.2 Conjugate Priors
 4.2.1 Scalar Variates
 4.2.2 Vector Variates
4.2.3 Matrix Variates
 4.3 Generalized Priors
 4.3.1 Scalar Variates
 4.3.2 Vector Variates
4.3.3 Matrix Variates
 4.4 Correlation Priors
 4.4.1 Intraclass
 4.4.2 Markov
5 Hyperparameter Assessment
 5.1 Introduction
 5.2 Binomial Likelihood
 5.2.1 Scalar Beta
 5.3 Scalar Normal Likelihood
 5.3.1 Scalar Normal
 5.3.2 Inverted Gamma or Scalar Inverted Wishart
 5.4 Multivariate Normal Likelihood
 5.4.1 Multivariate Normal
 5.4.2 Inverted Wishart
 5.5 Matrix Normal Likelihood
5.5.1 Matrix Normal
 5.5.2 Inverted Wishart
6 Bayesian Estimation Methods
 6.1 Marginal Posterior Mean
6.1.1 Matrix Integration
 6.1.2 Gibbs Sampling
 6.1.3 Gibbs Sampling Convergence
 6.1.4 Normal Variate Generation
 6.1.5 Wishart and Inverted Wishart Variate Generation
 6.1.6 Factorization
 6.1.7 Rejection Sampling
 6.2 Maximum a Posteriori
6.2.1 Matrix Differentiation
 6.2.2 Iterated Conditional Modes (ICM)
 6.3 Advantages of ICM over Gibbs Sampling
 6.4 Advantages of Gibbs Sampling over ICM
II Models
7 Regression
7.1 Introduction
7.2 Normal Samples
7.3 Simple Linear Regression
7.4 Multiple Linear Regression
7.5 Multivariate Linear Regression
8 Bayesian Regression
 8.1 Introduction
 8.2 The Bayesian Regression Model
 8.3 Likelihood
 8.4 Conjugate Priors and Posterior
 8.5 Conjugate Estimation and Inference
 8.5.1 Marginalization
 8.5.2 Maximum a Posteriori
 8.6 Generalized Priors and Posterior
 8.7 Generalized Estimation and Inference
 8.7.1 Marginalization
 8.7.2 Posterior Conditionals
 8.7.3 Gibbs Sampling
 8.7.4 Maximum a Posteriori
 8.8 Interpretation
 8.9 Discussion
9 Bayesian Factor Analysis
 9.1 Introduction
 9.2 The Bayesian Factor Analysis Model
 9.3 Likelihood
 9.4 Conjugate Priors and Posterior
 9.5 Conjugate Estimation and Inference
 9.5.1 Posterior Conditionals
 9.5.2 Gibbs Sampling
 9.5.3 Maximum a Posteriori
 9.6 Generalized Priors and Posterior
 9.7 Generalized Estimation and Inference
 9.7.1 Posterior Conditionals
 9.7.2 Gibbs Sampling
 9.7.3 Maximum a Posteriori
 9.8 Interpretation
 9.9 Discussion
 10 Bayesian Source Separation
 10.1 Introduction
 10.2 Source Separation Model
 10.3 Source Separation Likelihood
 10.4 Conjugate Priors and Posterior
 10.5 Conjugate Estimation and Inference
 10.5.1 Posterior Conditionals
 10.5.2 Gibbs Sampling
 10.5.3 Maximum a Posteriori
 10.6 Generalized Priors and Posterior
 10.7 Generalized Estimation and Inference
 10.7.1 Posterior Conditionals
 10.7.2 Gibbs Sampling
 10.7.3 Maximum a Posteriori
 10.8 Interpretation
 10.9 Discussion
 11 Unobservable and Observable Source Separation
 11.1 Introduction
 11.2 Model
 11.3 Likelihood
 11.4 Conjugate Priors and Posterior
 11.5 Conjugate Estimation and Inference
 11.5.1 Posterior Conditionals
 11.5.2 Gibbs Sampling
11.5.3 Maximum a Posteriori
 11.6 Generalized Priors and Posterior
 11.7 Generalized Estimation and Inference
 11.7.1 Posterior Conditionals
 11.7.2 Gibbs Sampling
11.7.3 Maximum a Posteriori
 11.8 Interpretation
 11.9 Discussion
 12 FMRI Case Study
 12.1 Introduction
 12.2 Model
 12.3 Priors and Posterior
 12.4 Estimation and Inference
 12.5 Simulated FMRI Experiment
 12.6 Real FMRI Experiment
 12.7 FMRI Conclusion
 III Generalizations
13 Delayed Sources and Dynamic Coefficients
 13.1 Introduction
 13.2 Model
 13.3 Delayed Constant Mixing
 13.4 Delayed Nonconstant Mixing
 13.5 Instantaneous Nonconstant Mixing
 13.6 Likelihood
 13.7 Conjugate Priors and Posterior
 13.8 Conjugate Estimation and Inference
 13.8.1 Posterior Conditionals
 13.8.2 Gibbs Sampling
 13.8.3 Maximum a Posteriori
 13.9 Generalized Priors and Posterior
 13.10 Generalized Estimation and Inference
 13.10.1 Posterior Conditionals
 13.10.2 Gibbs Sampling
 13.10.3 Maximum a Posteriori
 13.11 Interpretation
 13.12 Discussion
14 Correlated Observation and Source Vectors
 14.1 Introduction
 14.2 Model
 14.3 Likelihood
 14.4 Conjugate Priors and Posterior
 14.5 Conjugate Estimation and Inference
 14.5.1 Posterior Conditionals
 14.5.2 Gibbs Sampling
 14.5.3 Maximum a Posteriori
 14.6 Generalized Priors and Posterior
 14.7 Generalized Estimation and Inference
 14.7.1 Posterior Conditionals
 14.7.2 Gibbs Sampling
 14.7.3 Maximum a Posteriori
 14.8 Interpretation
 14.9 Discussion
 15 Conclusion
 Appendix A FMRI Activation Determination
 A.1 Regression
 A.2 Gibbs Sampling
 A.3 ICM
 Appendix B FMRI Hyperparameter Assessment
 Bibliography
 