I Temporal ICA Models
  1 Hidden Markov Independent Component Analysis
    1.1 Introduction
    1.2 Hidden Markov Models
    1.3 Independent Component Analysis
      1.3.1 Generalised Exponential Sources
      1.3.2 Generalised Autoregressive Sources
    1.4 Hidden Markov ICA
      1.4.1 Generalised Exponential Sources
      1.4.2 Generalised Autoregressive Sources
    1.5 Practical Issues
      1.5.1 Initialisation
      1.5.2 Learning
      1.5.3 Model Order Selection
    1.6 Results
      1.6.1 Multiple Sinewave Sources
      1.6.2 Same Sources, Different Mixing
      1.6.3 Same Mixing, Different Sources
      1.6.4 EEG Data
    1.7 Conclusion
    1.8 Acknowledgements
    1.9 Appendix
  2 Particle Filters for Non-Stationary ICA
    2.1 Introduction
    2.2 Stationary ICA
    2.3 Non-Stationary Independent Component Analysis
      2.3.1 Source Model
    2.4 Particle Filters
      2.4.1 Source Recovery
    2.5 Illustration of Non-Stationary ICA
    2.6 Smoothing
    2.7 Temporal Correlations
    2.8 Conclusion
      2.8.1 Acknowledgement
    2.9 Appendix: Laplace's Approximation for the Likelihood
II The Validity of the Independence Assumption
  3 The Independence Assumption: Analyzing the Independence of the Components by Topography
    3.1 Introduction
    3.2 Background: Independent Subspace Analysis
    3.3 Topographic ICA Model
      3.3.1 Dependence and Topography
      3.3.2 Defining Topographic ICA
      3.3.3 The Generative Model
      3.3.4 Basic Properties of the Topographic ICA Model
    3.4 Learning Rule
    3.5 Comparison with Other Topographic Mappings
    3.6 Experiments
      3.6.1 Experiments in Feature Extraction of Image Data
      3.6.2 Experiments in Feature Extraction of Audio Data
      3.6.3 Experiments with Magnetoencephalographic Recordings
    3.7 Conclusion
  4 The Independence Assumption: Dependent Component Analysis
    4.1 Introduction
    4.2 Blind Source Separation by DCA
    4.3 The "Cyclone" Algorithm
    4.4 Experimental Results
    4.5 Higher-Order Cyclostationary Signal Separation
    4.6 Conclusion
    4.7 Appendix: Proof of ACF Property 3
III Ensemble Learning and Applications
  5 Ensemble Learning
    5.1 Introduction
    5.2 Posterior Averages in Action
    5.3 Approximations of Posterior PDF
    5.4 Ensemble Learning
      5.4.1 Model Selection in Ensemble Learning
      5.4.2 Connection to Coding
      5.4.3 EM and MAP
    5.5 Construction of Probabilistic Models
      5.5.1 Priors and Hyperpriors
    5.6 Examples
      5.6.1 Fixed Form Q
      5.6.2 Free Form Q
    5.7 Conclusion
    References
  6 Bayesian Non-Linear Independent Component Analysis by Multi-Layer Perceptrons
    6.1 Introduction
    6.2 Choosing Among Competing Explanations
    6.3 Non-Linear Factor Analysis
      6.3.1 Definition of the Model
      6.3.2 Cost Function
      6.3.3 Update Rules
    6.4 Non-Linear Independent Factor Analysis
    6.5 Experiments
      6.5.1 Learning Scheme
      6.5.2 Helix
      6.5.3 Non-Linear Artificial Data
      6.5.4 Process Data
    6.6 Comparison with Existing Methods
      6.6.1 SOM and GTM
      6.6.2 Auto-Associative MLPs
      6.6.3 Generative Learning with MLPs
    6.7 Conclusion
      6.7.1 Validity of the Approximations
      6.7.2 Initial Inversion by Auxiliary MLP
      6.7.3 Future Directions
    6.8 Acknowledgements
  7 Ensemble Learning for Blind Image Separation and Deconvolution
    7.1 Introduction
    7.2 Separation of Images
      7.2.1 Learning the Ensemble
      7.2.2 Learning the Model
      7.2.3 Example
      7.2.4 Parts-Based Image Decomposition
    7.3 Deconvolution of Images
    7.4 Conclusion
    7.5 Acknowledgements
    References
IV Data Analysis and Applications
  8 Multi-Class Independent Component Analysis (MUCICA) for Rank-Deficient Distributions
    8.1 Introduction
    8.2 The Rank-Deficient One Class Problem
      8.2.1 Method I: Three Blocks
      8.2.2 Method II: Two Blocks
      8.2.3 Method III: One Block
    8.3 The Rank-Deficient Multi-Class Problem
    8.4 Simulations
    8.5 Conclusion
    References
  9 Blind Separation of Noisy Image Mixtures
    9.1 Introduction
    9.2 The Likelihood
    9.3 Estimation of Sources for the Case of Known Parameters
    9.4 Joint Estimation of Sources, Mixing Matrix, and Noise Level
    9.5 Simulation Ex