Author: Kurt S. Riedel
DOI:
Keywords:
Abstract (table of contents):
1 Introduction
2 Probabilistic Inference in Signal Processing
2.1
2.2 The likelihood function
2.2.1 Maximum likelihood
2.3 Bayesian data analysis
2.4 Prior probabilities
2.4.1 Flat priors
2.4.2 Smoothness priors
2.4.3 Convenience priors
2.5 Removal of nuisance parameters
2.6 Model selection using evidence
2.6.1 Ockham's razor
2.7 The general linear model
2.8 Interpretations of the general linear model
2.8.1 Features
2.8.2 Orthogonalization
2.9 Example of marginalization
2.9.1 Results
2.10 Example of model selection
2.10.1 Closed form expression for the evidence
2.10.2 Determining the order of a polynomial
2.10.3 Determining the order of an AR process
2.11 Concluding remarks
3 Numerical Bayesian Inference
3.1 The normal approximation
3.1.1 Effect of number on
3.1.2 Taylor
3.1.3 Reparameterization
3.1.4 Jacobian of transformation
3.1.5 Normal approximation to
3.1.6 Marginal density
3.1.7 Delta method
3.2 Optimization
3.2.1 Local algorithms
3.2.2 Global algorithms
3.2.3
3.3 Integration
3.4 Quadrature
3.4.1 Multiple integrals
3.5 Asymptotic approximations
3.5.1 Saddlepoint and Edgeworth series
3.5.2 Laplace approximation
3.5.3 Moments and expectations
3.5.4 Marginalization
3.6 Monte Carlo
3.7 Generation of random variates
3.7.1 Uniform variates
3.7.2 Non-uniform variates
3.7.3 Transformation of variables
3.7.4 Rejection method
3.7.5 Other methods
3.8 Evidence using importance sampling
3.8.1 Choice of sampling density
3.8.2 Orthogonalization using noise colouring
3.9 Marginal densities
3.9.1 Histograms
3.9.2 Jointly distributed variates
3.9.3 Dummy variable method
3.9.4 Marginalization using jointly distributed variates
3.10 Opportunities for variance reduction
3.10.1 Quasi-random sequences
3.10.2 Antithetic variates
3.10.3 Control variates
3.10.4 Stratified sampling
3.11 Summary
4 Markov Chain Methods
4.1
4.2 Background on Markov chains
4.3 The canonical distribution
4.3.1 Energy, temperature and probability
4.3.2 Random walks
4.3.3 Free energy
4.4 The Gibbs sampler
4.4.1 Description
4.4.2 Discussion
4.4.3 Convergence
4.5 The Metropolis-Hastings algorithm
4.5.1
4.5.2
4.5.3 Choosing the proposal density
4.5.4 Relationship between Gibbs and Metropolis
4.6 Dynamical
4.6.1 Derivation
4.6.2 Hamiltonian dynamics
4.6.3 Stochastic transitions
4.6.4 Simulating the dynamics
4.6.5 Hybrid Monte Carlo
4.6.6 Convergence
4.7 Implementation of simulated annealing
4.7.1 Annealing schedules
4.7.2
4.8 Issues
4.8.1 Assessing convergence
4.8.2 Estimates
4.9 Free energy estimation
4.9.1 Thermodynamic integration
4.9.2
4.10
5 Retrospective Changepoint Detection
5.1
5.2 Simple step detector
5.2.1 Derivation
5.2.2 Application
5.3 Detection of changepoints
5.3.1 Piecewise
5.3.2 Simple detector generalized in matrix form
5.3.3 Models
5.3.4 Changepoint
5.4 Recursive
5.4.1 Update of position
5.4.2 given more data
5.5 Detection of multiple changepoints
5.6 Details
5.6.1 Sampling space
5.6.2 Parameter
5.6.3
5.7 Results
5.7.1 Synthetic data
5.7.2 Well log data
5.8 Remarks
6 Restoration of Missing Samples in Digital Audio Signals
6.1
6.2 Formulation
6.2.1 Excitation energy
6.2.2
6.3 The EM algorithm
6.3.1 Expectation
6.3.2 Maximization
6.4
6.4.1
6.4.2 Conditional
6.4.3 Conditional density of missing samples
6.4.4 Autoregressive
6.4.5 Standard deviation
6.5
6.5.1 Estimating
6.5.2 Implementing ML
6.5.3
6.5.4
6.6 Three restoration methods
6.6.1 vs Gibbs
6.6.2 EM
6.6.3 ML
6.7 Simulations
6.7.1 Autoregressive poles near the unit circle
6.7.2 Poles near the origin
6.7.3 Sine wave
6.7.4 Evolution of sample interpolants
6.7.5 Hairy sine wave
6.7.6 Real data: Tuba
6.7.7 Sinead O'Connor
6.8
6.8.1 Interpolant
6.8.2 Data augmentation
6.9
6.9.1 Typical
6.9.2 Computation
6.9.3 Modelling
7 Integration Analysis
7.1 Polynomial
7.1.1
7.1.2 Joint
7.1.3 Approximate
7.1.4
7.1.5 Conclusion
7.2 Decay problem
7.2.1 Lanczos
7.2.2 Biomedical
7.2.3
7.3 General
7.3.1 Impulsive environment
7.3.2 Gaussian
7.4
8
8.1 A review of the work
8.2 Further
A Linear Model
A.1 Integrating out amplitudes
A.1.1 Least squares
A.1.2
A.2
A.3 Coefficient
A.4
A.5
A.6
B from Multivariate Density
C Derivations
C.1 Full
C.2 Student-t
C.3 Remark
D Algorithm
D.1
D.2
E Issues Based Approaches
E.1 Marginalizing
E.2 Approximating
E.3
E.4 Reverse
F Detailed Balance
F.1 balance
F.2 Metropolis-Hastings algorithm
F.3
F.4
References