AP Statistics Curriculum 2007 Bayesian Gibbs
Contents
- 1 Probability and Statistics Ebook - Expectation Maximization Estimation, Gibbs Sampling and Monte Carlo Simulations
- 2 Introduction to numerical methods
- 3 EM algorithm
- 4 Data augmentation by Monte Carlo
- 5 The Gibbs Sampler
- 6 Rejection Sampling
- 7 Metropolis-Hastings Algorithm
- 8 Generalized Linear Model
- 9 See also
- 10 References
Probability and Statistics Ebook - Expectation Maximization Estimation, Gibbs Sampling and Monte Carlo Simulations
Gibbs sampling is an algorithm for generating a sequence of samples from the joint probability distribution of two or more random variables when direct sampling from the joint distribution is difficult. The sampler updates one variable at a time, drawing it from its conditional distribution given the current values of all the other variables; the resulting sequence can be used to approximate the joint distribution or to compute an expected value. Gibbs sampling is a special case of the Metropolis-Hastings algorithm and an example of a Markov chain Monte Carlo (MCMC) algorithm.
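As a concrete illustration (not part of the original SOCR page), below is a minimal Gibbs sampler for a standard bivariate normal distribution with correlation rho, where both full conditionals are themselves univariate normal. The choice of distribution, the parameter values, and the function name are assumptions made only for this sketch.

import numpy as np

def gibbs_bivariate_normal(rho, n_samples=5000, burn_in=500, seed=0):
    # Illustrative Gibbs sampler for a standard bivariate normal with correlation rho.
    # Full conditionals: X | Y=y ~ N(rho*y, 1 - rho^2) and Y | X=x ~ N(rho*x, 1 - rho^2).
    rng = np.random.default_rng(seed)
    x, y = 0.0, 0.0                       # arbitrary starting values
    cond_sd = np.sqrt(1.0 - rho ** 2)     # conditional standard deviation
    draws = np.empty((n_samples, 2))
    for i in range(burn_in + n_samples):
        # Update each coordinate from its conditional given the current value of the other.
        x = rng.normal(rho * y, cond_sd)
        y = rng.normal(rho * x, cond_sd)
        if i >= burn_in:                  # discard the burn-in portion of the chain
            draws[i - burn_in] = (x, y)
    return draws

samples = gibbs_bivariate_normal(rho=0.8)
print("empirical correlation:", np.corrcoef(samples.T)[0, 1])   # should be close to 0.8

The chain is started from an arbitrary point, and the initial burn-in draws are discarded because they reflect the starting values rather than the target distribution.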
Introduction to numerical methods
EM algorithm
Data augmentation by Monte Carlo
The Gibbs Sampler
Rejection Sampling
Metropolis-Hastings Algorithm
Generalized Linear Model
See also
References
- Expectation Maximization and Mixture Modeling Tutorial (December 9, 2008). Statistics Online Computational Resource. Paper EM_MM, http://repositories.cdlib.org/socr/EM_MM.
- SOCR Home page: http://www.socr.ucla.edu
"-----
Translate this page: