## Introduction to an Introduction to Approximate Bayesian Computation (ABC)

Last week I re-blogged a post introducing Approximate Bayesian Computation. I thought some of the content was a little foreign, so I wanted to give an intro to the intro. ABC core concept: say we have a process that is controlled by a parameter, such as the slope in $latex y = m\cdot x+b$, or… Continue reading Introduction to an Introduction to Approximate Bayesian Computation (ABC)
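The core idea teased in that excerpt can be sketched as ABC rejection sampling. This is a minimal sketch, assuming a hypothetical setup where only the slope m is unknown; the data, prior, summary statistic, and tolerance below are all illustrative choices, not taken from the original post:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: data generated by y = m*x + b, where only the slope m
# is unknown (the intercept b and noise level are assumed known for simplicity).
true_m, b, noise_sd = 2.0, 1.0, 0.5
x = np.linspace(0.0, 1.0, 50)
y_obs = true_m * x + b + rng.normal(0.0, noise_sd, size=x.size)

def summary(y):
    # Summary statistic: the least-squares slope fitted to the data.
    return np.polyfit(x, y, 1)[0]

s_obs = summary(y_obs)

# ABC rejection sampling: draw m from the prior, simulate data with that m,
# and keep only draws whose simulated summary lands within eps of the observed one.
eps = 0.2
accepted = []
for _ in range(20000):
    m = rng.uniform(0.0, 5.0)  # flat prior on the slope
    y_sim = m * x + b + rng.normal(0.0, noise_sd, size=x.size)
    if abs(summary(y_sim) - s_obs) < eps:
        accepted.append(m)

# The accepted draws approximate the posterior distribution of m.
posterior_mean = float(np.mean(accepted))
```

The appeal of ABC is that it never evaluates a likelihood: only the ability to simulate from the model is required, at the cost of an approximation controlled by the tolerance eps.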


## Bayesian statistics part 4 – R tutorial

It's been quite a while since I updated this tutorial series; better late than never? Introduction First, we describe the distribution our data comes from using the likelihood, and any prior information using the prior distribution. Then we use Markov chain Monte Carlo to explore the resulting posterior. Then we determine if we need… Continue reading Bayesian statistics part 4 – R tutorial

## Introduction to Bayesian Statistics, Part 3

The two most popular Markov chain Monte Carlo sampling algorithms are Gibbs sampling and Metropolis-Hastings. These algorithms produce Markov chains, in which each value depends only on the previous one. In the context of sampling, we evaluate the probability of the proposed value relative to the probability of the current value,… Continue reading Introduction to Bayesian Statistics, Part 3
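The accept-or-stay mechanics described in that excerpt can be sketched with a minimal random-walk Metropolis sampler. This is a toy sketch: the standard-normal target, proposal width, chain length, and burn-in below are illustrative assumptions, not values from the original tutorial:

```python
import math
import random

random.seed(42)

def target(x):
    # Unnormalized density of a standard normal; Metropolis only needs
    # the target up to a multiplicative constant.
    return math.exp(-0.5 * x * x)

# Random-walk Metropolis: propose a jump from the current value, then accept
# with probability min(1, target(proposed) / target(current)).
samples = []
current = 0.0
for _ in range(50000):
    proposed = current + random.gauss(0.0, 1.0)
    if random.random() < target(proposed) / target(current):
        current = proposed      # accept: the chain moves to the proposal
    samples.append(current)     # on rejection the chain repeats its value

burned = samples[5000:]         # discard an initial burn-in period
mean = sum(burned) / len(burned)
var = sum((s - mean) ** 2 for s in burned) / len(burned)
```

Because each acceptance decision looks only at the current value and the proposal, the resulting sequence is a Markov chain whose long-run distribution matches the target (here, mean near 0 and variance near 1).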

## Introduction to Bayesian statistics, Part 2

As mentioned in Part 1, in Bayesian statistics you summarize a priori knowledge in the prior, and your data in the likelihood. The prior distribution is often chosen based on analytical convenience, while the likelihood is chosen based on the underlying sampling distribution (read about some appropriate distributions here). Multiplying these together produces the posterior distribution. Probability… Continue reading Introduction to Bayesian statistics, Part 2
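The "multiply prior by likelihood to get the posterior" step, and the analytical convenience mentioned above, can be made concrete with a conjugate Beta-Binomial example. The counts below are made up purely for illustration:

```python
# Conjugate Beta-Binomial: with a Beta prior and a binomial likelihood,
# multiplying prior * likelihood yields another Beta distribution, so the
# posterior update reduces to adding counts (this is the "analytical
# convenience" behind conjugate prior choices).
a, b = 2.0, 2.0    # Beta(a, b) prior: pseudo-counts of successes/failures
k, n = 7, 10       # data: k successes in n trials (binomial likelihood)

# Posterior is Beta(a + k, b + (n - k)).
post_a, post_b = a + k, b + (n - k)
post_mean = post_a / (post_a + post_b)  # posterior mean of the success probability
```

Here the prior acts like 2 + 2 imaginary trials added to the 10 real ones, which is why the posterior mean (9/14 ≈ 0.64) sits between the prior mean (0.5) and the raw data proportion (0.7).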