Bootstrapping is a useful technique when you need to make inferences about "unusual" statistics: the difference in the average correlation of a set of variables between two groups, for example. And as a programming language, R (in conjunction with the boot package) makes bootstrapping such unusual quantities easy. Jeromy Anglim explains:

R is very cool for bootstrapping. I’ve mainly used the boot package and found it very good. In fact, it is a classic example of something that R makes easy. It's easy to run loops in R, and R is excellent at taking output from one function and using it as input to another. This is the essence of bootstrapping: taking different samples of your data, getting a statistic for each sample (e.g., the mean, median, correlation, regression coefficient, etc.), and using the variability in the statistic across samples to indicate something about the standard error and confidence intervals for the statistic.
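The process Jeromy describes can be sketched with the boot package itself. This is a hedged illustration, not code from his post: it bootstraps the correlation between two columns of the built-in mtcars data set (an assumed example), where boot() repeatedly calls the statistic function with a vector of resampled row indices.

```r
library(boot)

# Statistic function: boot() passes the data plus a vector of
# resampled row indices for each bootstrap replicate.
cor_stat <- function(data, indices) {
  d <- data[indices, ]    # the bootstrap resample
  cor(d$mpg, d$wt)        # the statistic of interest
}

set.seed(42)              # make the resampling reproducible
results <- boot(data = mtcars, statistic = cor_stat, R = 2000)

# The spread of the replicates estimates the standard error,
# and boot.ci() turns them into confidence intervals.
print(results)
boot.ci(results, type = c("perc", "bca"))
```

The same pattern works for any statistic you can write as a function of the data and an index vector: swap in a median, a regression coefficient, or a difference between groups.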

Jeromy also points to this nice introductory tutorial on bootstrapping in R from Ajay Shah.

Jeromy Anglim's Blog: Bootstrapping and the boot package in R

"Bootstrapping is a useful technique when you need to make inferences about 'unusual' statistics"

Or even "usual" statistics: the mean, median, and so on. It turns out that bootstrapping is more accurate when done with bias-corrected and accelerated (BCa) techniques, and, if you believe the Stanford school, is approximately equivalent to Bayesian inference in many cases.
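The BCa intervals the commenter mentions are available directly from boot.ci(). A minimal sketch, assuming a skewed sample chosen for illustration, comparing a plain percentile interval with a BCa interval for a "usual" statistic, the median:

```r
library(boot)

# Statistic function for the median of a numeric vector.
med_stat <- function(data, indices) median(data[indices])

set.seed(1)
x <- rexp(50)                      # skewed sample data, assumed for illustration
b <- boot(data = x, statistic = med_stat, R = 2000)

# "perc" is the simple percentile interval; "bca" applies the
# bias correction and acceleration adjustments.
boot.ci(b, type = c("perc", "bca"))
```

For skewed statistics like this, the two interval types typically disagree, which is exactly when the BCa adjustment earns its keep.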

Posted by: John Johnson | September 15, 2009 at 12:33