Boots Trapping
1 Introduction
The bootstrap is a resampling technique introduced by Bradley Efron in 1979. It allows statisticians to estimate
the sampling distribution of an estimator by resampling with replacement from the original data. This method is
particularly useful when the theoretical distribution of the estimator is complex or unknown.
where $\bar{\theta}^{*} = \frac{1}{B} \sum_{b=1}^{B} \hat{\theta}^{*b}$ is the mean of the bootstrap replicates.
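As a concrete illustration, here is a minimal sketch of generating bootstrap replicates of a statistic (the sample mean) and computing their average; the sample, seed, and number of resamples are assumptions for illustration, not values from the text:

import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=7.7, size=50)   # hypothetical sample
B = 2000                                  # number of bootstrap samples

# theta_hat^{*b}: the statistic recomputed on each resample (here, the mean)
replicates = np.array([
    rng.choice(x, size=x.size, replace=True).mean() for _ in range(B)
])

theta_bar_star = replicates.mean()        # (1/B) * sum of the replicates
se_boot = replicates.std(ddof=1)          # bootstrap standard error estimate
print(theta_bar_star, se_boot)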
4.2 Percentile Interval
The percentile method uses the empirical quantiles of the bootstrap replicates:
$$\left[\, \hat{\theta}^{*}_{(\alpha/2)},\ \hat{\theta}^{*}_{(1-\alpha/2)} \,\right]$$
where $\hat{\theta}^{*}_{(q)}$ is the $q$-th empirical quantile of the bootstrap replicates.
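For instance, a minimal sketch of the percentile interval in Python; the sample and confidence level are assumptions for illustration:

import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=5.0, scale=2.0, size=40)   # hypothetical sample
B, alpha = 2000, 0.05

replicates = np.array([
    rng.choice(x, size=x.size, replace=True).mean() for _ in range(B)
])

# Percentile interval: empirical alpha/2 and 1 - alpha/2 quantiles of the replicates
lo, hi = np.quantile(replicates, [alpha / 2, 1 - alpha / 2])
print(f"{(1 - alpha):.0%} percentile interval: [{lo:.3f}, {hi:.3f}]")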
5 Asymptotic Properties
Under regularity conditions, bootstrap methods have desirable asymptotic properties:
• Consistency: As the sample size n → ∞, the bootstrap distribution of θ̂∗ converges in probability to the true
sampling distribution of θ̂.
• Second-Order Accuracy: Bootstrap confidence intervals often achieve higher-order accuracy compared to
traditional methods, especially in small samples.
However, the bootstrap relies on the assumption that the sample is representative of the population and that
observations are independent and identically distributed (i.i.d.).
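One informal way to see the consistency claim is to compare the bootstrap standard error of the sample mean with the known theoretical value $\sigma/\sqrt{n}$; a sketch under assumed parameters (all values chosen only for illustration):

import numpy as np

rng = np.random.default_rng(2)
mu, sigma, n, B = 0.0, 1.0, 200, 2000     # assumed parameters
x = rng.normal(mu, sigma, size=n)

replicates = np.array([
    rng.choice(x, size=n, replace=True).mean() for _ in range(B)
])

# The bootstrap SE should be close to the theoretical SE sigma / sqrt(n)
print("bootstrap SE:", replicates.std(ddof=1))
print("theoretical SE:", sigma / np.sqrt(n))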
6.2 Example: Estimating the Mean of a Normal Distribution
Suppose we have a sample $X = \{x_1, x_2, \ldots, x_n\}$ from a normal distribution $N(\mu, \sigma^2)$ with unknown mean $\mu$ and known variance $\sigma^2$.
1. Compute the MLE:
$$\hat{\mu}_{\text{MLE}} = \frac{1}{n} \sum_{i=1}^{n} x_i$$
The remaining steps mirror those in the exponential example below, with $\hat{\mu}_{\text{MLE}}$ in place of $\hat{\lambda}_{\text{MLE}}$.
6.3 Example: Estimating the Rate of an Exponential Distribution
Suppose instead that the sample is drawn from an exponential distribution with unknown rate $\lambda$.
1. Compute the MLE:
$$\hat{\lambda}_{\text{MLE}} = \frac{n}{\sum_{i=1}^{n} x_i}$$
2. Generate Bootstrap Samples:
• Resample n observations with replacement from X to create B bootstrap samples.
3. Compute Bootstrap MLEs:
• For each bootstrap sample $X^{*b}$, compute:
$$\hat{\lambda}^{*b}_{\text{MLE}} = \frac{n}{\sum_{i=1}^{n} x_i^{*b}}$$
where $x_i^{*b}$ are the observations in $X^{*b}$.
4. Construct Confidence Interval:
• Use the bootstrap replicates $\hat{\lambda}^{*b}_{\text{MLE}}$ to construct a confidence interval for $\lambda$ using the desired method (e.g., the percentile method).
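Putting the four steps together, here is a minimal sketch for the exponential-rate MLE; the sample, seed, and settings are hypothetical assumptions, not values from the text:

import numpy as np

rng = np.random.default_rng(3)
x = rng.exponential(scale=7.7, size=30)   # hypothetical exponential sample
n, B, alpha = x.size, 2000, 0.05

# Step 1: MLE of the rate, lambda_hat = n / sum(x_i)
lam_hat = n / x.sum()

# Steps 2-3: resample with replacement and recompute the MLE on each resample
lam_reps = np.array([
    n / rng.choice(x, size=n, replace=True).sum() for _ in range(B)
])

# Step 4: percentile confidence interval for lambda
lo, hi = np.quantile(lam_reps, [alpha / 2, 1 - alpha / 2])
print(f"lambda_hat = {lam_hat:.4f}, 95% CI = [{lo:.4f}, {hi:.4f}]")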
6.4 Remarks
• Non-i.i.d. Data: For MLEs involving non-i.i.d. data or complex models (e.g., time series, clustered data),
adjustments to the bootstrap procedure may be necessary, such as block bootstrapping.
• Computational Considerations: Computing MLEs for each bootstrap sample can be computationally intensive, especially for large datasets or complex models. Efficient algorithms or parallel computing can mitigate this issue.
• Bias Correction: MLEs can be biased, particularly in small samples. Bootstrapping can help assess and correct for this bias, as in the sketch after this list.
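As a sketch of the bias-correction remark: the bootstrap bias estimate is the mean of the replicates minus the original estimate, and subtracting it yields a bias-corrected estimator. The sample and estimator below are illustrative assumptions:

import numpy as np

rng = np.random.default_rng(4)
x = rng.exponential(scale=7.7, size=15)   # small hypothetical sample
n, B = x.size, 2000

lam_hat = n / x.sum()                      # original MLE (biased in small samples)
lam_reps = np.array([
    n / rng.choice(x, size=n, replace=True).sum() for _ in range(B)
])

bias_hat = lam_reps.mean() - lam_hat       # bootstrap estimate of the bias
lam_corrected = lam_hat - bias_hat         # bias-corrected estimate
print(lam_hat, bias_hat, lam_corrected)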
7 Example
Suppose we have a sample of size $n = 10$ from an exponential distribution with $\sum_{i=1}^{10} x_i = 77$. Then
$$\hat{\lambda}_{\text{MLE}} = \frac{n}{\sum_{i=1}^{n} x_i} = \frac{10}{77} \approx 0.1299$$
For each bootstrap sample $X^{*b}$, the corresponding replicate is
$$\hat{\lambda}^{*b}_{\text{MLE}} = \frac{10}{\sum_{i=1}^{10} x_i^{*b}}$$
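To reproduce the point estimate above, any sample of size 10 whose values sum to 77 will do; the particular values below are a hypothetical choice for illustration only:

import numpy as np

rng = np.random.default_rng(5)
x = np.array([3, 5, 2, 11, 8, 14, 6, 9, 12, 7])  # hypothetical sample; sums to 77
n, B = x.size, 2000

lam_hat = n / x.sum()                     # 10 / 77 ≈ 0.1299
lam_reps = np.array([
    n / rng.choice(x, size=n, replace=True).sum() for _ in range(B)
])
print(lam_hat, np.quantile(lam_reps, [0.025, 0.975]))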
References
Efron, B. (1979). Bootstrap methods: Another look at the jackknife. The Annals of Statistics, 7(1), 1-26.