
An Introduction to the Bootstrap Method

1 Introduction
The bootstrap is a resampling technique introduced by Bradley Efron in 1979. It allows statisticians to estimate
the sampling distribution of an estimator by resampling with replacement from the original data. This method is
particularly useful when the theoretical distribution of the estimator is complex or unknown.

2 The Basic Bootstrap Algorithm


Given a dataset X = {x1 , x2 , . . . , xn } drawn independently from an unknown distribution F , and an estimator
θ̂ = s(X) (such as the sample mean or median), the bootstrap procedure involves the following steps:
1. Resampling:
• Generate B bootstrap samples X ∗1 , X ∗2 , . . . , X ∗B .
• Each bootstrap sample X ∗b is obtained by sampling n observations with replacement from the original
data X.
2. Computing Bootstrap Replicates:
• For each bootstrap sample X ∗b , compute the bootstrap replicate θ̂∗b = s(X ∗b ).
3. Estimating the Sampling Distribution:
• Use the distribution of the bootstrap replicates {θ̂∗1 , θ̂∗2 , . . . , θ̂∗B } to approximate the sampling distribu-
tion of θ̂.
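The three steps above can be sketched in Python with NumPy. This is a minimal illustration: the helper name `bootstrap_replicates` and the example data are ours, not part of any standard library.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_replicates(x, stat, B=1000):
    """Return B bootstrap replicates of the statistic `stat`."""
    x = np.asarray(x)
    n = len(x)
    # Each replicate: resample n observations with replacement, re-apply stat.
    return np.array([stat(rng.choice(x, size=n, replace=True))
                     for _ in range(B)])

# Bootstrap distribution of the sample median for a small dataset.
data = [2.1, 3.4, 1.9, 5.0, 4.2, 2.8, 3.7, 4.9]
reps = bootstrap_replicates(data, np.median, B=2000)
```

The array `reps` approximates the sampling distribution of the median and can be fed directly into the standard-error and confidence-interval computations below.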

3 Bootstrap Estimate of Standard Error


One primary use of the bootstrap is to estimate the standard error of an estimator θ̂. The bootstrap estimate of the
standard error is calculated as:
σ̂_boot = sqrt( (1/(B−1)) Σ_{b=1}^{B} ( θ̂∗b − θ̄∗ )² )

where θ̄∗ = (1/B) Σ_{b=1}^{B} θ̂∗b is the mean of the bootstrap replicates.
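A minimal sketch of this computation, assuming NumPy and using the sample mean as the estimator; `ddof=1` supplies the 1/(B−1) factor in the formula:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=50)                     # illustrative data

B = 2000
reps = np.array([rng.choice(x, size=len(x), replace=True).mean()
                 for _ in range(B)])

# Sample standard deviation of the replicates; ddof=1 gives 1/(B-1).
se_boot = reps.std(ddof=1)

# For the sample mean, theory gives s/sqrt(n), so the two should agree.
se_theory = x.std(ddof=1) / np.sqrt(len(x))
```

For the sample mean the bootstrap answer can be checked against the closed form; for estimators without a simple formula (e.g. the median), only `se_boot` is available, which is the point of the method.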

4 Bootstrap Confidence Intervals


Bootstrap methods can construct confidence intervals for the unknown parameter θ using the bootstrap distribution
of θ̂. Here are some common approaches:

4.1 Normal Approximation Interval


Assuming that the bootstrap distribution is approximately normal, the (1 − α) × 100% confidence interval is:
[ θ̂ − z_{1−α/2} σ̂_boot , θ̂ + z_{1−α/2} σ̂_boot ]
where z1−α/2 is the (1 − α/2) quantile of the standard normal distribution.
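A sketch of this interval, assuming NumPy for the resampling and the standard library's `statistics.NormalDist` for the normal quantile (the data are illustrative):

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(2)
x = rng.exponential(scale=2.0, size=100)    # illustrative data

theta_hat = x.mean()                        # estimator: sample mean
reps = np.array([rng.choice(x, size=len(x), replace=True).mean()
                 for _ in range(2000)])
se_boot = reps.std(ddof=1)                  # bootstrap standard error

alpha = 0.05
z = NormalDist().inv_cdf(1 - alpha / 2)     # z_{1-alpha/2}
lo, hi = theta_hat - z * se_boot, theta_hat + z * se_boot
```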

4.2 Percentile Interval
The percentile method uses the empirical quantiles of the bootstrap replicates:
[ θ̂∗_(α/2) , θ̂∗_(1−α/2) ]

where θ̂∗_(q) is the q-quantile (i.e., the 100q-th percentile) of the bootstrap replicates.
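The percentile interval reduces to two empirical quantiles of the replicates; a sketch assuming NumPy, with illustrative data:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(5.0, 2.0, size=80)           # illustrative data

reps = np.array([rng.choice(x, size=len(x), replace=True).mean()
                 for _ in range(2000)])

alpha = 0.05
# The interval is just two empirical quantiles of the replicates.
lo, hi = np.quantile(reps, [alpha / 2, 1 - alpha / 2])
```

Unlike the normal approximation interval, this makes no symmetry assumption, which helps when the bootstrap distribution is skewed.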

5 Asymptotic Properties
Under regularity conditions, bootstrap methods have desirable asymptotic properties:

• Consistency: As the sample size n → ∞, the bootstrap distribution of θ̂∗ converges in probability to the true
sampling distribution of θ̂.

• Second-Order Accuracy: Refined bootstrap intervals (e.g., studentized or bias-corrected intervals) can achieve second-order accuracy, improving on the first-order accuracy of traditional normal-approximation methods, especially in small samples.

However, the bootstrap relies on the assumption that the sample is representative of the population and that
observations are independent and identically distributed (i.i.d.).

6 Bootstrap Confidence Intervals for Maximum Likelihood Estimators


Maximum Likelihood Estimators (MLEs) are widely used for parameter estimation due to their desirable properties
such as consistency and asymptotic normality. However, deriving the exact sampling distribution of an MLE can
be complex, especially in finite samples or for complicated models. Bootstrapping provides a practical approach to
estimate the sampling distribution and construct confidence intervals for MLEs.

6.1 Applying Bootstrap to MLEs


Given a dataset X = {x1 , x2 , . . . , xn } and a parametric model with likelihood function L(θ; X), the MLE of the
parameter θ is obtained by maximizing the likelihood:

θ̂MLE = arg max_θ L(θ; X)

To use bootstrapping for constructing confidence intervals for θ:

1. Compute the MLE θ̂MLE from the original data.


2. Generate Bootstrap Samples:

• Create B bootstrap samples X ∗1 , X ∗2 , . . . , X ∗B by resampling with replacement from X.


3. Compute Bootstrap Replicates:
• For each bootstrap sample X∗b, compute the bootstrap MLE θ̂∗b_MLE by maximizing L(θ; X∗b).
4. Estimate the Sampling Distribution:

• Use the distribution of {θ̂∗1_MLE, θ̂∗2_MLE, . . . , θ̂∗B_MLE} to approximate the sampling distribution of θ̂MLE.
5. Construct Confidence Intervals:
• Use methods such as the percentile or normal approximation method to construct confidence intervals for
θ.
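The five steps can be sketched as follows, using the exponential rate as an illustrative model and a simple grid search over the log-likelihood to stand in for a general-purpose optimizer (the helper `mle_rate` and the data are ours):

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.exponential(scale=2.0, size=60)     # illustrative data, true rate 0.5

def mle_rate(sample):
    """Exponential rate MLE via grid search over the log-likelihood.
    The grid stands in for a numerical optimizer; the closed form
    here is n / sum(sample)."""
    lams = np.linspace(0.01, 5.0, 5000)
    loglik = len(sample) * np.log(lams) - lams * sample.sum()
    return lams[np.argmax(loglik)]

# Step 1: MLE on the original data.
theta_hat = mle_rate(x)

# Steps 2-4: re-maximize the likelihood on each bootstrap sample.
reps = np.array([mle_rate(rng.choice(x, size=len(x), replace=True))
                 for _ in range(500)])

# Step 5: percentile 95% confidence interval.
lo, hi = np.quantile(reps, [0.025, 0.975])
```

In real applications the grid search would be replaced by whatever routine fits the model (Newton's method, EM, etc.); the bootstrap loop is unchanged.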

6.2 Example: Estimating the Mean of a Normal Distribution
Suppose we have a sample X = {x1 , x2 , . . . , xn } from a normal distribution N (µ, σ 2 ) with unknown mean µ and
known variance σ 2 .
1. Compute the MLE:

µ̂MLE = (1/n) Σ_{i=1}^{n} x_i

2. Generate Bootstrap Samples:


• Resample n observations with replacement from X to create B bootstrap samples X ∗1 , X ∗2 , . . . , X ∗B .
3. Compute Bootstrap MLEs:
• For each bootstrap sample X ∗b , compute:
µ̂∗b_MLE = (1/n) Σ_{i=1}^{n} x∗b_i

where x∗b_i are the observations in X∗b.

4. Construct Confidence Interval:


• Use the percentile method to find the (α/2) and (1 − α/2) quantiles of the bootstrap MLEs µ̂∗b_MLE:

CI_boot = [ µ̂∗_(α/2) , µ̂∗_(1−α/2) ]
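A sketch of this example, assuming NumPy and illustrative values for µ and σ; the MLE of the mean is just the sample mean, so each bootstrap MLE is a resampled mean:

```python
import numpy as np

rng = np.random.default_rng(5)
mu_true, sigma = 10.0, 3.0                  # sigma treated as known
x = rng.normal(mu_true, sigma, size=40)

mu_hat = x.mean()                           # MLE of mu

reps = np.array([rng.choice(x, size=len(x), replace=True).mean()
                 for _ in range(2000)])

# Percentile 95% interval from the bootstrap MLEs.
lo, hi = np.quantile(reps, [0.025, 0.975])
```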

6.3 Example: Estimating the Rate Parameter of an Exponential Distribution


Suppose X = {x1 , x2 , . . . , xn } is a sample from an exponential distribution with density f (x; λ) = λe−λx for x ≥ 0,
where λ > 0 is the rate parameter.
1. Compute the MLE:

λ̂MLE = n / Σ_{i=1}^{n} x_i
2. Generate Bootstrap Samples:
• Resample n observations with replacement from X to create B bootstrap samples.
3. Compute Bootstrap MLEs:
• For each bootstrap sample X ∗b , compute:
λ̂∗b_MLE = n / Σ_{i=1}^{n} x∗b_i
4. Construct Confidence Interval:
• Use the bootstrap replicates λ̂∗b_MLE to construct a confidence interval for λ using the desired method (e.g., the percentile method).

6.4 Remarks
• Non-i.i.d. Data: For MLEs involving non-i.i.d. data or complex models (e.g., time series, clustered data),
adjustments to the bootstrap procedure may be necessary, such as block bootstrapping.
• Computational Considerations: Computing MLEs for each bootstrap sample can be computationally in-
tensive, especially for large datasets or complex models. Efficient algorithms or parallel computing can mitigate
this issue.
• Bias Correction: MLEs can be biased, particularly in small samples. Bootstrapping can help assess and
correct for this bias.

7 Example
Suppose we have a sample of size n = 10:

X = {2, 4, 5, 6, 7, 8, 9, 10, 12, 14}


We assume that X comes from an exponential distribution with rate parameter λ. We want to estimate λ using
MLE and construct a 95% confidence interval using the bootstrap method.

1. Compute the MLE:

λ̂MLE = n / Σ_{i=1}^{n} x_i = 10/77 ≈ 0.1299

2. Generate B = 1000 Bootstrap Samples:

• Resample n = 10 observations with replacement from X to create 1000 bootstrap samples.


3. Compute Bootstrap MLEs:
• For each bootstrap sample X ∗b , compute:

λ̂∗b_MLE = 10 / Σ_{i=1}^{10} x∗b_i

4. Construct the 95% Confidence Interval:


• Use the percentile method to find the 2.5% and 97.5% quantiles of the bootstrap MLEs λ̂∗b_MLE.
• The confidence interval is:
CI_boot = [ λ̂∗_(2.5%) , λ̂∗_(97.5%) ]
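This worked example can be reproduced in NumPy; the endpoints of the percentile interval will vary slightly with the random seed:

```python
import numpy as np

rng = np.random.default_rng(6)
x = np.array([2, 4, 5, 6, 7, 8, 9, 10, 12, 14], dtype=float)

lam_hat = len(x) / x.sum()                  # 10 / 77 ≈ 0.1299

B = 1000
reps = np.array([len(x) / rng.choice(x, size=len(x), replace=True).sum()
                 for _ in range(B)])

lo, hi = np.quantile(reps, [0.025, 0.975])  # 95% percentile interval
```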

References
Efron, B. (1979). Bootstrap methods: Another look at the jackknife. The Annals of Statistics, 7(1), 1-26.
