A class of weighted bootstrap techniques, called biased bootstrap or b-bootstrap methods, is introduced. It is motivated by the need to adjust empirical methods, such as the 'uniform' bootstrap, in a surgical way, altering some of their features while leaving others unchanged. Depending on the nature of the adjustment, the b-bootstrap can be used to reduce bias, to reduce variance, or to render some characteristic equal to a predetermined quantity. Examples of the last application include a b-bootstrap approach to hypothesis testing in nonparametric contexts, where the b-bootstrap enables simulation 'under the null hypothesis' even when the hypothesis is false, and a b-bootstrap competitor to Tibshirani's variance stabilization method. An example of the bias reduction application is adjustment of Nadaraya-Watson kernel estimators to make them competitive with local linear smoothing. Other applications include density estimation under constraints, outlier trimming, sensitivity analysis, skewness or kurtosis reduction, and shrinkage.