Observational data analysis is often based on tacit assumptions of ignorability or randomness. The paper develops a general approach to local sensitivity analysis for selectivity bias, which aims to study the sensitivity of inference to small departures from such assumptions. If M is a model assuming ignorability, we surround M by a small neighbourhood Ar defined in the sense of Kullback-Leibler divergence and then compare the inference for models in Ar with that for M. Interpretable bounds for such differences are developed. Applications to missing data and to observational comparisons are discussed. Local approximations to sensitivity analysis are model robust and can be applied to a wide range of statistical problems.
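
As a rough notational sketch of the neighbourhood idea described above (the symbols f_M, epsilon, and T below are assumptions introduced for illustration, and A_epsilon simply stands in for the neighbourhood Ar of the abstract; this is not notation taken from the paper itself), the Kullback-Leibler neighbourhood and the local sensitivity comparison could be written as:

```latex
% Illustrative sketch only: f_M, f, \varepsilon and the functional T are
% assumptions made here for exposition, not notation from the paper.
% f_M is the density under the ignorable model M, f a perturbed
% (possibly non-ignorable) density, and \varepsilon a small radius.
\[
  A_\varepsilon \;=\; \Bigl\{\, f : \mathrm{KL}(f \,\|\, f_M)
      \;=\; \int f \log \frac{f}{f_M} \;\le\; \varepsilon \,\Bigr\}.
\]
% Local sensitivity then compares an inference of interest T (for example
% a maximum likelihood estimate of a parameter) across the neighbourhood:
\[
  \Delta(\varepsilon) \;=\; \sup_{f \in A_\varepsilon}
      \bigl| T(f) - T(f_M) \bigr|,
\]
% with interpretable bounds on such differences developed for small
% \varepsilon, i.e. locally around the ignorable model M.
```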