A best orthogonal basis for a vector is selected from a library to minimize a cost function of the expansion coefficients. How the selection depends on the cost function, and under what conditions it provides the fastest nonlinear approximation, are still open questions that we partially answer in this paper. Squared expansion coefficients may be regarded as a discrete probability density function, or pdf. We apply some inequalities for pdfs to obtain three positive results and two counterexamples. We use the notion of subexponentiality, derived from the classical proof of an entropy inequality, to derive a number of curious inequalities relating different information costs of a single pdf. We then slightly generalize the classical result that one pdf majorizes another if it is cheaper with respect to a large-enough set of information cost functions. Finally, we present inequalities that bracket any information cost of a pdf between two functions of norms of the pdf, plus a counterexample showing that our result has a certain optimality. Another counterexample shows that, unfortunately, the set of norm-type cost functions is not large enough to imply majorization. We conclude that all information cost functions are weakly comparable to norms, but that this is not quite enough to guarantee in general that the cheapest-norm pdf majorizes. (C) 2001 Academic Press.
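As a minimal sketch of the two central notions above (not code from the paper; the function names `coeff_pdf`, `entropy_cost`, and `majorizes` are illustrative), squared expansion coefficients normalized to unit sum form a discrete pdf, an additive information cost such as Shannon entropy can be evaluated on it, and majorization between two pdfs can be checked by comparing partial sums of their decreasingly sorted entries:

```python
import math

def coeff_pdf(coeffs):
    """Normalize squared expansion coefficients into a discrete pdf."""
    energy = sum(c * c for c in coeffs)
    return [c * c / energy for c in coeffs]

def entropy_cost(pdf):
    """Shannon entropy of a discrete pdf; lower cost means the
    coefficient energy is more concentrated."""
    return -sum(p * math.log(p) for p in pdf if p > 0)

def majorizes(p, q):
    """True if pdf p majorizes pdf q: every partial sum of p's
    decreasingly sorted entries dominates the corresponding sum for q."""
    cp, cq = 0.0, 0.0
    for a, b in zip(sorted(p, reverse=True), sorted(q, reverse=True)):
        cp += a
        cq += b
        if cp < cq - 1e-12:
            return False
    return True

# A concentrated coefficient sequence is cheaper than a flat one,
# and its pdf majorizes the flat pdf.
flat = coeff_pdf([1.0, 1.0, 1.0, 1.0])
peaked = coeff_pdf([2.0, 0.1, 0.1, 0.1])
print(entropy_cost(peaked) < entropy_cost(flat))  # True
print(majorizes(peaked, flat))                    # True
```

In this toy setting entropy cost and majorization agree; the paper's counterexamples concern precisely the cases where a family of costs (here, norm-type costs) fails to be rich enough to force such agreement.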