We report on, and give some generalizations of, two recently developed methods of data analysis. The methods are based on entropies or entropy-like quantities called 'generalized mutual information' and 'entropy profile'. Each measures, in its specific way, statistical dependences in a time series or in a pattern. The quantities are invariant with respect to non-linear distortions of the data. All methods are, in principle, applicable to continuous stationary time series or, more generally, to any continuous random vector. In practice, however, discrete processes can also be investigated, provided the random variables attain a multitude of different values. For each method we give straightforward numerical algorithms.
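The text names the quantities but does not spell out an estimator. As a hedged illustration only (not the authors' specific algorithms), the following sketch computes a plain histogram-based plug-in estimate of the mutual information between a time series and its delayed copy, the simplest example of the kind of dependence measure described; the bin count and delay are arbitrary choices for the demonstration.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram-based plug-in estimate of I(X; Y) in nats.

    A generic estimator for illustration; the paper's own
    algorithms are not reproduced here, only the kind of
    quantity they compute.
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                     # joint probabilities
    px = pxy.sum(axis=1, keepdims=True)           # marginal of X
    py = pxy.sum(axis=0, keepdims=True)           # marginal of Y
    nz = pxy > 0                                  # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)

# A strongly dependent series (random walk) versus its delayed copy:
x = np.cumsum(rng.standard_normal(10_000))
i_dep = mutual_information(x[:-1], x[1:])

# Two independent samples for comparison (estimate should be near 0):
i_ind = mutual_information(rng.standard_normal(9_999),
                           rng.standard_normal(9_999))
```

Because the estimate depends on the data only through the bin occupancies, a monotone (and, with fine enough bins, any smooth invertible) distortion of the values changes it very little, which mirrors the invariance property stated above.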