This article introduces a new tool for exploratory data analysis and data mining called Scale-Sensitive Gated Experts (SSGE), which can partition a complex nonlinear regression surface into a set of simpler surfaces (which we call features). The set of simpler surfaces has the property that each of its elements can be efficiently modeled by a single feedforward neural network. The degree to which the regression surface is partitioned is controlled by an external scale parameter. The SSGE consists of a nonlinear gating network and several competing nonlinear experts.
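As a point of reference for the comparison that follows, the standard mixture-of-experts prediction of Jacobs et al. [10] can be written as a gated sum of expert outputs (the symbols used here, $K$ experts, gating outputs $g_j$, and expert outputs $y_j$, are generic notation adopted only for illustration):

\[
\hat{y}(x) \;=\; \sum_{j=1}^{K} g_j(x)\, y_j(x), \qquad g_j(x) \ge 0, \quad \sum_{j=1}^{K} g_j(x) = 1,
\]

where $y_j(x)$ is the output of expert $j$ and the gating outputs $g_j(x)$ are typically produced by a softmax layer of the gating network. In the SSGE, the external scale parameter additionally controls how finely this soft partition divides the input-output space.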
Although the SSGE is similar to the mixture-of-experts model of Jacobs et al. [10], that model gives only a single partitioning of the input-output space, and thus a single set of features, whereas the SSGE allows the user to discover families of features: one obtains a new member of the family for each setting of the scale parameter. In this paper, we derive the Scale-Sensitive Gated Experts model and demonstrate its performance on a time series segmentation problem. The main results are: 1) the scale parameter controls the granularity of the features of the regression surface, 2) similar features are modeled by the same expert while different kinds of features are modeled by different experts, and 3) for the time series problem, the SSGE finds different regimes of behavior, each with a specific and interesting interpretation.