We propose a vector quantisation method which not only provides a compact description of data vectors in terms of codebook vectors, but also explains the codebook vectors themselves as binary combinations of elementary features. This corresponds to the intuitive notion that, in the real world, patterns can usefully be thought of as compositions of simpler features. The method can be understood as a generative model in which each codebook vector is generated by a hidden binary state vector. The model is non-probabilistic in the sense that it assigns each data vector to a single codebook vector. We describe exact and approximate algorithms for learning deterministic feature representations. In contrast to probabilistic models, the deterministic approach allows the use of message-propagation algorithms within the learning scheme; we compare these with standard mean-field/Gibbs-sampling learning. We show that Generative Vector Quantisation performs well on large-scale real-world tasks such as image compression and handwritten-digit analysis with up to 400 data dimensions.
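The core idea above can be illustrated with a minimal sketch. All names and sizes here are hypothetical, and the generation rule is one plausible choice (codebook vectors as linear combinations of feature columns selected by a binary state vector); the paper's actual model and learning algorithms are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: K elementary features in D data dimensions.
D, K = 4, 3
W = rng.normal(size=(D, K))  # columns of W are the elementary feature vectors

# Every binary state vector s in {0,1}^K generates one codebook vector W @ s,
# i.e. a binary combination of the elementary features.
states = np.array([[(i >> k) & 1 for k in range(K)] for i in range(2 ** K)])
codebook = states @ W.T      # shape (2**K, D): one codebook vector per state

# Deterministic assignment: each data vector is mapped to the single
# nearest codebook vector (no posterior over states).
X = rng.normal(size=(10, D))
dists = ((X[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
assignment = dists.argmin(axis=1)  # index of the one codebook vector per point
```

Learning would then adjust the feature matrix `W` so that the resulting codebook minimises the total quantisation error; the sketch only shows the generation and assignment steps.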