We formulate tempo tracking in a Bayesian framework where a tempo tracker is modeled as a stochastic dynamical system. The tempo is modeled as a hidden state variable of the system and is estimated by a Kalman filter. The Kalman filter operates on a Tempogram, a wavelet-like multiscale expansion of a real performance. An important advantage of our approach is that both offline and real-time algorithms can be formulated. Simulation results on a systematically collected set of MIDI piano performances of Yesterday and Michelle by the Beatles show accurate tracking of approximately 90% of the beats.
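
The idea of tracking tempo as a hidden state with a Kalman filter can be illustrated with a minimal sketch. This is not the paper's exact model; the state vector, transition matrix, and all noise parameters below are hypothetical choices for a simple constant-tempo model, where the state holds the current beat time and the beat period, and only beat times are observed.

```python
import numpy as np

# Hypothetical constant-tempo model (illustrative, not the paper's model).
# State x = [beat time, beat period].
A = np.array([[1.0, 1.0],   # next beat time = beat time + period
              [0.0, 1.0]])  # period assumed locally constant
C = np.array([[1.0, 0.0]])  # only the beat time is observed
Q = np.diag([0.01, 0.001])  # process noise covariance (assumed values)
R = np.array([[0.05]])      # observation noise variance (assumed value)

def kalman_step(x, P, y):
    """One predict/update cycle given an observed beat time y."""
    # Predict the next beat under the constant-tempo dynamics
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # Update with the observed beat time
    S = C @ P_pred @ C.T + R              # innovation covariance
    K = P_pred @ C.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ (np.array([y]) - C @ x_pred)
    P_new = (np.eye(2) - K @ C) @ P_pred
    return x_new, P_new

# Beats at roughly a 0.5 s period with slight timing jitter
x = np.array([0.0, 0.5])
P = np.eye(2)
for y in [0.51, 1.02, 1.49, 2.01, 2.50]:
    x, P = kalman_step(x, P, y)

print(round(x[1], 2))  # estimated beat period stays near 0.5 s
```

Because each step uses only the current observation and the previous state, the same recursion supports real-time operation; an offline variant would additionally smooth the state estimates backward over the whole performance.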