The concentrations of air pollutants have, in general, been increasing steadily during the last three decades. There is therefore a growing need to control air pollution: it should be reduced to acceptable levels, and kept below those levels, in order to prevent damage to, and even destruction of, ecosystems. However, reducing air pollution is an expensive process, which means that air pollution should be reduced as much as needed, but no more.

Mathematical models can be used successfully both to study air pollution and to search for optimal ways of reducing it to acceptable levels. The damaging effects are normally caused by the combined action of several air pollutants, so a reliable mathematical model must treat all relevant air pollutants simultaneously. This requirement increases the size of the air pollution models. The discretization of models that contain many air pollutants leads to huge computational problems: systems of several hundred thousand (or even several million) equations arise, and these have to be treated numerically over many time-steps (typically several thousand).
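To make these orders of magnitude concrete, the following minimal sketch shows how the number of equations grows with the grid resolution and the number of pollutants treated simultaneously. The grid dimensions, species count, and number of time-steps used here are purely illustrative assumptions, not the configuration of the model discussed in this paper.

    # Illustrative estimate of the size of a discretized air pollution model.
    # All numbers below are hypothetical and chosen only to show the scale.

    def system_size(nx, ny, n_species):
        """One unknown concentration per species per horizontal grid point."""
        return nx * ny * n_species

    if __name__ == "__main__":
        nx, ny = 96, 96       # hypothetical horizontal grid resolution
        n_species = 35        # hypothetical number of pollutants treated together
        n_steps = 3000        # order of magnitude of time-steps mentioned in the text

        n_eq = system_size(nx, ny, n_species)
        print(f"{n_eq:,} coupled equations per time-step")        # 322,560
        print(f"{n_eq * n_steps:,} equation evaluations per run")  # ~9.7e8

Even this modest hypothetical configuration already yields several hundred thousand coupled equations per time-step; refining the grid or adding a vertical dimension quickly pushes the total into the millions.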
Such large computational problems can be treated successfully only on modern vector and/or parallel computers. However, access to a high-speed computer is not in itself sufficient; one must also ensure that the potential power of the computer is exploited properly in order to obtain truly fast computations. Very often this is a difficult task. The efforts made to solve this task in the case where the computer under consideration is the Connection Machine CM-200 are discussed in this paper.