The University of California, Berkeley, is the primary science analysis center for the Fast Auroral Snapshot Explorer (FAST). It is responsible for processing and routing the science data to the various investigators and serves as an on-line data archive and science command center for the project. Raw data, received from NASA within a few hours of its transmission to the ground, is processed onto CD-ROMs for archiving and distribution to the Co-Investigators, and summary data is generated for viewing at the FAST Web site or for retrieval as key parameter data from NASA or the FAST Web site. Daily science command loads and real-time commands are generated to optimize science data collection. This paper describes the FAST ground operations performed at Berkeley and discusses the data analysis software and tools that allow the large data volume (>1 terabyte) to be accessed quickly and efficiently by the scientists. These tasks are more extensive than those performed in the past by science institutions and can serve as a model for future small programs to reduce cost and maximize science return.