The EOS prototype validation exercise (PROVE) at Jornada: Overview and lessons learned

Citation
J.L. Privette et al., The EOS prototype validation exercise (PROVE) at Jornada: Overview and lessons learned, Remote Sens. Environ., 74(1), 2000, pp. 1-12
Citations number
25
Subject Categories
Earth Sciences
Journal title
REMOTE SENSING OF ENVIRONMENT
ISSN journal
00344257 → ACNP
Volume
74
Issue
1
Year of publication
2000
Pages
1 - 12
Database
ISI
SICI code
0034-4257(200010)74:1<1:TEPVE(>2.0.ZU;2-6
Abstract
The Earth Observing System (EOS) instrument teams must validate the operational products they produce from the Terra spacecraft data. As a pilot for future validation activities, four EOS teams (MODIS, MISR, ASTER, and Landsat-7) and community experts conducted an 11-day field campaign in May 1997 near Las Cruces, NM. The goals of the Prototype Validation Exercise (PROVE) included (1) gaining experience in the collection and use of field data for EOS product validation; (2) developing coordination, measurement, and data-archiving protocols; and (3) compiling a synoptic land and atmospheric data set for testing algorithms. PROVE was held at the USDA-Agricultural Research Service's (ARS) Jornada Experimental Range, an expansive desert plateau hosting a complex mosaic of grasses and shrubs. Most macroscopic variables affecting the radiation environment were measured with ground-based, airborne (including AVIRIS and laser altimeter), and space-borne sensors (including AVHRR, Landsat TM, SPOT, POLDER, and GOES). The Oak Ridge Distributed Active Archive Center (DAAC) then used campaign data sets to prototype Mercury, its Internet-based data harvesting and distribution system. This article provides general information about PROVE and assesses the progress made toward the campaign goals. Primary successes included the rapid campaign formulation and execution, measurement protocol development, and the significant collection, reduction, and sharing of data among participants. However, the PROVE data were used primarily for arid-land research and model validation rather than for validating satellite products, and the data were slow to reach the DAAC and hence the public domain.
The lessons learned included: (1) validation campaigns can be rapidly organized and implemented if there are focused objectives and on-site facilities and expertise; (2) data needs, organization, storage, and access issues must be addressed at the onset of campaign planning; and (3) the end-to-end data collection, release, and publication environment may need to be readdressed by program managers, funding agencies, and journal editors if rapid and comprehensive validation of operational satellite products is to occur. Published by Elsevier Science Inc.