Coping with ever larger problems, models, and data bases

Authors
M.B. Beck
Citation
M.B. Beck, Coping with ever larger problems, models, and data bases, WATER SCI T, 39(4), 1999, pp. 1-11
Citations number
49
Subject categories
Environment/Ecology
Journal title
WATER SCIENCE AND TECHNOLOGY
ISSN journal
0273-1223
Volume
39
Issue
4
Year of publication
1999
Pages
1 - 11
Database
ISI
SICI code
0273-1223(1999)39:4<1:CWELPM>2.0.ZU;2-#
Abstract
Those who construct models, including models of the quality of the aquatic environment, are driven largely by the search for (theoretical) completeness in the products of their efforts. For if we know of something of potential relevance, and computational power is increasing, why should that something be left out? Those who use the results of such models are probably reassured by this imprimatur, of having supposedly based their decisions on the best available scientific evidence. Our models, and certainly those we would label "state-of-the-art", seem destined always to get larger. Some observations on possible strategies for coping with this largeness, while yet making well-reasoned and adequately buttressed decisions on how to manage the water environment, are the subject of this paper. Because it is so obvious, and because it has been the foundation of analytical enquiry for such a very long time, our point of departure is the classical procedure of disassembling the whole into its parts, with subsequent re-assembly of the resulting part solutions into an overall solution. This continues to serve us well, at least in terms of pragmatic decision-making, but perhaps not in terms of reconciling the model with the field observations, i.e., in terms of model calibration. If the indivisible whole is to be addressed, and it is large, contemporary studies show that we shall have to shed an attachment to locating the single, best decision and be satisfied instead with having identified a multiplicity of acceptably good possibilities. If, in the face of an inevitable uncertainty, there is then a concern for reassurance regarding the robustness of a specific course of action (chosen from among the good possibilities), significant recent advances in the methods of global (as opposed to local) sensitivity analysis are indeed timely. Ultimately, however, no matter how large and seemingly complete the model, whether we trust its output is a very strong function of whether this outcome tallies with our mental image of the given system's behaviour. The paper argues that largeness must therefore be pruned through the application of appropriate methods of model simplification, through procedures aimed directly at this issue of promoting the generation, corroboration, and refutation of high-level conceptual insights and understanding. The paper closes with a brief discussion of two aspects of the role of field observations in evaluating a (large) model: quality assurance of that model in the absence of any data; and the previously somewhat under-estimated challenge of reconciling large models with high-volume data sets. (C) 1999 IAWQ. Published by Elsevier Science Ltd. All rights reserved.