Grand Challenges:
Substantial progress has been made over the last several decades in quantifying and communicating hydrologic uncertainty (uncertainty quantification is used here in a broad sense, including parameter estimation, sensitivity analysis, uncertainty propagation, and experimental data and data-worth analysis for uncertainty reduction). For example, owing to the development of public-domain software (e.g., PEST, UCODE, and DREAM), uncertainty quantification using regression and Bayesian methods has become common practice not only in academia but also in the consulting industry and government agencies. However, the hydrologic uncertainty community still faces several grand challenges that have not been fully resolved, and new challenges are emerging under changing hydrologic and environmental conditions. Below are three grand challenges that may be addressed in the coming decade:
(1) Software development: Software for supporting decision-making and for communicating uncertainty to decision-makers and stakeholders is still needed. Given that a number of software packages are already in the public domain, it appears necessary to launch a community software-development effort, such as building libraries for uncertainty quantification, visualization, and communication. An effort is also needed to collaborate closely with the developers of physical models, so that uncertainty quantification can be built into modeling software as a module for efficient and effective operation.
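As a toy illustration of the kind of reusable building block such uncertainty-quantification libraries might provide, the sketch below implements a minimal random-walk Metropolis sampler for a single parameter (the function name, the example posterior, and all settings are hypothetical and not drawn from any of the packages mentioned above):

```python
import numpy as np

def metropolis(log_post, x0, n_samples, step=1.0, seed=0):
    """Minimal random-walk Metropolis sampler for a scalar parameter."""
    rng = np.random.default_rng(seed)
    x = float(x0)
    lp = log_post(x)
    samples = np.empty(n_samples)
    for i in range(n_samples):
        x_new = x + step * rng.standard_normal()
        lp_new = log_post(x_new)
        # Accept with probability min(1, exp(lp_new - lp)).
        if np.log(rng.uniform()) < lp_new - lp:
            x, lp = x_new, lp_new
        samples[i] = x
    return samples

# Hypothetical target: a standard normal posterior for one parameter.
log_post = lambda x: -0.5 * x**2
chain = metropolis(log_post, x0=3.0, n_samples=20_000)
burned = chain[5_000:]  # discard burn-in
print(f"posterior mean ~ {burned.mean():.2f}, sd ~ {burned.std():.2f}")
```

A community library would wrap routines like this behind a stable interface, with multi-chain diagnostics and visualization built on top.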
(2) Information and knowledge extraction from data: While new technologies for collecting data across multiple scales are always needed, it is of paramount importance to develop methodologies that can extract information and knowledge from those data. This includes identifying new and overlooked data needs (e.g., water management data such as water use), revisiting existing data (e.g., data collected by NASA or NOAA that have not yet been analyzed), and developing machine learning and deep learning methods suitable for hydrologic research. Machine learning is a hot topic in many research fields, but its value for hydrologic uncertainty quantification (especially for reducing model structure uncertainty) has not been intensively explored.
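As a minimal sketch of how a learned correction might reduce model structure error, the hypothetical example below fits a polynomial correction to the residuals between a deliberately biased model and synthetic observations (the model, data, and degree are invented for illustration; a real application would use richer learning methods and field data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setting: a physical model with structural bias,
# plus noisy observations of the "true" process.
x = np.linspace(0.0, 1.0, 50)
physical_model = 2.0 * x                                      # misses curvature
observations = 2.0 * x + 1.5 * x**2 + rng.normal(0.0, 0.05, x.size)

# Learn a structural-error correction from the residuals via
# polynomial least squares (a minimal stand-in for a learned model).
residuals = observations - physical_model
coeffs = np.polyfit(x, residuals, deg=2)
correction = np.polyval(coeffs, x)

corrected = physical_model + correction
rmse_before = np.sqrt(np.mean((observations - physical_model) ** 2))
rmse_after = np.sqrt(np.mean((observations - corrected) ** 2))
print(f"RMSE before correction: {rmse_before:.3f}, after: {rmse_after:.3f}")
```

The residual-learning step is where more flexible machine learning methods could substitute for the polynomial, at the cost of needing careful uncertainty treatment of the learned correction itself.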
(3) Computationally efficient algorithms: Uncertainty quantification nowadays relies mainly on Monte Carlo approaches, which are computationally expensive, particularly for new models that are more complex than those of several decades ago. Computationally efficient algorithms (e.g., parallel computing and surrogate modeling) will enable more comprehensive and accurate uncertainty quantification. Algorithm development requires close collaboration with scientists in other disciplines, such as applied mathematics, statistics, and computational science.
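The surrogate-modeling idea can be sketched in three steps: run the expensive model a few times, fit a cheap approximation, and do the Monte Carlo sampling on the surrogate (the model, sample sizes, and polynomial surrogate below are hypothetical choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical "expensive" model of one uncertain parameter.
def expensive_model(k):
    return np.exp(-k) + 0.5 * k**2   # imagine each run costs hours

# Step 1: a small design of experiments (few expensive runs).
k_train = np.linspace(0.0, 2.0, 9)
y_train = expensive_model(k_train)

# Step 2: fit a cheap polynomial surrogate to the training runs.
coeffs = np.polyfit(k_train, y_train, deg=4)

def surrogate(k):
    return np.polyval(coeffs, k)

# Step 3: Monte Carlo sampling on the surrogate, not the expensive model.
k_samples = rng.uniform(0.0, 2.0, size=100_000)
y_samples = surrogate(k_samples)
print(f"predictive mean = {y_samples.mean():.3f}, std = {y_samples.std():.3f}")

# Sanity check: surrogate error at held-out parameter values.
k_test = rng.uniform(0.0, 2.0, size=20)
max_err = np.max(np.abs(surrogate(k_test) - expensive_model(k_test)))
print(f"max surrogate error = {max_err:.4f}")
```

Here nine expensive runs stand in for 100,000; the trade-off is that the surrogate's approximation error must be checked (as in the last step) and ideally folded into the uncertainty budget.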
We feel that the research field of hydrologic uncertainty is at a transition stage in two senses. First, substantial progress has been made in the past, but we need to cover the last mile. For example, we have developed many methods for uncertainty quantification, but we still need to work on efficient and effective communication of uncertainty to decision-makers and stakeholders. Second, we face new challenges in developing more advanced methodologies to make full use of existing data and of emerging computational hardware and algorithms.