| Document type: | Article |
| --- | --- |
| Journal or periodical title: | Methods in Ecology and Evolution |
| Publisher: | Wiley |
| Place of publication: | Hoboken |
| Volume: | 13 |
| Issue number: | 12 |
| Page range: | pp. 2757-2770 |
| Date: | 2022 |
| Institutions: | Biologie und Vorklinische Medizin > Institut für Pflanzenwissenschaften > Arbeitsgruppe Theoretische Ökologie (Prof. Dr. Florian Hartig) |
| Identification number: | |
| Keywords: | BIG-DATA; BIOMASS; MISSPECIFICATION; PARAMETERS; TEMPERATE; INFERENCE; CARBON; Bayesian inference; inverse modelling; model calibration; model discrepancy; multiple constraints; predictive uncertainty; structural model error; systematic data bias |
| Dewey Decimal Classification: | 500 Natural sciences and mathematics > 580 Plants (Botany) |
| Status: | Published |
| Peer reviewed: | Yes, this version has been peer-reviewed |
| Created at the University of Regensburg: | Yes |
| Document ID: | 57353 |
Abstract
Calibrating process-based models using multiple constraints often improves the identifiability of model parameters, helps to avoid several errors compensating for each other and produces model predictions that are more consistent with the underlying processes. However, using multiple constraints can lead to predictions for some variables getting worse. This is particularly common when combining data sources with very different sample sizes. Such unbalanced model-data fusion efforts are becoming increasingly common, for example when combining manual and automated measurements. Here we use a series of simulated virtual data experiments to demonstrate and disentangle the underlying causes of issues that can occur when calibrating models with multiple unbalanced constraints in combination with systematic errors in models and data. We propose a diagnostic tool to help identify whether a calibration is failing due to these factors. We also test the utility of adding terms representing uncertainty in systematic model/data error to calibrations. We show that unbalanced data by itself is not the problem: when fitting simulated data to the 'true' model, we can correctly recover the model parameters and the true dynamics of latent variables. However, when there are systematic errors in the model or the data, we cannot recover the correct parameters. Consequently, the modelled dynamics of the low-data-volume variables depart significantly from the true values. We demonstrate the utility of the diagnostic tool and show that it can also be used to identify the extent of the imbalance before the calibration starts to ignore the sparser data. Finally, we show that representing uncertainty in model structural errors and data biases in the calibration can greatly improve the model fit to low-volume data and improve the coverage of uncertainty estimates. We conclude that the underlying issue is not one of sample size or information content per se, despite the popularity of ad hoc approaches that focus on 'weighting' datasets to achieve balance. Our results emphasize the importance of considering model structural deficiencies and systematic data biases in the calibration of process-based models.
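The following is a minimal Python sketch, not taken from the paper, of the core idea the abstract describes: a joint calibration against one high-volume and one low-volume data stream, first naively and then with an explicit systematic-error term. The toy linear model `y = a * x`, the parameter names (`a_true`, `b_true`), the stream sizes and the Gaussian error assumptions are all illustrative stand-ins for the paper's process-based model and virtual data experiments.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# "True" process (hypothetical toy model): y = a * x, constrained by two streams.
a_true = 2.0
x_hi = rng.uniform(0, 10, size=10_000)   # high-volume stream (e.g. automated)
x_lo = rng.uniform(0, 10, size=20)       # low-volume stream (e.g. manual)

# The high-volume data carry a systematic bias b_true that the model lacks.
b_true = 1.5
y_hi = a_true * x_hi + b_true + rng.normal(0, 1.0, x_hi.size)
y_lo = a_true * x_lo + rng.normal(0, 1.0, x_lo.size)

def neg_log_lik(theta, with_bias_term):
    """Joint Gaussian negative log-likelihood over both data streams
    (unit error variance, constant terms dropped)."""
    a = theta[0]
    b = theta[1] if with_bias_term else 0.0
    r_hi = y_hi - (a * x_hi + b)   # bias term applied to the biased stream only
    r_lo = y_lo - a * x_lo
    return 0.5 * (np.sum(r_hi**2) + np.sum(r_lo**2))

# Naive joint calibration: the 10,000 biased points dominate the likelihood,
# dragging the process parameter away from the truth and degrading the fit
# to the sparse, unbiased data.
naive = minimize(neg_log_lik, x0=[1.0], args=(False,))

# Adding an explicit systematic-error parameter lets the calibration absorb
# the data bias instead of distorting the process parameter.
with_bias = minimize(neg_log_lik, x0=[1.0, 0.0], args=(True,))

print("naive a-hat:", naive.x[0])
print("bias-aware a-hat:", with_bias.x[0], " b-hat:", with_bias.x[1])
```

In this sketch the naive fit returns an `a` estimate pulled well away from `a_true`, even though the sparse stream is unbiased, while the bias-aware fit recovers both `a` and `b`. This mirrors the abstract's point that the failure stems from unrepresented systematic error rather than from sample-size imbalance per se, so re-weighting the datasets would treat the symptom rather than the cause.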
Metadata last modified: 29 Feb 2024 12:53