Numerical ensemble forecasting is a powerful tool that drives many risk analysis efforts and decision-making tasks. These ensembles are composed of individual simulations that each uniquely model a possible outcome for a common event of interest: e.g., the direction and force of a hurricane, or the path of travel and mortality rate of a pandemic. This paper presents a new visual strategy to help quantify and characterize a numerical ensemble's predictive uncertainty: i.e., the ability of ensemble constituents to accurately and consistently predict an event of interest based on ground truth observations. Our strategy employs a Bayesian framework to first construct a statistical aggregate from the ensemble. We extend the information obtained from the aggregate with a visualization strategy that characterizes predictive uncertainty at two levels: a global level, which assesses the ensemble as a whole, and a local level, which examines each of the ensemble's constituents. Through this approach, modelers can better assess the predictive strengths and weaknesses of the ensemble as a whole and of its individual models. We apply our method to two datasets to demonstrate its broad applicability.
DOI: http://dx.doi.org/10.1109/TVCG.2013.138
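The abstract does not specify the aggregation model, so the following is only a minimal sketch of the general idea, assuming a Gaussian observation-error model with a user-supplied `sigma_obs`: each ensemble member receives a posterior weight proportional to the likelihood of the ground-truth observations under its forecast, the weighted mean and spread serve as a "global" aggregate, and the per-member weights give a "local" view. The function name, error model, and parameters are illustrative assumptions, not the authors' implementation.

```python
# Sketch only: Bayesian aggregation of an ensemble against ground truth.
# Assumes Gaussian observation error with known sigma_obs (an assumption,
# not taken from the paper).
import numpy as np

def bayesian_aggregate(predictions, observations, sigma_obs=1.0):
    """predictions: (n_members, n_obs) per-member forecasts.
    observations: (n_obs,) ground-truth values.
    Returns posterior member weights, the weighted aggregate, and its spread."""
    resid = predictions - observations                      # (n_members, n_obs)
    loglik = -0.5 * np.sum((resid / sigma_obs) ** 2, axis=1)
    logpost = loglik - np.max(loglik)                        # uniform prior over members
    weights = np.exp(logpost)
    weights /= weights.sum()
    aggregate = weights @ predictions                        # "global" weighted forecast
    spread = np.sqrt(weights @ (predictions - aggregate) ** 2)
    return weights, aggregate, spread

rng = np.random.default_rng(0)
truth = np.sin(np.linspace(0.0, 3.0, 20))
members = truth + rng.normal(0.0, [[0.2], [0.5], [1.0]], size=(3, 20))
w, agg, spr = bayesian_aggregate(members, truth, sigma_obs=0.5)
print("posterior member weights (local view):", np.round(w, 3))
```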
Med Phys
January 2025
Department of Radiation Oncology, Stanford University, Palo Alto, California, USA.
Background: Dosimetric commissioning and quality assurance (QA) for linear accelerators (LINACs) present a significant challenge for clinical physicists due to the high measurement workload and stringent precision standards. This challenge is exacerbated for radiosurgery LINACs because of increased measurement uncertainty and more demanding setup accuracy for small-field beams. Optimizing physicists' effort during beam measurements while ensuring the quality of the measured data is crucial for clinical efficiency and patient safety.
J Chem Phys
January 2025
Department of Materials Science and Engineering, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139, USA.
Generating a dataset that is representative of the accessible configuration space of a molecular system is crucial for the robustness of machine-learned interatomic potentials. However, the complexity of molecular systems, characterized by intricate potential energy surfaces with numerous local minima and energy barriers, presents a significant challenge. Traditional methods of data generation, such as random sampling or exhaustive exploration, are either intractable or may fail to capture rare but highly informative configurations.
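As a hedged illustration of the "random sampling" baseline mentioned above (not the paper's method), the sketch below generates configurations by applying Gaussian displacements to an equilibrium geometry; the water geometry, displacement scale, and function name are assumptions chosen only for demonstration.

```python
# Illustrative sketch: random sampling of molecular configurations via
# Gaussian perturbation of an equilibrium geometry (angstrom units assumed).
import numpy as np

def random_configurations(equilibrium_xyz, n_samples=100, sigma=0.05, seed=0):
    """Return n_samples perturbed copies of an (n_atoms, 3) geometry."""
    rng = np.random.default_rng(seed)
    noise = rng.normal(0.0, sigma, size=(n_samples,) + equilibrium_xyz.shape)
    return equilibrium_xyz + noise

# Roughly equilibrium water geometry, used here only as a toy example.
water = np.array([[0.000,  0.000,  0.117],   # O
                  [0.000,  0.757, -0.469],   # H
                  [0.000, -0.757, -0.469]])  # H
configs = random_configurations(water, n_samples=10)
print(configs.shape)  # (10, 3, 3)
```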
J Appl Stat
May 2024
Department of Biostatistics, College of Public Health, University of Iowa, Iowa City, IA, USA.
Ischemic stroke is responsible for significant morbidity and mortality in the United States and worldwide. Stroke treatment optimization requires emergency medical personnel to make rapid triage decisions concerning destination hospitals that may differ in their ability to provide highly time-sensitive pharmaceutical and surgical interventions. These decisions are particularly crucial in rural areas, where transport choices can have a large impact on treatment times, often involving a trade-off between a delay in pharmaceutical therapy and a delay in endovascular thrombectomy.
Int J Thermophys
January 2024
Material Measurement Laboratory, Applied Chemicals and Materials Division, National Institute of Standards and Technology, Boulder, CO 80305, USA.
The thermal conductivity of liquid trans-1,2-dichloroethene (R-1130(E)) was measured at temperatures ranging from 240 K to 340 K and pressures up to 25 MPa using a transient hot-wire instrument. A total of 447 thermal conductivity data points were measured along six isotherms. Each isotherm includes data at nine pressures, which were chosen to be at equal density increments starting at a pressure of 0.
J R Stat Soc Ser A Stat Soc
January 2025
Biostatistics, University of Michigan, 1415 Washington Heights, Michigan 48109, USA.
Model integration refers to the process of incorporating a fitted historical model into the estimation of a current study to increase statistical efficiency. Integration can be challenging when the current model includes new covariates, leading to potential model misspecification. We present and evaluate seven existing and novel model integration techniques, which employ both likelihood constraints and Bayesian informative priors.
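The abstract does not detail the seven techniques, so the following is only a minimal sketch of the "Bayesian informative prior" idea under stated assumptions: the current linear model is estimated with a Gaussian prior centered on the historical coefficients, while covariates that are new to the current study receive an effectively flat prior. All names, values, and the known-error-variance assumption are illustrative, not the authors' methods.

```python
# Sketch only: MAP estimate for a current linear model with a Gaussian prior
# centered on historical coefficients; new covariates get near-zero precision.
import numpy as np

def map_with_historical_prior(X, y, beta_hist, prior_prec, sigma2=1.0):
    """X: (n, p) current-study design; y: (n,) outcomes.
    beta_hist: (p,) prior means (0 for covariates new to the current study).
    prior_prec: (p,) prior precisions (near 0 for new covariates)."""
    Lam = np.diag(prior_prec)
    A = X.T @ X / sigma2 + Lam
    b = X.T @ y / sigma2 + Lam @ beta_hist
    return np.linalg.solve(A, b)

rng = np.random.default_rng(1)
n, beta_true = 50, np.array([1.0, -2.0, 0.5])   # third coefficient is "new"
X = rng.normal(size=(n, 3))
y = X @ beta_true + rng.normal(scale=1.0, size=n)
beta_hist = np.array([1.1, -1.9, 0.0])           # from the fitted historical model
prior_prec = np.array([25.0, 25.0, 1e-6])        # strong prior on old, flat on new
print(map_with_historical_prior(X, y, beta_hist, prior_prec))
```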