The Risk Institute Online

During the COVID-19 pandemic, we have moved our resources and events online. Follow this page to keep up to date with what's happening in the Risk Institute.

Subscribe to receive email notifications about new talks, events, and resources from the Risk Institute.

SCHEDULE

Wednesday 27th May - 14:00

Dan Rozell

Wednesday 1st July - 14:00

Michael Balch

Wednesday 8th July - 14:00

Jürgen Hackl

Wednesday 15th July - 14:00

Alan Calder

Wednesday 22nd July - 14:00

Michael Balch

Wednesday 29th July - 14:00

Renata Schiavo

RIO Group Meetings

Wednesdays, 11am (BST)

We have group meetings every Wednesday morning for informal conversation about whatever you want! Conversation ranges from That Interesting Paper That Made You Spill Your Coffee to controversial Twitter takes to how you're faring in lockdown. It's drop-in/drop-out, with no pressure to attend for the entire time.

RELATED NEWS

Wednesday 27th May

Dan Rozell: Technological Risk Attitudes in Science Policy

Daniel Rozell has two decades of experience in the fields of engineering and science, working in private industry and for public regulatory agencies. Dr. Rozell is affiliated with the Department of Technology and Society at Stony Brook University in New York as a Research Professor. His recent book, Dangerous Science: Science Policy and Risk Analysis for Scientists and Engineers, is available open access from Ubiquity Press.

Science and technology policy decisions must often be made before there is sufficient data, widely accepted theories, or consensus in the scientific community. Furthermore, what constitutes credible science is sometimes itself a contentious issue. The result is that we frequently encounter science and technology policy debates where well-intentioned and reasonable individuals can arrive at different conclusions. In the face of inconclusive data, people tend to evaluate new information using heuristics that include their pre-existing attitudes about science and technology.

Read the talk notes here, or watch the full talk below.

Wednesday 10th June

Yan Wang: Generalized Interval Probability and Its Applications in Engineering

Yan Wang, Ph.D., is a Professor of Mechanical Engineering at the Georgia Institute of Technology. He is interested in multiscale systems engineering, modeling and simulation, and uncertainty quantification, and has published over 90 archival journal papers and 80 peer-reviewed conference papers. He recently edited the first book of its kind on uncertainty quantification in multiscale materials modeling.

Uncertainty in engineering analysis is composed of two components. One is the inherent randomness due to fluctuation and perturbation, known as aleatory uncertainty; the other is epistemic uncertainty, which stems from a lack of perfect knowledge about the system. Imprecise probability provides a compact way to quantify and differentiate the two components, where the probability measures randomness and the interval range quantifies the imprecision associated with the probability. Several forms of imprecise probability have been proposed, such as Dempster-Shafer theory, coherent lower previsions, p-boxes, possibility theory, fuzzy probability, and random sets. To simplify the computation for engineering analysis, we introduced generalized interval probability, where the interval bounds take the form of directed or modal intervals instead of classical set-based intervals. Interval calculation is based on the more intuitive Kaucher interval arithmetic. Generalized interval probability has been applied to the study of stochastic dynamics, hidden Markov models, Kalman filtering, random set sampling, and molecular dynamics simulation.
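
For readers who want a feel for the interval side of this machinery, the sketch below shows plain set-based interval probability arithmetic for two independent events. It is an illustration only: generalized interval probability works with directed (Kaucher) intervals rather than the set-based intervals used here, and the class and method names are ours, not from the speaker's software.

```python
# A minimal sketch of set-based interval probability arithmetic (illustration only;
# generalized interval probability uses directed/Kaucher intervals instead).

from dataclasses import dataclass

@dataclass
class IntervalProb:
    lo: float  # lower probability bound
    hi: float  # upper probability bound

    def __post_init__(self):
        assert 0.0 <= self.lo <= self.hi <= 1.0, "bounds must satisfy 0 <= lo <= hi <= 1"

    def complement(self):
        # P(not A) = 1 - P(A), so the bounds swap.
        return IntervalProb(1.0 - self.hi, 1.0 - self.lo)

    def and_independent(self, other):
        # P(A and B) = P(A) * P(B) for independent events; the product is
        # monotone in both arguments, so the bounds multiply directly.
        return IntervalProb(self.lo * other.lo, self.hi * other.hi)

    def or_independent(self, other):
        # P(A or B) = P(A) + P(B) - P(A)P(B); also monotone in both arguments.
        return IntervalProb(self.lo + other.lo - self.lo * other.lo,
                            self.hi + other.hi - self.hi * other.hi)

# Example: imprecise failure probabilities of two independent components.
pump = IntervalProb(0.01, 0.05)
valve = IntervalProb(0.02, 0.04)
print(pump.and_independent(valve))   # both fail
print(pump.or_independent(valve))    # at least one fails
```

Kaucher arithmetic extends this to "improper" intervals whose lower bound exceeds their upper bound, which is what the directed intervals mentioned in the abstract refer to.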

Read the talk notes here, or watch the full talk below.

Wednesday 17th June

Jürgen Hackl: Complex Infrastructure Systems: Intelligent risk and resilience assessments

Dr Jürgen Hackl is a Lecturer at the University of Liverpool and a member of the Data Analytics Group at the University of Zurich. He received his doctorate in Engineering Science from ETH Zurich in July 2019. His research interests lie in complex urban systems and span both computational modelling and network science. Much of his work has been on improving the understanding, design, and performance of complex interdependent infrastructure systems affected by natural hazards. Presently, he works on gaining a better understanding of how the topology of a system influences dynamic processes and how this can be used to decrease the complexity of computational models. To transfer this knowledge to industry, he co-founded the start-up Carmentae Infrastructure Management, which helps infrastructure managers in their decision-making processes. He also has a long history of supporting a sustainable digital world by developing and maintaining various open-source projects.

Dr Hackl's presentation focuses on complex infrastructure systems (such as transportation and supply chains), intelligent risk and resilience assessments for climate change, and integrated solutions to future challenges facing our cities and society. To gain a deeper understanding of such complex systems, new mathematical approaches and computational models are needed. To achieve this, we have to go beyond the classical boundaries of the individual disciplines and work in interdisciplinary teams. In this sense, smart mobility and smart cities have developed as new research areas.

The aim of this presentation is to give an overview of how complex infrastructure systems are currently modelled; how novel network-analytic methods for spatial-temporal networks can be used to gain a better understanding of our complex urban environment; how advances in data analytics and machine learning provide new ways to extract knowledge and support decision-making processes; and how cloud-based simulations might offer a solution for computational risk and resilience assessments of complex infrastructure systems.

Monday 22nd June

Noémie Le Carrer: Making sense of ensemble predictions in weather forecasting: Can possibility theory overcome the limitations of standard probabilistic interpretations?

Ensemble forecasting is widely used in weather prediction to reflect uncertainty about high-dimensional, nonlinear systems with extreme sensitivity to initial conditions. Results are generally interpreted probabilistically, but this interpretation is not reliable because of the chaotic nature of the atmospheric system's dynamics and the fact that the ensembles were not actually generated probabilistically. We show that probability distributions are not the best way to extract the information contained in ensemble prediction systems. A more workable possibilistic interpretation of ensemble predictions takes inspiration from fuzzy and possibility theories. This framework also integrates other sources of information, such as the insight into the local system's dynamics provided by the analog method, and provides more meaningful quantitative results.
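
To make the possibilistic idea concrete, here is a minimal sketch of one generic way to turn an ensemble into a consonant possibility distribution, using nested central intervals of the empirical distribution. It is an illustration of the general idea only, not the specific construction developed in the talk, and the function name is ours.

```python
# Illustrative sketch: build a consonant possibility distribution from an
# ensemble forecast by nesting central intervals of the empirical distribution.
# This is a generic construction, not the method discussed in the talk.

import numpy as np

def possibility_from_ensemble(ensemble, x):
    """Plausibility-style score for candidate outcome x, given ensemble members.

    The alpha-cuts of the returned distribution are the central
    (1 - alpha) * 100% intervals of the ensemble, so the cuts are nested
    (consonant) by construction.
    """
    ensemble = np.asarray(ensemble, dtype=float)
    # Empirical CDF value of x within the ensemble.
    F = (ensemble <= x).mean()
    # pi(x) = 2 * min(F(x), 1 - F(x)): 1 near the ensemble median, 0 outside the range.
    return float(np.clip(2.0 * min(F, 1.0 - F), 0.0, 1.0))

# Example: a 50-member temperature ensemble and a few candidate outcomes.
rng = np.random.default_rng(0)
members = 15.0 + 2.0 * rng.standard_normal(50)
for x in (11.0, 15.0, 19.0):
    print(x, possibility_from_ensemble(members, x))
```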

Wednesday 24th June

Ryan Martin: False confidence, imprecise probabilities, and valid statistical inference

Dr. Ryan Martin is a Professor in the Department of Statistics at North Carolina State University, USA. His research interests include asymptotics, empirical Bayes analysis, high- and infinite-dimensional inference problems, foundations of statistics, imprecise probability, and mixture models, among other topics. He is co-author of the monograph Inferential Models and co-founder of the Researchers.One peer review and publication platform.

Despite remarkable advances in statistical theory, methods, and computing in the last 50+ years, fundamental questions about probability and its role in statistical inference remain unanswered. There is no shortage of ways to construct data-dependent probabilities for the purpose of inference, Bayes being the most common, but none are fully satisfactory. One concern is the recent discovery that, for any data-dependent probability, there are false hypotheses about the unknown quantity of interest that tend to be assigned high probability -- a phenomenon we call false confidence -- which creates a risk for systematically misleading inferences. Here I argue that these challenges can be overcome by broadening our perspective, allowing for uncertainty quantification via imprecise probabilities. In particular, I will demonstrate that it is possible to achieve valid inference, free of false confidence and the associated risks of systematic errors, by working with a special class of imprecise probabilities driven by random sets. Examples will be given to illustrate the key concepts and results, and connections between this new framework and familiar things from classical statistics will be made.
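
For readers unfamiliar with the validity property mentioned here, the display below sketches the Martin-Liu validity criterion in what we believe is its standard form; see the Inferential Models monograph for the precise statement.

```latex
% Sketch of the Martin--Liu validity criterion (standard formulation, as we
% understand it; consult the Inferential Models monograph for the exact version).
% An inferential model with belief function bel_X is valid if, for every
% assertion A about the unknown parameter and every alpha in (0,1), the belief
% assigned to A when A is false is rarely large:
\[
  \sup_{\theta \notin A} \;
  \mathsf{P}_{X \mid \theta}\!\left\{ \mathrm{bel}_X(A) \ge 1 - \alpha \right\}
  \;\le\; \alpha ,
  \qquad \forall\, \alpha \in (0,1),\ \forall\, A \subseteq \Theta .
\]
% Equivalently, when A is false, bel_X(A) is stochastically no larger than a
% Uniform(0,1) random variable, which rules out false confidence in A.
```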

Wednesday 1st July

Michael Balch: Numerical Methods for Propagating Confidence Curves

Michael Balch is the Technical Lead at Alexandria Validation Consulting, LLC, where he designs the algorithms underpinning the company's software and renders all consulting services personally. Dr. Balch has twelve years of experience as a research-practitioner specializing in uncertainty quantification. He has worked on applications spanning engineering, medicine, defense, and finance. His career has included time as a contractor at both NASA Langley and AFRL Wright-Patterson. He received his doctorate in Aerospace Engineering from Virginia Tech in 2010.

Confidence curves—aka consonant confidence structures, aka inferential models—fuse the comprehensiveness and flexibility of Bayesian inference with the statistical performance and rigor of classical frequentist inference. Rooted in possibility theory, these structures visualize the long-known connection between confidence intervals and significance testing. More importantly, they enable the statistically reliable assignment of belief to propositions (or sets, hypotheses, etc.) about a fixed parameter being inferred from random data. This presentation explores a Monte Carlo approach to propagating these structures through black-box functions, a necessity if these methods are to be widely applied in engineering work.
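
As a rough picture of what propagation can look like, the sketch below pushes a consonant confidence curve on a scalar input through a black-box function using Monte Carlo sampling within each alpha-cut. It is a generic alpha-cut scheme on a made-up toy model, not the algorithm presented in the talk.

```python
# Generic sketch of alpha-cut propagation of a consonant confidence curve
# (possibility distribution) through a black-box function via Monte Carlo.
# Toy example for illustration only; not the method from the talk.

import numpy as np

def blackbox(theta):
    # Stand-in for an expensive engineering model.
    return np.exp(-theta) + 0.5 * theta**2

def alpha_cut(center, half_width_at_zero, alpha):
    # Toy triangular possibility on the input: the alpha-cut shrinks linearly
    # from [center - w, center + w] at alpha = 0 to the point {center} at alpha = 1.
    w = half_width_at_zero * (1.0 - alpha)
    return center - w, center + w

def propagate(alphas, n_samples=2000, seed=0):
    """Return output alpha-cuts [lo, hi] for each requested alpha level."""
    rng = np.random.default_rng(seed)
    cuts = {}
    for alpha in alphas:
        lo, hi = alpha_cut(center=1.0, half_width_at_zero=0.8, alpha=alpha)
        theta = rng.uniform(lo, hi, size=n_samples)  # Monte Carlo search of the cut
        y = blackbox(theta)
        # The output alpha-cut is (approximately) the range of the model over the
        # input alpha-cut, following the extension principle.
        cuts[alpha] = (float(y.min()), float(y.max()))
    return cuts

if __name__ == "__main__":
    for a, (lo, hi) in propagate(alphas=[0.0, 0.25, 0.5, 0.75, 0.95]).items():
        print(f"alpha = {a:4.2f}: output cut = [{lo:.3f}, {hi:.3f}]")
```

Up to Monte Carlo noise, the output cuts are nested whenever the input cuts are, so the result is again a consonant structure on the output quantity.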

Further discussion with Michael Balch about his talk

Wednesday 8th July

Jürgen Hackl: ABM (agent-based modeling) simulations of epidemic spreading in urban areas

Human mobility is a key element in the understanding of epidemic spreading. Thus, correctly modeling and quantifying human mobility is critical for studying large-scale spatial transmission of infectious diseases and improving epidemic control. In this study, a large-scale agent-based transport simulation (MATSim) is linked with a generic epidemic spread model to simulate the spread of communicable diseases in an urban environment. The use of an agent-based model makes it possible to reproduce the real-world behavior of individuals' daily paths in an urban setting and to capture the interactions among them in the form of a spatial-temporal social network. This model is used to study seasonal influenza outbreaks in the metropolitan area of Zurich, Switzerland. The observations from the agent-based model are compared with results from classical SIR models. The model presented is a prototype that can be used to analyze multiple scenarios of disease spread at an urban scale, considering variations of different model parameter settings. The results of this simulation can help to improve comprehension of disease spread dynamics and to take better steps towards the prevention and control of an epidemic.
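
For readers unfamiliar with the classical baseline mentioned above, the snippet below sketches a textbook SIR compartmental model with forward-Euler integration; the parameter values are arbitrary placeholders and are not taken from the study.

```python
# Minimal sketch of the classical SIR model used as a baseline comparison for the
# agent-based simulation (arbitrary illustrative parameters, not from the study).

import numpy as np

def simulate_sir(beta=0.3, gamma=0.1, s0=0.999, i0=0.001, days=160, dt=0.1):
    """Integrate dS/dt = -beta*S*I, dI/dt = beta*S*I - gamma*I, dR/dt = gamma*I
    with a simple forward-Euler scheme; states are population fractions."""
    s, i, r = s0, i0, 1.0 - s0 - i0
    history = []
    for step in range(int(days / dt)):
        new_infections = beta * s * i * dt
        new_recoveries = gamma * i * dt
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        if step % int(1 / dt) == 0:          # record once per simulated day
            history.append((s, i, r))
    return np.array(history)

traj = simulate_sir()
peak_day = int(traj[:, 1].argmax())
print(f"Peak infected fraction {traj[peak_day, 1]:.3f} on day {peak_day}")
```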

Wednesday 15th July

Alan Calder: Resolving Thermonuclear Supernovae

Alan Calder is an associate professor of Physics and Astronomy at Stony Brook University in New York, working on the nuclear physics of explosive astrophysical phenomena. With his extensive background in large-scale computing, he is deputy director of the Institute for Advanced Computational Science. He has held research appointments at the National Center for Supercomputing Applications and at the University of Chicago's Center for Astrophysical Thermonuclear Flashes. His research is principally in bright stellar explosions known as Type Ia supernovae, which produce and distribute heavy elements, making them important for galactic chemical evolution, and whose light curves can be used as distance indicators for cosmological studies of the expansion of the universe. His simulations explore how stellar age and composition affect an event's brightness, which is critical to addressing the variability of these explosions, itself a source of significant uncertainty in cosmology.

Thermonuclear (Type Ia) supernovae are bright stellar explosions distinguished by light curves that can be calibrated to allow for their use as "standard candles" for measuring cosmological distances. While many fundamental questions remain, it is accepted that the setting of these events involves a white dwarf star (or two), and that the explosion is powered by explosive thermonuclear burning under degenerate conditions. Modeling these events presents a challenge because the outcome of an event sensitively depends on the details of the physics occurring on scales orders of magnitude smaller than the star. Such "microphysics" includes nuclear burning, fluid instabilities, and turbulence. I will give an overview of our understanding of thermonuclear supernovae and describe our approach to capturing these sub-grid-scale processes in macroscopic simulations.

Wednesday 22nd July

Michael Balch: Beyond False Confidence

For those who take frequentist notions of reliability seriously, normative statistical inference remains an unresolved challenge. For any one problem, there are multiple solutions that satisfy the Martin-Liu validity criterion. Some of these are obviously more efficient than others, but the vigorous pursuit of efficient and reliable inference can yield counter-intuitive results. This presentation explores three counter-intuitive phenomena that can arise in the use of confidence curves. Two of these phenomena hint at the need for additional constraints on statistical inference, beyond simple reliability.