The Bayes Factor is a Bayesian tool for comparing two hypotheses, which is gaining popularity in psychological research and has been suggested as a replacement for the classical t-test. However, the Bayes Factor requires the specification of a prior distribution for the parameter of interest, which cannot be done unambiguously. In many situations where further research is needed, information is incomplete. This problem can be addressed in the context of imprecise probabilities by using only the available (incomplete) knowledge. In this approach, a set of prior distributions is used instead of a single prior, yielding a set of Bayes Factor results, which is called the Robust Bayes Factor. In my talk, I will present the results of a project in which the Bayes Factor was generalized to imprecise probabilities in a two-sample context with normally distributed data. The effect size between the two groups serves as the parameter of interest, and its prior was modeled as a set of normal distributions.
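The core idea can be sketched in a few lines. The snippet below is a minimal illustration, not the talk's actual method: it assumes a simplified known-variance setting in which the observed standardized effect is approximately normal around the true effect size, compares H1 (effect size drawn from a normal prior) against H0 (effect size zero), and then ranges over a set of normal priors to get lower and upper Bayes Factors. All function names and numbers are invented for illustration.

```python
import math

def normal_pdf(x, mean, var):
    """Density of a normal distribution with given mean and variance."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def bayes_factor(d_obs, se, prior_mean, prior_var):
    """BF10 for H1: delta ~ N(prior_mean, prior_var) vs H0: delta = 0,
    given an observed effect d_obs with standard error se (known-variance
    approximation, where d_obs ~ N(delta, se^2))."""
    marginal_h1 = normal_pdf(d_obs, prior_mean, prior_var + se ** 2)
    marginal_h0 = normal_pdf(d_obs, 0.0, se ** 2)
    return marginal_h1 / marginal_h0

def robust_bayes_factor(d_obs, se, prior_means, prior_vars):
    """Robust Bayes Factor: the range of BF10 over a set of normal priors."""
    bfs = [bayes_factor(d_obs, se, m, v)
           for m in prior_means for v in prior_vars]
    return min(bfs), max(bfs)

# Observed standardized effect 0.5 with standard error 0.2; priors centred
# between -0.5 and 0.5 with variances between 0.5 and 2.0 (all invented).
lo, hi = robust_bayes_factor(0.5, 0.2,
                             prior_means=[-0.5, 0.0, 0.5],
                             prior_vars=[0.5, 1.0, 2.0])
print(lo, hi)
```

Rather than a single number, the analysis reports the interval [lo, hi]; a conclusion is robust only if the whole interval supports it.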
Our Imprecise Tuesdays series hosts seminars and master classes. These events are open to anyone to attend, participate, or present! We welcome contributions from academics, students, industry and business, or the public. These informal events consist of a seminar, talk, or master class in any area you find interesting… if you find it interesting, we probably will too!
If you would like to propose a talk please contact email@example.com
Robust Bayes Factor.
Minimisation of the effect of aleatory uncertainties on dynamic systems by active control using the method of receptances.
This paper presents a method to reduce the effect of uncertainties on dynamic systems by means of active control. In the proposed approach, pole placement is performed iteratively using an optimisation algorithm with an objective function that includes the variance of the real and imaginary parts of each of the system's poles. The method is advantageous in that control gains are calculated using the method of receptances, which eliminates model-form uncertainty since only measured receptance data are used. Moreover, variances are extracted through a polynomial chaos expansion, which requires fewer samples than other techniques. The method is demonstrated numerically on a simple multi-degree-of-freedom system. It is shown that active control can be used in a way that not only places the poles of the system but also reduces their spread. Furthermore, it is shown that it is possible to directly relate uncertainty in the poles to meaningful, physically based uncertainty in the structural parameters.
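As a toy illustration of how a polynomial chaos expansion yields variances from relatively few evaluations, the sketch below expands a scalar response of a standard normal input in probabilists' Hermite polynomials and reads the mean and variance directly off the coefficients. This is a generic PCE example with invented names, not the paper's implementation.

```python
import math
import random

def hermite(k, x):
    """Probabilists' Hermite polynomials He_0..He_3."""
    return [1.0, x, x * x - 1.0, x ** 3 - 3.0 * x][k]

def pce_mean_variance(f, order=3, n_samples=100_000, seed=1):
    """Estimate PCE coefficients of f(X), X ~ N(0, 1), by Monte Carlo
    projection, then read mean and variance off the coefficients."""
    rng = random.Random(seed)
    xs = [rng.gauss(0.0, 1.0) for _ in range(n_samples)]
    coeffs = []
    for k in range(order + 1):
        # c_k = E[f(X) He_k(X)] / k!  (Hermite orthogonality)
        proj = sum(f(x) * hermite(k, x) for x in xs) / n_samples
        coeffs.append(proj / math.factorial(k))
    mean = coeffs[0]
    # Var[f(X)] = sum over k >= 1 of c_k^2 * k!
    variance = sum(c * c * math.factorial(k)
                   for k, c in enumerate(coeffs) if k > 0)
    return mean, variance

# f(x) = x^2 has mean 1 and variance 2 under a standard normal input.
mean, var = pce_mean_variance(lambda x: x * x)
print(mean, var)
```

In practice the coefficients would be computed by sparse quadrature or regression on far fewer model runs, which is where the sample-count advantage over brute-force Monte Carlo variance estimation comes from.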
Diego Estrada Lugo
An overview of Imprecise Probability, a review of the SIPTA summer school in Oviedo, Spain
Usually, the uncertainty attached to an event is represented by a probability distribution, in accordance with probability theory. This approach works well when all the parameters and data needed for the analysis are available. However, when information is scarce, vague, or conflicting, it is hard to justify a unique probability distribution, and a generalization of probability theory must be employed. For this reason, imprecise probabilities become important: they can represent the available knowledge and provide the tools to model and work with weaker states of information. In this talk, I will briefly describe my impressions and experiences with imprecise probability methodologies during the summer school that took place in the Faculty of Sciences of the University of Oviedo in Spain.
Diagnosis by Machine?
Abstract: Polymyalgia rheumatica (PMR) and giant cell arteritis (GCA) are inflammatory diseases that predominantly affect older people. PMR involves inflammation around the joints and GCA involves inflammation around the arteries. Both diseases are treated with long-term glucocorticoids (a type of steroid), which can be associated with serious toxicity, particularly in this age group. Diagnosis is therefore high-stakes, yet PMR is so common that most patients are never referred to a specialist at all; they are diagnosed and treated by their GP. The diagnosis of PMR is based on clinical expertise and experience, as there is no single diagnostic test specific to PMR, and there is considerable uncertainty around the condition at multiple levels, which is a particular problem if the GP has not seen much PMR before. Rheumatologists have attempted to solve this problem by generating “classification criteria” checklists, but these are not validated for clinical diagnosis. I propose that a clinical decision aid would make the risks and trade-offs in starting treatment for PMR more explicit, and would facilitate shared decision-making between doctor and patient. We have a rich dataset from a nearly-completed study of patients with suspected PMR (n=197), which might be used to help develop such a decision aid so that, in future, doctors will be able to make better decisions for their individual patients.
Speaker: Sarah Mackie is an Associate Clinical Professor in Vascular Rheumatology and Honorary Consultant Rheumatologist.
Her aim is to improve outcomes for patients with giant cell arteritis (GCA) and polymyalgia rheumatica (PMR). These conditions have historically not received much research attention but this is changing. We now have the opportunity to help patients receive a more rapid, accurate diagnosis and we are reaching a better understanding of the effects of long-term steroid treatment in these diseases. Advances in our knowledge, including the underlying basic science, are informing design of clinical trials of better treatment strategies in these diseases. Well-designed clinical trials will produce the evidence needed to change treatment pathways for the better.
Dr Peter Green
Towards the validation of dynamical models in regions where there is no data
Abstract: Computer models are often created because we need to make extrapolations in regions where there is no data. This makes validation challenging - how can we ensure that the model is suitable if it is to be applied in a region where there is no measurement data? The current paper proposes a method which can reveal flaws in a model which may be difficult to identify by other calibration/validation approaches. It specifically targets the situation where one is attempting to model a dynamical system which, it is believed, possesses some time-invariant calibration parameters. With our method we essentially allow these parameters to vary with time, even though it is believed that they are time-invariant. It is through this approach that we aim to identify key discrepancies - indications that a model has inherent flaws and, as a result, should not be used to influence decisions in regions where there is no data. The proposed method isn't necessarily a predictor of extrapolation performance, rather, it is a stringent test that, the authors believe, should be applied before extrapolation is attempted. The approach could therefore form a useful part of wider validation frameworks in the future.
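The idea of letting a supposedly time-invariant parameter vary can be sketched very simply: refit the parameter in successive windows of the time series and look for drift. The model, data, and window scheme below are invented toys for illustration, not the paper's method or results.

```python
import random

def windowed_estimates(xs, ys, window):
    """Least-squares estimate of k in y = k * x, refitted in successive
    non-overlapping windows of the time series."""
    estimates = []
    for start in range(0, len(xs) - window + 1, window):
        xw = xs[start:start + window]
        yw = ys[start:start + window]
        k_hat = sum(x * y for x, y in zip(xw, yw)) / sum(x * x for x in xw)
        estimates.append(k_hat)
    return estimates

rng = random.Random(0)
n = 1000
xs = [rng.uniform(-1.0, 1.0) for _ in range(n)]
# The "true" system drifts: k ramps from 1.0 to 1.5, even though the model
# being validated assumes k is time-invariant.
ys = [(1.0 + 0.5 * t / n) * x + rng.gauss(0.0, 0.05)
      for t, x in enumerate(xs)]

ks = windowed_estimates(xs, ys, window=200)
print(ks)  # a clear upward drift flags a model-form discrepancy
```

A calibration over the whole record would average the drift away and report a single plausible k; the windowed view exposes the inherent flaw before the model is trusted for extrapolation.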
Speaker: Dr Peter Green became a lecturer in Uncertainty and Engineering at the University of Liverpool in 2015. His background is in structural dynamics, but he now develops uncertainty quantification and machine learning methods for engineering disciplines.
His current research sits between Big Data analytics, machine learning, and multiple engineering disciplines. Application projects span: data-based control strategies for additive manufacturing, machine-learnt rotorcraft dynamics models for deployment in flight simulators, robust optimisation of ship scheduling problems under uncertain weather conditions, characterising the risk of disproportionate collapse for cable-stayed bridges, and analysing the robustness of structures subjected to blasts. Fundamental research addresses decision-making from large datasets and the validation of models in situations where data is sparse.
Prof Scott Ferson
Validation and predictive capability of imperfect models with imprecise data
Abstract: Many sophisticated models in engineering today incorporate randomness or stochasticity and make predictions in the form of probability distributions or other structures that express predictive uncertainty. Validation of such models must contend with observations that are often sparse or imprecise, or both. The predictive capability of these models, which determines what we can reliably infer from them, is assessed by whether and how closely the model can be shown to yield predictions conforming with available empirical observations beyond those data used in the model calibration process. Interestingly, a validation match between the model and data can be easier to establish when the predictions or observations are uncertain, but the model’s predictive capability is degraded by either uncertainty. It is critical that measures used for validation and estimating predictive capability not confuse variability with lack of knowledge, but rather integrate these two kinds of uncertainties (sometimes denoted ‘aleatory’ and ‘epistemic’) in a way that leads to meaningful statements about the fit of the model to data and the reliability of predictions it generates.
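One concrete measure in this spirit is the area metric: the area between the model's predictive distribution function and the empirical distribution of the observations, which has the units of the quantity itself and does not vanish just because the data are sparse. The sketch below computes it exactly for two empirical samples; the data are invented for illustration.

```python
def ecdf(sample, x):
    """Empirical CDF of a sample evaluated at x."""
    return sum(1 for v in sample if v <= x) / len(sample)

def area_metric(predictions, observations):
    """Area between the two empirical CDFs: the integral of
    |F_pred(x) - F_obs(x)|, computed exactly on the step functions."""
    points = sorted(set(predictions) | set(observations))
    area = 0.0
    for left, right in zip(points, points[1:]):
        area += abs(ecdf(predictions, left)
                    - ecdf(observations, left)) * (right - left)
    return area

# Two samples whose distributions differ by a shift of 1.0:
print(area_metric([0.0, 1.0], [1.0, 2.0]))  # 1.0
```

Because the metric integrates the mismatch between whole distributions, a model whose predictions merely widen (more epistemic uncertainty) is distinguishable from one whose predictions are simply wrong, which is exactly the variability-versus-ignorance distinction the abstract calls for.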
Speaker: Scott Ferson is director of the Institute for Risk and Uncertainty at the University of Liverpool in the UK. His research focuses on statistical tools when empirical information is very sparse or imprecise. This talk is a reprise of an address at the ASME 2018 Verification and Validation Symposium held two weeks prior in Minneapolis.
Power spectrum estimation of stochastic processes from bounded and gappy sensor data
Abstract: Sensors used to capture time-history data will never provide perfect digital reconstructions of the processes they originally recorded. At best, a sensor will have an ideal working tolerance and defined accuracy bounds; at worst, it will fail, leaving gaps in the data. When estimating power spectra from these data, it is important to consider the effect that such uncertainties could have on the output model. In this talk, some common missing-data reconstruction techniques and their shortfalls will be presented in the context of power spectrum estimation, as well as methods to quantify power spectrum uncertainties under incomplete data.
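A tiny numerical illustration of the problem: below, a periodogram is computed for a clean sinusoid and for the same record with a sensor dropout naively zero-filled. The dominant frequency survives, but the estimated power at that frequency is biased low and leakage appears elsewhere. This is a generic demonstration, not one of the talk's methods.

```python
import cmath
import math

def periodogram(x):
    """Periodogram via a direct DFT (fine for short illustrative signals)."""
    n = len(x)
    spectrum = []
    for k in range(n // 2):
        coeff = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
        spectrum.append(abs(coeff) ** 2 / n)
    return spectrum

n = 64
signal = [math.sin(2 * math.pi * 5 * t / n) for t in range(n)]

# Simulate a failed sensor: a gap of missing samples, naively zero-filled.
gappy = list(signal)
for t in range(20, 30):
    gappy[t] = 0.0

full_spec = periodogram(signal)
gappy_spec = periodogram(gappy)
peak = max(range(len(gappy_spec)), key=gappy_spec.__getitem__)
print(peak)                          # dominant frequency still at bin 5
print(full_spec[5], gappy_spec[5])   # but its power is biased low by the gap
```

Quantifying how far such estimates can stray, given bounds on the missing samples, is precisely the kind of uncertainty the talk addresses.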
Speaker: Liam Comerford graduated with a Bachelor in Aerospace Engineering from the University of Liverpool in 2009. He received his PhD in 2015 from the Institute for Risk and Uncertainty at the University of Liverpool. He then began his academic career as a Research Associate at Leibniz University Hannover, Germany, within the Institute for Risk and Reliability. In May 2018 he returned to the UK to work in software development in the private sector. He currently maintains academic links through two European-funded research projects in the areas of Stochastic Process Simulation and Compressive Sensing.
Time to join the blockchain: Introduction to blockchain technology and its applications
Abstract: Blockchain has been described as the most important invention of the millennium so far. Made famous by the explosion of the cryptocurrency Bitcoin, blockchain facilitates decentralisation of all manner of data, communication, and other transactions. Blockchain is not just about currency. Blockchain provides an infrastructure for communication between parties without needing to trust one's counterpart in the interaction. It is already disrupting many industries, from traditional banks developing blockchain technology to handle correspondent transactions, to healthcare providers securely and quickly sharing patient medical records. The talk will include a discussion of other potential applications of the technology in our research.
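The tamper-evidence that makes this possible fits in a few lines: each block commits to the hash of its predecessor, so altering history breaks every later link. This is a stripped-down sketch (no proof-of-work, no network, no consensus), with invented names, just to show the data structure.

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents (order-stable JSON) with SHA-256."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, data):
    """Append a block that commits to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": prev})

def chain_is_valid(chain):
    """Any tampering with an earlier block breaks every later link."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
add_block(chain, "genesis")
add_block(chain, {"from": "alice", "to": "bob", "amount": 5})
add_block(chain, {"from": "bob", "to": "carol", "amount": 2})
print(chain_is_valid(chain))   # True

chain[1]["data"]["amount"] = 500   # tamper with history...
print(chain_is_valid(chain))       # False: the links no longer match
```

It is this property, checkable by anyone holding a copy of the chain, that removes the need to trust one's counterpart.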
Speaker: Dominic Calleja is a PhD student working on uncertainty quantification of plasma-facing components in future nuclear fusion reactors.
Dr Pranay Seshadri
Dimension reduction investigations in turbomachinery: Opportunities for uncertainty quantification & design under uncertainty
Abstract: Blades in modern jet engines are parameterized by 20-300 design variables. It is impossible to visualize, and more importantly, to effectively explore such vast design spaces; such an exploration may be required not solely for optimization and uncertainty quantification studies, but more importantly for understanding the physics that underpins key design characteristics. For instance, the designs associated with high efficiency are likely to be different from those with high flow capacity and high pressure ratios. These salient concerns motivate an output-based (i.e., efficiency, flow capacity, or pressure ratio) dimension reduction methodology.
In this talk, a new algorithm for subspace-based dimension reduction is introduced. It combines manifold optimization with classical Gaussian process regression. We draw parallels between our technique, active subspaces, and the more general ridge functions. We test our approach on problems of 2, 25, 145 and 200 design variables. Then, we demonstrate how the computed dimension-reducing subspace can be used for optimization, uncertainty quantification, and, more importantly, for gaining physical insight into turbomachinery fluid mechanics.
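The ridge-function idea behind such subspace-based dimension reduction can be sketched in one dimension: if the output really satisfies f(x) = g(wᵀx), every gradient of f is parallel to w, so averaging (finite-difference) gradients over the design space recovers the important direction. The toy below is an invented illustration of that principle, not the talk's algorithm.

```python
import math
import random

def fd_gradient(f, x, h=1e-6):
    """Central finite-difference gradient of f at x."""
    grad = []
    for i in range(len(x)):
        xp = list(x); xp[i] += h
        xm = list(x); xm[i] -= h
        grad.append((f(xp) - f(xm)) / (2 * h))
    return grad

def dominant_direction(f, dim, n_samples=200, seed=0):
    """Average gradients over random design points, then normalise."""
    rng = random.Random(seed)
    total = [0.0] * dim
    for _ in range(n_samples):
        x = [rng.uniform(-1.0, 1.0) for _ in range(dim)]
        for i, g in enumerate(fd_gradient(f, x)):
            total[i] += g
    norm = math.sqrt(sum(v * v for v in total))
    return [v / norm for v in total]

# A 10-variable "design space" that really depends on one direction w:
dim = 10
w = [1.0 / math.sqrt(dim)] * dim
f = lambda x: math.exp(sum(wi * xi for wi, xi in zip(w, x)))  # g(w.x) = exp

w_hat = dominant_direction(f, dim)
alignment = abs(sum(a * b for a, b in zip(w, w_hat)))
print(alignment)  # close to 1: the one-dimensional subspace is recovered
```

With the subspace in hand, the 10-dimensional response collapses to a one-dimensional plot of output against wᵀx, which is the kind of physical insight the talk targets at the scale of hundreds of blade design variables.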
Dr Donald Dyer
The Philosophy of Risk
Abstract: This presentation targets teaching/training personnel who have to explain risk perceptions to risk novices through multiple views of uncertainty. Because judgment is heavily influenced by perception, effective communication of uncertainty and risk requires a clearer understanding of their philosophical underpinnings. This presentation focuses on understanding the nature of aleatory, epistemic and ontological uncertainty and their impact on continuums of judgment.
Speaker: Dr Dyer has over twenty years of experience in project management spanning the financial services, education, agriculture and energy sectors, focused on strategic information and communication technology (ICT) and organizational transformation initiatives. He has held strategic ICT roles on Wall Street, having worked for both Goldman Sachs and Citibank on their fixed-income and equities trading floors, and has served as a senior project manager at the Inter-American Development Bank on monitoring and evaluation of enterprise ICT projects in education. He holds a doctorate in Business Administration from Grenoble École de Management in France, an MBA in Project Management from Henley Graduate School of Business at the University of Reading, and an undergraduate degree in Business Administration from Wilfrid Laurier University in Canada.
The impact of water demand on the thermal power and water stations in the Kingdom of Bahrain
Abstract: The demand for power and water in the Kingdom of Bahrain is increasing with the rapid development and population growth that the country has experienced over the last few years. To supply this increasing demand (and to ensure power and water adequacy), plans have been made to expand the existing power and water facilities by building new power stations. However, critical studies are needed to investigate the size of the expansion required and what type of energy technologies or resources can be introduced, considering the decline in the country's natural gas resources.
Power and water demand in the Kingdom of Bahrain vary seasonally, with summer increases of 114% and 9%, respectively. Modeling the power system together with water production therefore requires an understanding of the operational constraints of the steam turbines, whose output is reduced as heat is extracted and diverted to the thermal desalination units. Because water consumption is seasonally stable compared with the substantial seasonal variation in power demand, significant stress is placed on the operation of the power system in general, forcing the system into part-load operation during the low power demand season.
Using uncertainty quantification techniques to complement numerical simulations
Abstract: Uncertainty, in terms of inlet conditions or data measurements, almost always exists in real physical systems.
The impact of these uncertainties is sometimes so great that it becomes almost impossible to quantify them explicitly; the best one can then do is to put bounds on them in terms of their retrospective output sensitivity levels. In numerical simulations, on the other hand, the deterministic input data somewhat ensures that the uncertainties are limited. However, discretization, modelling and other numerical errors still contribute to a combined uncertainty level in any simulation. Thus, one cannot rely upon a single deterministic simulation, especially since the input data is subject to intrinsic variability itself, the effects of which are generally not fully accounted for. In short, to perform reliable numerical simulations one has to account for the combined uncertainty effects. This is known as uncertainty quantification (UQ) in computational fluid dynamics (CFD). With UQ tools, statistical information can be derived from deterministic numerical simulations, in turn allowing one to establish the stochastic responses of uncertainties. In the current presentation a few cases will be discussed, showing how the generalized Polynomial Chaos (gPC) technique can be used not only to create a response surface for a range of input parameters, but also how, in general, one can use such techniques to complement CFD simulations.
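A minimal sketch of building a gPC response surface for a single uniform input: project the simulator output onto Legendre polynomials (the orthogonal basis matched to a uniform distribution) using Gauss-Legendre quadrature, then evaluate the resulting surrogate anywhere. The "simulation" below is just an analytic stand-in for an expensive CFD run; everything here is a generic illustration, not the presenter's cases.

```python
import math

# Legendre polynomials P_0..P_2, orthogonal for a uniform input on [-1, 1]
legendre = [lambda x: 1.0,
            lambda x: x,
            lambda x: 0.5 * (3.0 * x * x - 1.0)]

# 3-point Gauss-Legendre nodes and weights on [-1, 1]
nodes = [-math.sqrt(3.0 / 5.0), 0.0, math.sqrt(3.0 / 5.0)]
weights = [5.0 / 9.0, 8.0 / 9.0, 5.0 / 9.0]

def gpc_coefficients(simulate, order=2):
    """Project the simulator output onto the Legendre basis:
    c_k = E[f(X) P_k(X)] / E[P_k(X)^2], X ~ Uniform(-1, 1)."""
    coeffs = []
    for k in range(order + 1):
        num = sum(w * simulate(x) * legendre[k](x)
                  for x, w in zip(nodes, weights)) / 2.0
        den = 1.0 / (2.0 * k + 1.0)  # E[P_k^2] for Legendre polynomials
        coeffs.append(num / den)
    return coeffs

def surrogate(coeffs, x):
    """Evaluate the gPC response surface at a new input."""
    return sum(c * p(x) for c, p in zip(coeffs, legendre))

simulate = lambda x: x * x          # stand-in for an expensive CFD run
coeffs = gpc_coefficients(simulate)
print(coeffs[0])                    # c_0 is the output mean, here 1/3
print(surrogate(coeffs, 0.7))       # matches x^2 since f is quadratic
```

Only three "simulations" are needed here, and statistics come for free: the mean is c_0 and the variance is a weighted sum of squares of the remaining coefficients, which is precisely how gPC complements deterministic CFD runs.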
Speaker: Imran Afgan is a lecturer in Renewable Energy Systems / Thermal Hydraulics at the School of Mechanical, Aerospace and Civil Engineering (MACE), University of Manchester. He is a specialist in massively parallel computing dealing with Large Eddy Simulations (LES) and Direct Numerical Simulations (DNS). He has vast experience of CFD spanning the last 15 years, working as PI/Co-I on projects such as Engineering Sustainable Solar Energy and Thermocline Alternatives (ESSEnTiAl) funded by the British Council, Performance Assessment of Wave and Tidal Array System (PerAWAT) funded by DNV-GL, and Extreme Loading of Marine Energy Devices due to Waves, Current, Flotsam and Mammal Impact (X-Med) funded by EPSRC. Between 2011 and 2012 he worked as the lead researcher on Reliable Data Acquisition Platform for Tidal (ReDAPT), funded and commissioned by the Energy Technology Institute (ETI), and on Computations for Advanced Reactor Engineering (CARE), funded by EPSRC. Before joining Manchester, he worked from 2009 to 2011 at Électricité de France (EDF) and Université Pierre et Marie Curie (UPMC Paris 6) on High Performance Computing (HPC) and uncertainty quantification in the nuclear industry. He has also been a co-researcher on several EU-funded projects, such as Detached Eddy Simulation for Industrial Aerodynamics (DESider) and Helicopter Noise and Vibration Reduction (HELINOVI).