## Previous seminars - 2017/18 academic year

**Wednesday 17 January: Acoustic-gravity waves, theory and applications**

- Speaker: Usama Kadri, Cardiff

**Abstract:** Acoustic–gravity waves (AGWs) are compression-type waves generated in response to a sudden change in the water pressure, e.g. due to nonlinear interaction of surface waves, submarine earthquakes, landslides, falling meteorites and objects impacting the sea surface. AGWs can travel at near the speed of sound in water (ca. 1500 m/s), but can also penetrate the sea floor, which amplifies their speed and makes them excellent precursors. Acoustic–gravity waves form an emerging field that is rapidly gaining popularity in the scientific community, finding broad utility in physical oceanography, marine biology, geophysics, water engineering, and quantum analogues. This talk was an overview of AGWs, with emphasis on recent developments, current challenges, and future directions.

**Wednesday 31 January: The Dawn of FIMP Dark Matter**

- Speaker: Tommi Tenkanen, QMUL

**Abstract:** Tommi presented an overview of scenarios where the observed Dark Matter (DM) abundance consists of Feebly Interacting Massive Particles (FIMPs), produced non-thermally by the so-called “freeze-in” mechanism. In contrast to the usual freeze-out scenario, frozen-in FIMP DM interacts very weakly with particles in the visible sector and never attained thermal equilibrium with them in the early Universe. This makes frozen-in DM very difficult, but not impossible, to test. Tommi presented the freeze-in mechanism and its variations previously considered in the literature, compared them to the standard DM freeze-out scenario, discussed several aspects of model building, and paid particular attention to observational properties of such feebly interacting DM.

**Wednesday 21 February: Smoothed Particle Hydrodynamics Modelling in Free Surface Single Phase Flows**

- Speaker: Ruaa Wana, Plymouth

**Abstract:** Smoothed Particle Hydrodynamics (SPH) is a meshfree, Lagrangian particle method. It is particularly well suited to simulating flow problems that have large deformations or contain free surfaces. This talk discussed SPH single-phase simulations of a dam break flow and a tsunami wave generated by a fault rupture.
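To give a feel for the core SPH idea, here is a minimal sketch of its density estimate: each particle carries a mass, and field values are recovered by summing over neighbours weighted by a smoothing kernel. The cubic spline kernel, spacing and smoothing length below are common but illustrative choices, not details from the talk.

```python
import numpy as np

def cubic_spline_w(r, h):
    """1D cubic spline smoothing kernel, normalised to integrate to 1."""
    q = np.abs(r) / h
    sigma = 2.0 / (3.0 * h)
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
                 np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return sigma * w

def sph_density(x, m, h):
    """SPH density estimate: rho_i = sum_j m_j W(x_i - x_j, h)."""
    dx = x[:, None] - x[None, :]
    return (m * cubic_spline_w(dx, h)).sum(axis=1)

# Uniformly spaced particles: the estimate should recover rho = m / spacing
# in the interior (values near the ends dip, the usual boundary deficiency).
spacing, m = 0.1, 0.05
x = np.arange(0.0, 10.0, spacing)
rho = sph_density(x, m, 1.2 * spacing)
print(rho[len(x) // 2])  # interior value, close to m/spacing = 0.5
```

The same kernel-weighted sums, applied to momentum and energy, turn the Navier-Stokes equations into ordinary differential equations for the particles, which is what makes large deformations and free surfaces straightforward to track.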

**Wednesday 21 February: Recent results from LHCb**

- Speaker: Jonas Rademacker, Bristol

**Abstract:** The LHCb experiment at CERN has collected an unprecedented data set in beauty and charm decays. The resulting precision measurements are sensitive to physics at mass scales far beyond the “energy frontier”, i.e. the highest collision energies achievable in colliders. Recent results indicate that these data are beginning to challenge the Standard Model (SM) of particle physics. This seminar summarised recent LHCb results, with focus on precision measurements of Charge-Parity violation and rare decays.

**Wednesday 28 February: Charge Propagation**

- Speaker: David McMullan, Plymouth

**Abstract:** Charge propagation is surprisingly difficult to describe in quantum field theory. Even in the abelian theory, where the spectre of confinement is avoided, the persistent interaction of the electron with its environment leads to deep theoretical problems. There is, though, an exactly solvable model of an electron in a plane wave background (the so-called Volkov solution) that provides an attractive arena for testing ideas related to the more general propagation problem. David reviewed some of the recent results we have derived concerning charges propagating in an elliptical class of backgrounds, and outlined how to incorporate loop corrections. Some of the lessons learnt from this were discussed in the context of the more general problems that arise when charges propagate.

**Thursday 8 March: Rigidity and global rigidity of graphs**

- Speaker: Bernd Schulze, Lancaster

**Abstract:** Rigidity theory is concerned with the rigidity and flexibility analysis of bar-joint frameworks and related constraint systems of geometric objects. This area has a rich history which can be traced back to classical work of Euler, Cauchy and Maxwell on the rigidity of polyhedra and skeletal frames. Since Laman’s celebrated result from 1970 (which provided the first combinatorial characterisation of generic rigid bar-joint frameworks in the plane), rigidity theory has received steadily increasing attention and it is now a highly diverse and thriving research area with many practical applications. Bernd gave an introduction to rigidity theory, concentrating on results and problems for bar-joint frameworks, but also describing how these have been extended to some other types of frameworks. Moreover, Bernd summarised some recent progress in the rigidity analysis of symmetric frameworks.
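Laman's 1970 characterisation mentioned above is purely combinatorial, so it can be checked directly: a graph on n vertices is generically minimally rigid in the plane iff it has exactly 2n - 3 edges and no subgraph on n' vertices spans more than 2n' - 3 edges. A brute-force sketch (fine for small graphs; efficient algorithms exist but are not attempted here):

```python
from itertools import combinations

def is_laman(n, edges):
    """Check Laman's counting condition for generic minimal rigidity in the
    plane: |E| = 2|V| - 3, and every vertex subset of size k >= 2 spans at
    most 2k - 3 edges. Exponential brute force, only for small graphs."""
    if len(edges) != 2 * n - 3:
        return False
    for k in range(2, n + 1):
        for sub in combinations(range(n), k):
            s = set(sub)
            spanned = sum(1 for u, v in edges if u in s and v in s)
            if spanned > 2 * k - 3:
                return False
    return True

triangle = [(0, 1), (1, 2), (0, 2)]
k4 = list(combinations(range(4), 2))
print(is_laman(3, triangle))  # True: the triangle is minimally rigid
print(is_laman(4, k4))        # False: K4 has one edge too many (6 > 5)
```

The subgraph condition rules out graphs that hide redundant edges in one region while leaving another flexible, which is exactly the obstruction the counting argument must detect.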

**Tuesday 20 March: Implementation of a Lattice Boltzmann Method for Multiphase Flows with High Density and Viscosity Ratios**

- Speaker: Norjan Jumaa, Plymouth

**Abstract:** We presented a Lattice Boltzmann Method (LBM) for multiphase flows with high viscosity and density ratios. Following Banari et al. (2014), the motion of the interface between fluids is modelled by solving the Cahn-Hilliard (CH) equation with LBM. Incompressibility of the velocity fields in each phase is imposed by using a pressure correction scheme. We used a unified LBM approach with separate formulations for the phase field, the pressureless Navier-Stokes (NS) equations and the pressure Poisson equation required for correction of the velocity field. The implementation has been verified for various test cases. Here, we presented results for some complex flow problems including two-dimensional single and multiple mode Rayleigh-Taylor Instability (RTI). Also, we presented the evolution of the height of a standing wave for both high and low viscosity and density ratios.
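For readers unfamiliar with LBM, the following single-phase D1Q3 sketch shows the collide-and-stream structure that all lattice Boltzmann schemes share; the multiphase Cahn-Hilliard/Navier-Stokes formulation in the talk is considerably richer. Parameters and initial condition are illustrative.

```python
import numpy as np

# Minimal D1Q3 BGK lattice Boltzmann scheme for pure diffusion of a scalar.
nx, tau, steps = 200, 0.8, 500
w = np.array([4 / 6, 1 / 6, 1 / 6])   # lattice weights
c = np.array([0, 1, -1])              # lattice velocities

rho = np.ones(nx)
rho[95:105] = 2.0                     # initial pulse on a uniform background
f = w[:, None] * rho[None, :]         # start from the equilibrium distribution

for _ in range(steps):
    rho = f.sum(axis=0)               # zeroth moment: the conserved density
    feq = w[:, None] * rho[None, :]   # equilibrium distribution
    f += (feq - f) / tau              # BGK collision (single relaxation time)
    for i in range(3):
        f[i] = np.roll(f[i], c[i])    # streaming with periodic boundaries

rho = f.sum(axis=0)
print(rho.sum())                      # collide-and-stream conserves total mass
```

Collision relaxes the distributions toward local equilibrium without changing the density, and streaming just shifts them along the lattice, so mass is conserved exactly while the pulse diffuses away.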

**Wednesday 21 March: Multi-state models for observed and latent cognitive function in the older population**

- Speaker: Ardo van den Hout, Department of Statistical Science, University College London

**Abstract:** Due to the ageing population there is a growing interest in the statistical modelling of cognitive function in old age. When analysing longitudinal data on ageing, loss to follow-up because of death cannot be ignored. One option is to model survival and change of cognitive function jointly by specifying submodels for the two processes and linking these models by individual-specific random effects. Another option – and the topic of this seminar – is to use a continuous-time multi-state survival model where a series of living states is defined by the level of cognitive function and an additional dead state is included. This multi-state approach is quite general and can be used in many other applications in biostatistics, social statistics, and demography.

The seminar started with an introduction to the continuous-time multi-state survival model by discussing model specification and maximum likelihood estimation. The second part presented an extension of current methods: a hidden Markov model for modelling bivariate cognitive function. The methods were illustrated by using longitudinal data from a UK survey of the older population.
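The multi-state setup above can be sketched in a few lines: the model is specified through a matrix Q of transition intensities, and the transition probabilities over an interval t are P(t) = exp(Qt). The three states and intensity values below are illustrative, not estimates from the survey data discussed in the talk.

```python
import numpy as np

# 3-state continuous-time survival model:
# state 0 = healthy, state 1 = cognitively impaired, state 2 = dead.
# Rows of Q sum to zero; off-diagonals are transition intensities.
Q = np.array([[-0.25,  0.15, 0.10],
              [ 0.00, -0.30, 0.30],   # no recovery in this simple version
              [ 0.00,  0.00, 0.00]])  # dead state is absorbing

def transition_probs(Q, t, n=10000):
    """P(t) = expm(Q t), approximated by (I + Q t/n)^n for simplicity."""
    step = np.eye(Q.shape[0]) + Q * (t / n)
    return np.linalg.matrix_power(step, n)

P5 = transition_probs(Q, 5.0)
print(P5[0])        # state distribution after 5 years, starting healthy
print(P5[0].sum())  # each row of a transition matrix sums to 1
```

Maximum likelihood estimation then multiplies such P(t) entries along each individual's observed state sequence, with the gaps between assessments entering directly as the t values, which is why unequally spaced follow-ups pose no difficulty.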

**Wednesday 21 March: Accuracy and Stability of Virtual Source Method for Numerical Simulations of Nonlinear Water Waves**

- Speaker: Omar Al-Tameemi, Plymouth

**Abstract:** The Virtual Source Method (VSM) is based upon the integral equations derived by using Green’s identity with Laplace’s equation for the velocity potential. The velocity potential within the fluid domain is completely determined by the potential on a virtual boundary located above the fluid, avoiding the need to evaluate the singular integrals normally associated with integral equation methods. This talk presented numerical simulations of non-linear standing waves and sloshing problems using VSM. We discussed stability and convergence of the method, as well as global energy and volume conservation.

**Wednesday 21 March: Simulating dipole wall collisions with slip boundaries by using moment-based boundary conditions for lattice Boltzmann method**

- Speaker: Seemaa Mohammed, Plymouth

**Abstract:** The accuracy of moment-based boundary conditions for no-slip and slip walls in the lattice Boltzmann method is examined numerically using dipole-wall collisions, for both normal and oblique incidence. Hydrodynamic moments, rather than the distribution functions, are imposed at the boundary. For accurate results, a two-relaxation-time (TRT) model is used with the moment boundary conditions. Excellent agreement with benchmark data is obtained, and the results converge with second-order accuracy.

**Wednesday 25 April: Integrability of Relativistic Dynamical Systems**

- Speaker: Tom Heinzl, Plymouth

**Abstract:** Tom gave a brief overview of our recent studies of relativistic dynamical systems. These are described by the Newton-Einstein-Lorentz equation generalising F = ma in the presence of external (electromagnetic) forces. Tom explained how space-time and conformal symmetries may be employed to find conserved quantities that lead to integrability. In quite a few cases, the number of these quantities is surprisingly large implying super-integrability. This feature is somewhat ‘exotic’ due to its rare occurrence. For instance, by Bertrand’s theorem, in classical mechanics it only holds for the Kepler problem (with its Laplace-Runge-Lenz vector) and the harmonic oscillator. Using our generalised setting we have been able to extend this list significantly.
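For orientation, the super-integrability of the Kepler problem mentioned above can be made concrete; its conserved quantities are the standard ones:

```latex
% Conserved quantities of the Kepler problem, V(r) = -k/r:
E = \frac{\mathbf{p}^{2}}{2m} - \frac{k}{r}, \qquad
\mathbf{L} = \mathbf{r} \times \mathbf{p}, \qquad
\mathbf{A} = \mathbf{p} \times \mathbf{L} - m k\, \hat{\mathbf{r}} .
% Energy, angular momentum and the Laplace-Runge-Lenz vector A together
% supply 2n - 1 = 5 functionally independent constants of motion for
% n = 3 degrees of freedom: maximal super-integrability.
```

Having 2n - 1 independent constants (rather than the n required for ordinary Liouville integrability) is what closes every bounded orbit, and it is this count that the relativistic backgrounds in the talk were shown to reproduce.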

**Wednesday 25 April: Double copy on plane wave backgrounds**

- Speaker: Tim Adamo, Imperial

**Abstract:** Double copy is a method for constructing scattering amplitudes in gravity perturbatively from scattering amplitudes in gauge theory, and even has a catchy slogan: Gravity = (Gauge theory)^2. Despite its many uses, the status of double copy as a fundamental feature of perturbative field theories is poorly understood. In this talk Tim described how the robustness of double copy can be tested by considering the simplest scattering amplitudes on plane wave backgrounds. This makes use of several tools familiar from the study of QED in strong background fields. Despite various subtleties introduced by the non-trivial scattering background, it is clear that a notion of double copy does indeed exist.

**Wednesday 25 April: Trust in Numbers**

- Speaker: Professor Sir David Spiegelhalter, President of the Royal Statistical Society for 2017-18

**Abstract:** Those who value quantitative and scientific evidence are faced with claims both of a reproducibility crisis in scientific publication, and of a post-truth society abounding in fake news and alternative facts. Both issues are of vital importance to statisticians, and both are deeply concerned with trust in expertise. By considering the ‘pipelines’ through which scientific and political evidence is propagated, David considered possible ways of improving both the trustworthiness of the statistical evidence being communicated, and the ability of audiences to assess the quality and reliability of what they are being told. There were also cheap laughs at numerous examples of disastrous communication of statistics in the media.

**Wednesday 2 May: Guiding laser-produced fast electrons using super-strong magnetic fields**

- Speaker: Kate Lancaster, York

**Abstract:** Laser-plasma interactions can currently produce some of the most extreme conditions on Earth. When ultra-intense lasers are focused on to solid material, the fields associated with the laser are so strong that electrons can easily escape the atoms in the material. Absorption of the laser pulse results in the generation of a population of relativistic electrons, with currents on the order of mega-amps. The physics associated with how the electrons are produced and subsequently transported in plasma is complex and proves challenging to diagnose and study. Importantly, these fast electrons are the driver for much of the subsequent physics during these interactions, including generation of energetic particle/photon sources, unique atomic physics states such as hollow atoms, hydrodynamic phenomena, production of warm/hot dense matter relevant to stellar interiors, heating of matter relevant to alternative laser-driven fusion schemes such as fast ignition, and conditions relevant for understanding nuclear astrophysics in the most extreme objects in our universe.

This talk illustrated some of the experiments happening on petawatt-class lasers concerning how to control important fast electron beam parameters (such as divergence) using novel structured targets. Alex Robinson et al. first proposed using targets incorporating a resistivity gradient to confine fast electrons. At the material interface of a high-resistivity feature, e.g. a wire, surrounded by a lower-resistivity material, a strong magnetic field is generated which confines electrons to areas of higher resistivity and higher current density. In this talk, experiments using targets with novel silicon embedded features created by Scitech Precision Ltd using MEMS technology were presented. A novel dual-channel front surface imaging system was created to enable both pre-shot alignment and on-shot focal-spot positioning, information critical for performing these types of complex experiments.

**Wednesday 9 May: Searching for dark matter with the LUX and LUX-ZEPLIN direct detection experiments**

**Abstract:** For the past decade liquid xenon time-projection chambers (TPCs) hidden deep underground have led the race to make a first direct detection of dark matter here on Earth. Jim presented the final results from the recently completed Large Underground Xenon (LUX) experiment, as well as the status and physics reach of its successor, the LUX-ZEPLIN experiment, 40 times as massive, currently under construction and due to start taking data in 2020.

**Wednesday 16 May: Progress on the connection between spectral embedding and network models used by the probability, statistics and machine-learning communities**

- Speaker: Patrick Rubin-Delanchy, University of Bristol

**Abstract:** Patrick presented theoretical and methodological results, based on work spanning Johns Hopkins, the Heilbronn Institute for Mathematical Research, Imperial and Bristol, regarding the connection between various graph spectral methods and commonly used network models which are popular in the probability, statistics and machine-learning communities. An attractive feature of the results is that they lead to very simple take-home messages for network data analysis: a) when using spectral embedding, consider eigenvectors from both ends of the spectrum; b) when implementing spectral clustering, use Gaussian mixture models, not k-means; c) when interpreting spectral embedding, think of “mixtures of behaviour” rather than “distance”. Results were illustrated with cyber-security applications.
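A numpy-only toy run makes take-home message (a) concrete: adjacency spectral embedding of a two-block stochastic block model, selecting eigenvectors by eigenvalue *magnitude* so that both ends of the spectrum can contribute. The block model parameters are invented for illustration, and the clustering step is simplified to a sign split; the talk's actual recommendation, per message (b), is to fit a Gaussian mixture in the embedded space.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-block stochastic block model: 60 nodes per block, edge probability
# 0.5 within blocks and 0.1 between them (illustrative values).
n = 60
P = np.block([[np.full((n, n), 0.5), np.full((n, n), 0.1)],
              [np.full((n, n), 0.1), np.full((n, n), 0.5)]])
A = (rng.random((2 * n, 2 * n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T                                  # symmetric, no self-loops

# Adjacency spectral embedding: eigenvectors scaled by sqrt(|eigenvalue|),
# sorted by |eigenvalue| so both ends of the spectrum are candidates.
vals, vecs = np.linalg.eigh(A)
idx = np.argsort(np.abs(vals))[::-1][:2]
X = vecs[:, idx] * np.sqrt(np.abs(vals[idx]))

# The two blocks form two Gaussian-like clusters in the embedding; a sign
# split on the block-structure direction is enough for this toy example.
labels = (X[:, 1] > 0).astype(int)
print(np.bincount(labels[:n], minlength=2),
      np.bincount(labels[n:], minlength=2))
```

In the embedding the two blocks concentrate around two point masses with different spreads, which is exactly why a Gaussian mixture model (which estimates per-cluster covariances) is preferred over k-means (which assumes equal spherical clusters).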

**Wednesday 23 May: Flexible inference for continuous-time models of wildlife movement**

- Speaker: Paul Blackwell, University of Sheffield

**Abstract:** The majority of statistical models of animal movement are formulated in discrete time, modelling separately each 'step' from one location (e.g. GPS fix) to the next. This can make it difficult to deal with missing or unequally-spaced observations, to compare studies with different time scales, or to interpret results biologically. In reality, animals exist and move in continuous time, and Paul described some switching diffusion models that try to capture some of the complexities of real behaviour in continuous time. Computational cost is an increasingly important issue in fitting movement models, and Paul talked about some algorithms that allow exact inference for such models, even in the presence of spatial heterogeneity, borrowing ideas from Hidden Markov Models and Kalman Filtering. An increasingly important area of application is collective movement, where we model the locations of simultaneously-tracked animals as they interact; Paul discussed some recent developments in modelling and computation for this situation.

Some of this work is joint with Mu Niu (University of Plymouth) and a number of recent or current research students at the University of Sheffield.
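The advantage of continuous-time formulations for irregular fixes can be seen with an Ornstein-Uhlenbeck position process, a standard building block of such diffusion movement models: its transition distribution is known exactly for any time gap, so unequally spaced observations need no interpolation. Parameters below are illustrative, not fitted to any tracking data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Ornstein-Uhlenbeck movement model in 2D, simulated exactly at
# irregularly spaced observation times (e.g. patchy GPS fixes).
theta, sigma = 0.5, 1.0            # mean-reversion rate, noise scale
mu = np.array([0.0, 0.0])          # home-range centre

times = np.sort(rng.uniform(0, 50, size=200))   # unequally spaced fixes
x = np.zeros((len(times), 2))
x[0] = mu + rng.normal(size=2)
for k in range(1, len(times)):
    dt = times[k] - times[k - 1]
    decay = np.exp(-theta * dt)
    sd = np.sqrt(sigma**2 / (2 * theta) * (1 - decay**2))
    # Exact OU transition: no time-discretisation error, whatever the gap.
    x[k] = mu + (x[k - 1] - mu) * decay + sd * rng.normal(size=2)

print(x.std(axis=0))   # stationary sd is sigma / sqrt(2 theta) = 1 here
```

The switching models in the talk combine several such diffusions with a Markov process over behavioural states; the Gaussian transitions above are what lets Kalman-filter ideas deliver exact likelihoods.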

**Wednesday 23 May: Analytical estimates of proton acceleration in laser-produced turbulent plasma**

- Speaker: Konstantin Beyer, Oxford

**Abstract:** With the advent of high power laser facilities, new opportunities for scaled laboratory experiments of astrophysical processes have become available. Here we showed that experiments at the National Ignition Facility (NIF) laser have reached the condition where second-order Fermi acceleration can be directly investigated with the available diagnostics. This requires measuring the diffusion of 3 MeV protons produced within a turbulent plasma. Since Fermi acceleration is essentially a biased diffusion process, and using existing solutions, we showed that a significant broadening of the initial proton distribution is expected for those particles exiting the plasma.

**Thursday 31 May: Symplectic geometry of the moduli space of framed Higgs bundles**

- Speaker: Marina Logares, University of Plymouth

**Abstract:** Let X be a compact Riemann surface and D an effective divisor on X. The moduli space of D-twisted stable Higgs bundles on X is known to have a holomorphic Poisson structure which is in fact symplectic if and only if D is the zero divisor. We proved that this moduli space admits a natural enhancement to a holomorphic symplectic manifold. This is based on joint work with I Biswas and A Peón-Nieto.

**Wednesday 6 June: Research students using statistics mini-symposium**

**The influence of external peer reviewers on funding decisions in grant applications in NIHR**

- Speaker: Lexy Sorrell, University of Plymouth

**Abstract:** The National Institute for Health Research (NIHR) is a large, long-established national funder of health research, aiming to select applications for research funding which are of the highest quality and address important health issues, providing evidence for policy and practice. However, this selection process is time consuming and costly in terms of human resources: the NIHR staff, external reviewers and board members. This study is part of the wider ‘Push the Pace’ project aiming to reduce the time for research to get from ‘bench to bedside’, which is currently averaging ten years in the NIHR research pathway. We used the external reviewers’ scores along with reviewer and application characteristics to investigate the influence of external peer reviewers on the funding decisions made by the board, how many reviewers are needed to review an application, and the relative value of peer review scores from different kinds of reviewers.

**Sonification: The Aesthetics of Listening to Data**

- Speaker: Nuria Bonet Filella, University of Plymouth

**Abstract:** Sonification is ‘the use of non-speech audio to convey information’; in other words, the aural (as opposed to visual) display of data. It can be a superior approach for understanding data whose visual interpretation is difficult. Sonification has mainly been used for scientific purposes (for example Geiger counters, EEG monitors or heart rate monitors), but composers are showing increasing interest in the method.

Musical aesthetics are crucial to sonification. In order to be effective in transmitting data, a sonification must be clear, appropriate to the data, and aesthetically pleasing to the listener. Many examples are ‘typically unpleasant and lacking any natural connection to the data represented’. Scientists might not have the musical knowledge to create a good sonification; composers might not have the scientific knowledge to deal with the data musically. The production of sonifications has thus increased strongly in recent years, but their quality has not.

Nuria has approached sonification from a compositional point of view in order to establish an aesthetic framework for aural display. Through a portfolio of sonifications and theoretical work, Nuria has explored the practical aspects of listening to data, whether for scientists or composers.
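As a concrete (and deliberately crude) illustration of parameter-mapping sonification, the sketch below maps each data value to a pitch between two reference frequencies and renders a short sine tone per value; the frequency range and note length are arbitrary choices, and the aesthetic questions the talk addresses begin exactly where such naive mappings end.

```python
import numpy as np

def sonify(values, sr=44100, note_s=0.2, f_lo=220.0, f_hi=880.0):
    """Map each data value to a pitch in [f_lo, f_hi] and render sine tones."""
    v = np.asarray(values, dtype=float)
    norm = (v - v.min()) / (v.max() - v.min())   # scale data to [0, 1]
    freqs = f_lo * (f_hi / f_lo) ** norm         # geometric (perceptual) pitch map
    t = np.linspace(0, note_s, int(sr * note_s), endpoint=False)
    tones = [np.sin(2 * np.pi * f * t) for f in freqs]
    return np.concatenate(tones)                 # one tone per data point

audio = sonify([3.1, 4.7, 2.2, 5.9, 4.0])
print(audio.shape)   # 5 notes x 0.2 s at 44.1 kHz = (44100,)
```

The resulting array can be written to a WAV file or played directly; everything beyond this mapping (timbre, phrasing, dynamics) is where compositional judgement enters.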

**Friday 8 June: Thick subcategories of discrete derived categories**

- Speaker: Nathan Broomhead, University of Plymouth

**Abstract:** Nathan explained some work in which he describes the lattices of thick subcategories of discrete derived categories. This is done using certain generating collections of exceptional and sphere-like objects related to non-crossing configurations of arcs in a geometric model.

**Wednesday 13 June: Probabilistic Regression Analysis of Extreme Events in Energy Sectors with Either Massive or Small Data**

- Speaker: Keming Yu, Brunel University

**Abstract:** Buried pipelines and wind turbines are important devices for converting or transferring energy. Buried pipelines are vulnerable to the threat of corrosion; of interest is an estimate of the probability of when or where an affected pipeline is likely to fail from the extreme growth of a corrosion defect. Wind turbine monitoring uses acoustic emission signals to detect damage processes in the structure; peak signals are of particular concern to the industry, and of interest is an estimate of the probability of a signal exceeding a threshold. Many factors need to be taken into consideration when building a probabilistic model for these extreme events, but classical regression models such as logistic regression, whose response is a binary variable, seem inefficient for an observable continuous response. Furthermore, probit regression may face either massive data to process or only a small sample to work with, depending on the case. Whatever the case, estimation accuracy supported by sound statistical theory and computational algorithms is expected. This talk introduced a novel inference method for probit regression for extreme events, designed to cope with either massive streaming data or small samples, and showed that this objective may be achievable.
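To fix ideas, under a probit link the exceedance probability takes the form P(signal > threshold | x) = Φ(x′β), with Φ the standard normal CDF. The coefficients below are illustrative, not estimates from the pipeline or turbine data, and the talk's novel inference method is not reproduced here.

```python
import math

def std_normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def exceedance_prob(x, beta):
    """P(peak signal > threshold | covariates x) under a probit link."""
    z = sum(xi * bi for xi, bi in zip(x, beta))
    return std_normal_cdf(z)

beta = [-1.0, 0.8]                         # intercept and one covariate effect
print(exceedance_prob([1.0, 0.0], beta))   # Phi(-1.0), about 0.159
print(exceedance_prob([1.0, 2.5], beta))   # Phi( 1.0), about 0.841
```

The estimation challenge the talk addresses is fitting β accurately when the exceedances are rare, whether the covariate data arrive as a massive stream or as a small sample.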

**Friday 22 June: The history of inflation measurement**

- Speaker: Jeff Ralph (RSS William Guy Lecturer for 2017-2018), Office for National Statistics

**Abstract:** The monthly consumer price inflation figures from the Office for National Statistics are among the most influential of all Official Statistics. They are used for a wide variety of important purposes, from indicating the health of the economy to the adjustment of pensions and benefits. Behind the numbers sits a sophisticated methodology that has been developed over a long period of time. This talk looked at the development of these measures from their origins at the start of the 18th century to the current day. It identified some of the visionary individuals who contributed to establishing the foundations, the dates when important changes were made and the social and political factors that drove the developments.

**Monday 16 July: Xmeta: a comprehensive tool-box for advanced meta-analysis - meta-analysis, publication bias, and beyond**

- Speaker: Yong Chen, University of Pennsylvania

**Abstract:** Yong introduced a web-platform that they have been developing over the last few years, known as ‘xmeta’, which aims to facilitate advanced comprehensive meta-analysis for applied investigators within and beyond the Perelman School of Medicine at the University of Pennsylvania. The functionality of this platform includes implementation of various multivariate meta-analyses for continuous and/or binary outcomes in randomised controlled trials, meta-analysis of diagnostic tests with/without gold standard, as well as methods to identify and correct for publication biases or small study effects. Yong also introduced the online-analysis feature, which allows applied investigators to conduct these analyses without programming by themselves. Finally, Yong talked about the future direction of incorporating semi-automated text mining to speed up the systematic review process.

**Monday 23 July: Methodological advances in evidence synthesis**

- Speaker: Orestis Efthimiou, University of Bern

**Abstract:** Network meta-analysis (NMA) is an extension of the usual (pairwise) meta-analysis. It is a statistical tool for synthesizing evidence obtained from studies comparing multiple competing interventions for the same disease. In this lecture, we went through some recent advances in the field. First, we discussed a new model for the NMA of binary outcomes. This model generalises the well-known Mantel-Haenszel method, and can be especially valuable for the case of rare events, e.g. when synthesising data on mortality or serious adverse events. The method has been implemented in R in freely available, easy-to-use routines. Second, we discussed models for including non-randomised studies in NMA. Non-randomised studies can reveal whether or not interventions are effective in real-life clinical practice, and there is a growing interest in including such evidence in the decision-making process. Here we presented and compared an array of alternative methods, and applied some of them to previously published clinical examples. Finally, we discussed methods for individual participant data network meta-analysis (IPD-NMA). IPD are considered the gold standard in evidence synthesis, and inclusion of IPD in NMA offers unique advantages, such as increase in precision, decrease in heterogeneity, as well as the capacity to individualise the treatment according to a patient’s characteristics. We showcased our methods using an example from depression research.
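For readers unfamiliar with the Mantel-Haenszel method that the NMA model generalises, here is the classical fixed-effect pooled odds ratio across a handful of two-arm studies; the 2x2 counts below are made up for illustration.

```python
# Mantel-Haenszel pooled odds ratio across studies. Each tuple is a 2x2
# table (a, b, c, d): treatment events/non-events, control events/non-events.
# The counts are invented for illustration only.
studies = [
    (5, 95, 10, 90),
    (2, 48, 6, 44),
    (8, 192, 15, 185),
]

# OR_MH = sum_k(a_k d_k / n_k) / sum_k(b_k c_k / n_k), n_k = study size.
num = sum(a * d / (a + b + c + d) for a, b, c, d in studies)
den = sum(b * c / (a + b + c + d) for a, b, c, d in studies)
or_mh = num / den
print(round(or_mh, 3))   # pooled odds ratio; < 1 favours treatment here
```

Its appeal for rare events is that zero cells in individual studies do not require continuity corrections before pooling, a property the generalised NMA model in the talk aims to retain across a whole network of treatments.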

Note: This was a joint RSS South West Local Group and Exeter Health Statistics event.