## Upcoming lecture

The Teaching Statistics Trust Lecture 2019: The purpose of statistics is insight not numbers

• Speaker: Neil Sheldon
• Date/time: Wednesday 25 September, 16:00-17:30
• Venue: Room 101, Rolle Building

## Research seminars and events

The Centre for Mathematical Sciences research seminars and events are listed below.

The four main seminar series are in applied mathematics, pure mathematics, statistics and theoretical physics. Visit the centre's webpages for the latest seminar updates and information.

Wednesday 25 September | Room 101, Rolle Building (16:00-17:30)

The Teaching Statistics Trust Lecture 2019: The purpose of statistics is insight not numbers

• Speaker: Neil Sheldon

Abstract: In recent years, statistics teaching has seen a welcome move away from formulae and calculation. Especially with the rise of ‘big data’, numerical processing is increasingly being done with software, and it is becoming much more important for students to learn the art and science of interpretation. This development requires teachers to change focus too, shifting their emphasis from numbers to language.

As with many academic disciplines, statistics overlays everyday language with specialist meaning: one familiar example is the word ‘significant’ which means very different things in everyday use and in statistics. Research shows that parallel meanings such as this make it harder for students to understand technical concepts. Research also shows that teaching with a richer vocabulary can help to overcome this problem of understanding.
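The gap between the everyday and technical senses of 'significant' can be made concrete with a toy calculation (an illustration of the general point, not an example from the lecture): with a large enough sample, a practically negligible difference becomes statistically significant.

```python
import math

def z_test_pvalue(mean_diff, sd, n):
    """Two-sided p-value for a mean difference (normal approximation)."""
    z = mean_diff / (sd / math.sqrt(n))
    # Phi(z) via the error function; p = 2 * (1 - Phi(|z|))
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

# A 0.2-mark difference on a test with sd = 10 marks is practically
# negligible (an effect size of 0.02 sd), yet with a million students
# it is "significant" in the statistical sense:
p_large = z_test_pvalue(mean_diff=0.2, sd=10.0, n=1_000_000)
p_small = z_test_pvalue(mean_diff=0.2, sd=10.0, n=25)
print(f"n = 1,000,000: p = {p_large:.3g} (statistically 'significant')")
print(f"n = 25:        p = {p_small:.3g} (not 'significant')")
```

The same difference is "significant" or not depending only on the sample size, which is exactly the kind of parallel meaning that confuses students.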

But statistics is more than just an academic discipline, it is a vital element of citizenship: we all need statistical understanding to make sense of the world around us. Yet statistical data are routinely misunderstood and misinterpreted in the media. In most cases the errors arise, not from the numbers themselves, but from the confused and inaccurate language used to comment on them. Clear language is essential to clear thought.

This lecture, drawing on numerous practical examples, will explore the ways in which careful use of language can help everyone – teachers, students and citizens – to understand statistics better, whether in formulating enquiries, interpreting data, or reaching trustworthy conclusions and communicating them effectively.

Neil Sheldon was a teacher for more than 40 years. He is a Chartered Statistician and a former Vice-President of the Royal Statistical Society. He was the RSS Guy Lecturer in 2007-8 and he is currently Chair of the Teaching Statistics Trust. Neil’s other academic interests include philosophy and linguistics.

The Teaching Statistics Trust Lecture is given annually at multiple locations. It is aimed at teachers of statistics, whether specialist or non-specialist, in secondary schools, colleges and the early years of university.

Contact yinghui.wei@plymouth.ac.uk for any queries.

Wednesday 30 October | Stonehouse Lecture Theatre, Portland Square Building

Pigeon-holes and mustard seeds: Growing capacity to use data for society

• Speaker: Professor Deborah Ashby, Imperial College London

The Royal Statistical Society was founded to address social problems ‘through the collection and classification of facts’, leading to many developments in the collection of data, the development of methods for analysing them, and the development of statistics as a profession. Nearly 200 years later an explosion in computational power has led, in turn, to an explosion in data. We outline the challenges and the actions needed to exploit those data for the public good, and to address the step change in statistical skills and capacity development necessary to enable our vision of a world where data are at the heart of understanding and decision-making.

Contact yinghui.wei@plymouth.ac.uk for any queries.


## Previous seminars - 2018/19 academic year

Tuesday 18 September (Teaching Statistics Trust Lecture 2018): Statistical Problem Solving: the Art and Science of Learning and Teaching from Data

• Speaker: Christine Franklin, School Statistics Ambassador, American Statistical Association

Abstract: After nearly 40 years as a statistics educator, Christine often reflects on her professional experience of learning and teaching statistics – remembering the past and feeling guilt about how poorly she must have taught her students in those first years, trying to stay current with constantly changing pedagogy and assessment in the present, and making predictions about the future. How often do you reflect on your experience as a statistics teacher? Christine often reflects on what a great feeling it is to start each day knowing we work with students and colleagues, aiming to see those light-bulb moments of understanding the usefulness of data and statistical reasoning skills, but also the importance of being a healthy sceptic about the interpretation of the small and big data we are so often presented with.

What are the lessons we have learned that will positively impact the data and statistical literacy of our students in the future? Christine has been fortunate to collaborate with amazing colleagues through the years who have enlightened and inspired her to learn these lessons to continue the journey for advocating data and statistical literacy in our society.

In this presentation she discussed the art and science of learning and teaching from data (drawing on her experience of writing four editions of Statistics: The Art and Science of Learning from Data and other resources specifically written for teachers) to help improve the teaching of statistical problem solving and data literacy at the school level.

Wednesday 19 September: Dispersion relation for equatorially-trapped internal water waves

• Speaker: Mateusz Kluczek (University College Cork)

Abstract: Mateusz presented a recently derived exact and explicit solution for the governing equations of geophysical water waves. Assuming no background in fluid mechanics, Mateusz introduced basic ideas behind internal water waves and explained some of the physical complexities of the model. Using numerical simulations, videos were presented of internal waves generated by the oscillation of interfaces inside the fluid body. Finally the dispersion relation was discussed, which is a mathematically elegant formula encoding rich physical information about the flow, particularly the wave speed, in terms of fixed physical parameters.

Wednesday 19 September: Branes on the singular locus of the Hitchin system via Borel and other parabolic subgroups

• Speaker: E Franco (CMUP)

Abstract: The moduli space of Higgs bundles has an extremely rich geometry: it is a hyperKaehler variety that fibres over a vector space, forming an integrable system named the Hitchin system. Its importance in theoretical physics comes from the fact that the dimensional reduction of an N=4 Super Yang-Mills gauge theory can be rewritten as a 2-dimensional sigma-model with the Hitchin system as a target. In this context, Kapustin and Witten reinterpreted the classical limit of S-duality of the original SYM gauge theory as mirror symmetry in the target (the Hitchin system). They also introduced the appropriate notion of branes in the Hitchin system respecting the hyperKaehler structure. In this talk the behaviour under mirror symmetry of a family of branes living on the singular locus of the Hitchin system was studied. Their geometry and the role of the Borel subgroup were also described. The picture can be generalised to other parabolic subgroups.

Wednesday 26 September: Flowing to minimal surfaces

• Speaker: Melanie Rupflin (Oxford)

Abstract: Melanie discussed the construction and properties of a geometric flow, the Teichmueller harmonic map flow, that is designed to change surfaces into minimal surfaces. She explained how this flow, defined as a natural gradient flow of the Dirichlet energy, succeeds in decomposing any closed surface in any compact target manifold into minimal surfaces. This is joint work with Peter Topping.

Wednesday 3 October: Evolution of magnetic field structures in non-linear MHD dynamos

• Speaker: Daniel Miller (Exeter)

Abstract: ABC flows are an exact solution to the MHD equation of motion. As such they provide an ideal testing ground for examining changes to the magnetic field during the saturation process. Daniel compared how the saturation process affects magnetic field structures in dynamos with and without stagnation points in their forcing.

Wednesday 3 October: Gravity with more or less gauging

• Speaker: Steffen Gielen (Nottingham)

Abstract: General Relativity is conventionally formulated as a theory with gauge invariance under the diffeomorphism group of general coordinate transformations, but there are locally equivalent formulations in terms of either a larger (additional local conformal invariance) or smaller (only “special” diffeomorphisms) group of symmetries. Other formulations with the same number of gauge generators, but a different gauge algebra, also exist. We discussed how one can relate these different formulations to each other, and illustrated various applications in which one may prefer one or another formalism. (The talk was mostly based on arXiv:1805.11626.)

Wednesday 17 October: Mini-symposium in medical statistics

Practical issues in medical statistics

• Speaker: Gavin Koh (GSK)

Abstract: Statistical methods provide a variety of options for designing a clinical trial: for example, superiority, equivalence and non-inferiority. Selecting a design depends not just on statistical considerations, but also on sometimes contradictory clinical, regulatory and ethical considerations. After completing a study, secondary multivariate analyses are often performed and further choices need to be made: Which exposure group should be chosen as the comparator? Should you fit a no-intercept model? How should the independent variables be parameterised? Tafenoquine is a new chemical entity being developed as a novel treatment for malaria. Gavin illustrated these questions using data from a recently completed phase III trial of tafenoquine in malaria. This was an interactive seminar, with the audience asked to suggest and justify design solutions.

Dietary intake in the early years and its relationship to BMI in a bi-ethnic group: the Born in Bradford 1000 study

• Speaker: Samuel Mahoney (Covance)

Abstract: The number of infants, toddlers and children who were overweight increased from 32 million globally in 1990 to 42 million in 2013. This figure is predicted to rise to 70 million by 2025. In the UK it is estimated that by 2020, 20% of all boys and 33% of all girls will be obese. Using data from the Born in Bradford 1000 cohort, this study aimed to assess relationships between dietary intake at age 12, 18 and 36 months and BMI Z-scores at age 36 months in a bi-ethnic group. Results indicate that dietary intake at 18 and 36 months was somewhat related to BMI Z-score at age 36 months and suggest the importance of early interventions aimed at establishing healthy eating behaviours.

Wednesday 24 October: Non-linear generation of infragravity waves in deep waters

• Speaker: Teodor Vrecica (Tel Aviv University)

Abstract: Infragravity (IG) waves are commonly defined as sea surface gravity waves whose frequency is lower than that of the wind sea (0.05 Hz) and higher than that of the tides and internal waves (0.005 Hz). They are important for various aspects of oceanography and marine engineering, such as estimations of sediment transport and harbour resonances, altimetry measurements, the breaking of the ice sheet in the Pacific and Earth’s hum. Their primary known generation mechanism is the nonlinear shoaling of the wave field. Therefore, the focus of most previous related works was limited to coastal areas. Yet, details of the generation, and especially their directional properties, are still not fully understood. Results of recent field measurements confirmed the existence of an IG wave climate in deeper waters. A common assumption is that the origin of deep water IG waves is reflection from coastlines (leaky waves); however, not every occurrence can be explained in this manner. Here, we presented a new triad interaction mechanism for IG wave generation in deep water.

For steady homogeneous deep water wave fields, three-wave interactions only produce steady non-resonant interactions. However, for evolving seas, waves are able to resonate with changes of the wave field in time and space to yield mean energy transfer to the IG frequency range. The considered effects include simple growth of the wave field, effects of gustiness, and whitecapping. A new model for IG wave generation is constructed which takes these effects into account. It is used to evaluate several storm events, using data obtained from archived reanalyses. Model results are compared to measurements from deep water pressure gauges, showing good capability in describing the directional properties of the IG frequency range. The presented work sets the basis for future formulation of an IG wave source term for extending IG wave forecasting models to the deep waters.

This was a joint seminar with CPRG.

Wednesday 7 November: Perspectives on data, information and mathematics

• Speaker: Arieh Iserles (Cambridge)

Abstract: The data and information revolution is changing our lives: the way we socialise, shop, elect our leaders and conduct our research. Its impact ranges across all academic disciplines. Yet, its engine room is mathematics — a set of emerging methodologies in statistics, computation and pure mathematics. In this talk Arieh attempted to explain this Brave New World in a non-technical manner, demystify phrases like “deep learning”, “imaging”, “sparse recovery” and “inverse problems”, and describe how mathematics is transforming “Big Data” and how “Big Data” is transforming mathematics.

Wednesday 21 November: Types of embedded graphs and their Tutte polynomials

• Speaker: Stephen Huggett (Plymouth)

Abstract: We took an elementary and systematic approach to the problem of extending the Tutte polynomial to the setting of embedded graphs. Four notions of embedded graphs arise naturally when considering deletion and contraction operations on graphs on surfaces. We gave a description of each class in terms of coloured ribbon graphs. We then identified a universal deletion-contraction invariant (that is, a Tutte polynomial) for each class. This is joint work with Iain Moffatt.

Wednesday 28 November: The Centre Problem in 3-Dimensional Systems and Applications

• Speaker: Lingling Liu (Chengdu, Sichuan)

Abstract: For high dimensional systems, it is a well-known method to restrict the system to a centre manifold, but the approximation of the centre manifold brings greater complexity to the computation of Lyapunov quantities and their dependencies, even if the original system is not of high degree. Furthermore, in order to determine if a centre-focus equilibrium is a centre, criteria for planar systems, such as time reversibility and integrability, are not available on the approximated centre manifold. In spite of this, we have recently been able to obtain some good results for integrability in a 3-dimensional system. We do this by finding a global centre manifold of the system and reducing the higher dimensional system to the global centre manifold. This result gives a useful method for identifying a focus or centre in high dimensional systems.

Friday 7 December: Certified numerical implementation of Zariski-Van Kampen method

• Speaker: Miguel Marco (Universidad de Zaragoza)

Abstract: Van Kampen’s motivation for his celebrated theorem was to give a proof of a fact known by Zariski: that the fundamental group of the complement of a complex plane curve is given by its braid monodromy. Moreover, the Lefschetz hyperplane theorem allows one to reduce to this case the computation of the complement of every hypersurface in the projective (or affine) space. However, there are no purely algebraic methods to compute this braid monodromy. We present a numerical, but certified, method using interval arithmetic and Newton’s interval criterion. The use of interval arithmetic allows us to work with arbitrary coefficients (rational, algebraic or even transcendental).

Wednesday 30 January: Flavour physics in the $D_{(s)}$ and $B_{(s)}$ meson system

• Speaker: Tobias Tsang (Edinburgh)

Abstract: After a brief motivation, Tobias presented Edinburgh's (domain wall fermion) charm and bottom physics programme. He first focused on their very recent computation (arXiv:1812.08791) of SU(3) breaking ratios in the $D_{(s)}$ and $B_{(s)}$ meson systems and the ratios of CKM matrix elements $V_{cd}/V_{cs}$ and $V_{td}/V_{ts}$. Tobias then outlined the current status of their wider heavy flavour phenomenology programme in the charm and bottom sector.

Wednesday 6 February: Lattice study of phase properties of cold dense quark matter

• Speaker: Aleksandr Nikolaev (Swansea)

Abstract: The phase diagram of QCD is of fundamental interest for high energy physics, cosmology and astrophysics, but at the present moment ab-initio calculations in the lattice QCD formalism at finite density are impossible. However, in the regions of high temperature and small density or of high density and small temperature theories with SU(2) and SU(3) gauge groups with the presence of fundamental fermions are expected to have similar properties. SU(2) theory in lattice formulation does not possess a sign problem, which makes computations at finite density possible. Aleksandr presented results for the phase structure of lattice SU(2) QCD with two flavours of quarks at finite quark density and zero temperature. Interaction properties of quarks, real-time inter-quark potential and the confinement/deconfinement transition were examined. Our results indicated that in very dense matter the quark-gluon plasma is in essence a weakly interacting gas of quarks and gluons without a magnetic screening mass in the system, sharply different from a QGP at large temperature. The talk was based on papers arXiv:1808.06466, 1711.01869, 1605.04090, and recent results.

Wednesday 13 February: Pupil Advantage Index as an Alternative to Subgroup Analysis in RCTs for Education

• Speaker: ZhiMin Xiao (University of Exeter)

Abstract: Analyses of social interventions need to produce evidence that is relevant to different groups of people in a society. When such a group is not the target group of an intervention, this is called subgroup analysis, even when the group of interest is pre-specified prior to data collection. Amongst statisticians, subgroup analysis is often regarded as statistical malpractice, as its findings are often underpowered, unreliable, and prone to overinterpretation at best or misleading at worst. Meanwhile, researchers would be criticised for generating irrelevant evidence and accused of wasting research money if they did not conduct relevant subgroup analysis. As a result, “they are damned if they do, and damned if they don’t” (Petticrew et al., 2012).

In this study, we estimated intervention effects for Free School Meal (FSM) pupils in English schools, which is a pre-specified subgroup in most educational interventions funded by the Education Endowment Foundation (EEF) in England. Specifically, we first ran a treatment-FSM interaction test in each and every outcome to see if the difference-in-effects is statistically significant between FSM and Non-FSM pupils. We then calculated separate effect sizes within the two subgroups. Finally, we examined the p-values from the interaction tests and compared the overall effect sizes for both FSM and Non-FSM pupils with the two separate subgroup estimates. We found that conventional interaction tests can produce self-contradictory results. To help solve the problem, we propose a new approach, Pupil Advantage Index (PAI), as an alternative to subgroup analysis and apply it to real RCT data extracted from the EEF data archive. We demonstrate that PAI does not just indicate where an intervention worked and by how much in existing trials, but it can also be utilised to optimise treatment recommendation for future interventions.
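The interaction-test logic described above (a subgroup effect in each group, then a test of the difference-in-effects) can be sketched as follows. This is a schematic illustration with hypothetical outcome scores, not the authors' analysis code, and the simple z-statistic stands in for whatever interaction test the study actually used.

```python
import math

def mean(xs):
    return sum(xs) / len(xs)

def var(xs):  # sample variance (n - 1 denominator)
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def effect_and_se(treated, control):
    """Difference in means and its standard error for one subgroup."""
    se = math.sqrt(var(treated) / len(treated) + var(control) / len(control))
    return mean(treated) - mean(control), se

def interaction_z(fsm_t, fsm_c, non_t, non_c):
    """z-statistic for the treatment-by-FSM interaction:
    the difference between the FSM and non-FSM subgroup effects,
    scaled by the standard error of that difference."""
    eff_fsm, se_fsm = effect_and_se(fsm_t, fsm_c)
    eff_non, se_non = effect_and_se(non_t, non_c)
    return (eff_fsm - eff_non) / math.sqrt(se_fsm ** 2 + se_non ** 2)

# hypothetical outcome scores for the four treatment-by-FSM cells
fsm_t, fsm_c = [12, 14, 13, 15, 14], [10, 11, 10, 12, 11]
non_t, non_c = [13, 14, 15, 14, 13], [12, 13, 14, 13, 12]
print("FSM effect:    ", effect_and_se(fsm_t, fsm_c)[0])
print("non-FSM effect:", effect_and_se(non_t, non_c)[0])
print("interaction z: ", round(interaction_z(fsm_t, fsm_c, non_t, non_c), 2))
```

With so few observations per cell the interaction test is badly underpowered, which is exactly the weakness of subgroup analysis that motivates the PAI proposal.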

Wednesday 20 February: Modelling numbers of births by day of the week in relation to onset of labour and mode of giving birth in England 2005–2014 (Mario Cortina Borja, in collaboration with Professor Alison Macfarlane, Ms Nirupa Dattani, Dr Miranda Dodwell, Mr Rod Gibson, Dr Gill Harper, Dr Peter Martin and Dr Mary Newburn)

• Speaker: Mario Cortina Borja (UCL)

Abstract: Maternity care has to be available 24 hours a day, seven days a week. It is known that obstetric intervention can influence the time of birth, but no previous analysis at a national level in England has yet investigated in detail the ways in which the day and time of birth vary by onset of labour and mode of giving birth. We linked data from birth registration, birth notification, and Maternity Hospital Episode Statistics and analysed 5,093,615 singleton births in NHS maternity units in England from 2005 to 2014. We built statistical models to establish how patterns of timing of birth vary by onset of labour, mode of giving birth and gestational age. We found that the timing of birth by time of day and day of the week varies considerably by onset of labour and mode of birth. These patterns have implications for midwifery and medical staffing.

Wednesday 20 February: The Dai-Freed theorem and anomalies

• Speaker: Iñaki García-Etxebarría (University of Durham)

Abstract: The Dai-Freed theorem provides a bridge between the theory of bordism and Quantum Field Theory (and more specifically, anomalies). I will review how these two areas are related, and then summarise some computations of bordism groups of classifying spaces of Lie groups and cyclic groups that we have performed recently, which are of particular interest for applications to four dimensional physics.

Wednesday 20 February: Axions and X-ray polarimetry

• Speaker: Francesca Day (DAMTP)

Abstract: Axions are one of the best motivated extensions to the Standard Model, both solving the strong CP problem and providing a natural dark matter candidate. X-Ray telescope observations have already placed world leading bounds on the axion-photon coupling by searching for axion-photon interconversion in the magnetic fields of galaxy clusters. However, current X-ray telescopes are unable to exploit one of the most striking features of this effect: only photons polarised parallel to the background magnetic field mix with axions. This leads to distinctive polarisation signatures from astrophysical sources. The next generation of polarising X-ray telescopes could detect these signatures. Francesca will discuss the opportunities and difficulties of detecting axions with X-ray polarimetry.

Friday 22 February: Higher length-twist coordinates for character varieties

• Speaker: Omar Kidwai (Toronto)

Abstract: We describe joint work with L Hollands on a construction of special holomorphic Darboux coordinates on certain SL_N (particularly N=2,3) character varieties. We consider degenerate examples of “spectral networks” of Gaiotto-Moore-Neitzke (certain graphical objects on a Riemann surface), generalising the so-called “Fenchel-Nielsen” networks of Hollands-Neitzke. We compute the associated “spectral coordinates” using the “abelianisation map”, taking connections on the Riemann surface to abelian holonomy data on a spectral cover, generalising the “complexified Fenchel-Nielsen” coordinates of Kourouniotis-Tan for SL(2)-connections to higher rank. Time permitting, Omar will discuss some physical applications to computing superpotentials coming from 4d N=2 supersymmetric QFTs.

Wednesday 20 March: $\zeta$-regularized vacuum expectation values

• Speaker: Tobias Hartung (Kings College London)

Abstract: Computing vacuum expectation values is paramount in studying Quantum Field Theories (QFTs) since they provide relevant information for comparing the underlying theory with experimental results. However, unless the ground state of the system is explicitly known, such computations are very difficult and Monte Carlo simulations generally run months to years on state-of-the-art high performance computers. Additionally, there are various physically interesting situations, in which most numerical methods currently in use are not applicable at all (e.g., the early universe or settings requiring Lorentzian backgrounds). Thus, new algorithms are required to address such problems in QFT. In recent joint work with K. Jansen (NIC, DESY Zeuthen), Tobias has shown that $\zeta$-functions of Fourier integral operators can be applied to regularise vacuum expectation values with Euclidean and Lorentzian backgrounds and that these $\zeta$-regularised vacuum expectation values are in fact physically meaningful. In order to prove physicality, we introduced a discretisation scheme which is accessible on a quantum computer. Using this discretisation scheme, we can efficiently approximate ground states on a quantum device and henceforth compute vacuum expectation values. Furthermore, the Fourier integral operator $\zeta$-function approach is applicable to Lattice formulations in Lorentzian background.

Wednesday 27 March: Virtual Source Method Simulation of Progressive Water Waves

• Speaker: Omar Al-Tameemi (Plymouth)

Abstract: The virtual source method (VSM) has been developed to simulate water waves based upon the solution of integral equations for the velocity potential, derived from Laplace’s equation, with fully nonlinear surface conditions. The basis of the method is the use of specific Green’s functions for a rectangular ‘virtual domain’ which is an extension of the physical domain. The solution variables are frequency components of the velocity potential at the upper virtual boundary, and these are found by specifying appropriate conditions on the physical boundaries. This talk presented the results of VSM simulations to generate nonlinear progressive waves in a numerical wave tank. The VSM results were compared with those from both second order Stokes theory and a boundary element method (BEM).

Wednesday 27 March: Gauge ambiguities in ultrastrong coupling QED

• Speaker: Ahsan Nazir (Manchester)

Abstract: Ultrastrong-coupling between two-level systems and radiation is important for both fundamental and applied quantum electrodynamics (QED). Such regimes are identified by the breakdown of the rotating-wave approximation, which applied to the quantum Rabi model (QRM) yields the apparently less fundamental Jaynes-Cummings model (JCM). We show that when truncating the material system to two levels, each gauge gives a different description whose predictions vary significantly for ultrastrong-coupling. QRMs are obtained through specific gauge choices, but so too is a JCM without needing the rotating-wave approximation. Analysing a circuit QED setup, we find that this JCM provides more accurate predictions than the QRM for the ground state, and often for the first excited state as well. More generally, even in the absence of two-level approximations, gauge-freedom implies that there are many different definitions of light and matter as quantum subsystems, which only coincide when interactions vanish. Considering time-dependent light-matter interactions, we show that in the absence of an argument to choose a particular gauge when promoting the coupling parameter to a time-dependent function, the description that results is essentially ambiguous. For sufficiently strong and non-adiabatic (i.e. fast switching) interactions, the qualitative physical predictions of final subsystem properties, such as entanglement and photon number, depend on the gauge chosen. This occurs even when the coupling vanishes at the preparation and measurement stages of the protocol, at which times the subsystems are unique and experimentally addressable.

Wednesday 3 April: Network Time Series

• Speaker: Guy Nason (Royal Statistical Society Vice President for Academic Affairs; University of Bristol)

Abstract: A network time series is a multivariate time series where the individual series are known to be linked by some underlying network structure. Sometimes this network is known a priori, and sometimes the network has to be inferred, often from the multivariate series itself. Network time series are becoming increasingly common, long, and collected over a large number of variables. We are particularly interested in network time series whose network structure changes over time.

Guy described some recent developments in the modelling and analysis of network time series via network autoregressive integrated moving average (NARIMA) process models. NARIMA models provide a network extension to a familiar environment that can be used to extract valuable information and aid prediction. As with classical ARIMA models, trend can impair the estimation of NARIMA parameters. The scope for trend removal is somewhat wider with NARIMA models and we exhibit some possibilities. Guy illustrated the operation of NARIMA modelling on some real data sets.

This is joint work with Kathryn Leeming (Bristol), Marina Knight (York) and Matt Nunes (Lancaster).
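The core network-autoregression idea can be sketched in a few lines: each node's next value depends on its own past value and on the mean of its neighbours' past values. This is a deliberately simplified NAR(1) recursion with made-up coefficients, not the NARIMA machinery of the talk (no integration or moving-average terms, no estimation).

```python
def nar1_step(x, neighbours, alpha=0.5, beta=0.3):
    """One step of a toy network autoregression, NAR(1):
    each node's next value mixes its own past value (weight alpha)
    with the mean of its neighbours' past values (weight beta).
    The coefficients here are illustrative, not estimated."""
    new = []
    for i, xi in enumerate(x):
        nb = neighbours[i]
        nb_mean = sum(x[j] for j in nb) / len(nb) if nb else 0.0
        new.append(alpha * xi + beta * nb_mean)
    return new

# three sites on a path graph 0 - 1 - 2, with a unit shock at node 0
neighbours = {0: [1], 1: [0, 2], 2: [1]}
x = [1.0, 0.0, 0.0]
for _ in range(3):
    x = nar1_step(x, neighbours)
    print([round(v, 4) for v in x])  # the shock diffuses along the network
```

The point of the network structure is visible immediately: node 2 is only affected once the shock has passed through node 1, something an ordinary multivariate AR model would not encode by construction.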

Wednesday 3 April: Soliton fission in a fluid of non-uniform depth

• Speaker: Alan Compelli (Cork)

Abstract: A surface water wave over a bed of non-uniform depth is considered. The fluid is incompressible, inviscid and irrotational. The Hamiltonian is determined in terms of wave-only quantities using a Dirichlet-Neumann operator. By introducing an appropriate scaling regime, and considering the bottom to vary slowly, a KdV equation with variable coefficients is derived. A one-soliton solution approaching a ramp on the seabed was then considered and numerical results demonstrated the effect the ramp shape has on the birth of new solitons as the soliton passes over it.

Wednesday 17 April: Smoothed Particle Hydrodynamics (SPH) Modelling of Tsunami Waves Generated by a Fault Rupture

• Speaker: Ruaa Wana (Plymouth)

Abstract: Smoothed Particle Hydrodynamics (SPH) is a meshfree, Lagrangian, particle method. It is particularly well suited to simulating flow problems that have large deformations or contain free surfaces. In this talk we used a single phase weakly compressible SPH model to simulate a dam break flow and the flow that occurs in experimental models of Tsunamis generated by a fault rupture. The experiments have been carried out at the University's COAST laboratory.

Wednesday 8 May: Challenges in ensuring that the evaluation of medical tests helps to improve patient health

• Speaker: Jon Deeks (University of Birmingham)

Abstract: Healthcare organisations around the world need to make recommendations of the choice and provision of medical tests. In 2018, the WHO for the first time, published the Essential Diagnostic Tests (the EDL) listing in vitro diagnostics which should be available around the world to ensure Universal Health Coverage. In doing so they have faced challenges in identifying evidence required to make rational decisions about test selection. Ideally medical tests, as with interventions, should be recommended for use when there is evidence that they do more good than harm. However, medical tests rarely directly improve patient outcomes – rather the medical interventions that are taken consequent on use of a test create benefit for patients – and it is rare to be able to obtain evidence that shows how tests save lives. While evaluations of diagnostic or prognostic tests have typically focused on their accuracy to predict the numbers receiving appropriate or inappropriate treatment, extrapolating from these data to predict overall patient benefit is not always appropriate. In this talk Jon reviewed challenges that healthcare organisations face in identifying evidence required to assess benefits and harms of testing, drawing on experience working with the WHO EDL.

Wednesday 8 May: An introduction to variational integrators

• Speaker: Fernando Jimenez Alburquerque (Oxford)

Abstract: Fernando introduced the basic notions of geometric integration of mechanical systems, naturally described by Lagrangian/Hamiltonian dynamics. The numerical approximation of such dynamics, respecting its underlying geometrical aspects, represents a crucial challenge in modern geometric integration. Variational integrators [MaWe2001], a class of geometric integrators that has received a lot of attention from the mathematical community in the last two decades, are a well-established example of numerical schemes that succeed in this task, and moreover in some respects display superior performance to benchmark numerical integrators. Fernando went over their definition and fundamental properties, and finally introduced future challenges for variational integrators when approximating the dynamics of dissipative mechanical systems.

[MaWe2001] J. E. Marsden and M. West: “Discrete mechanics and variational integrators”, Acta Numerica 10, pp. 357–514 (2001).

Wednesday 15 May: A skein-theoretic model for the double affine Hecke algebras

• Speaker: Hugh Morton (University of Liverpool)

Abstract: Hugh illustrated pictorially the use of ${\mathbb Z}[s^{\pm 1}, q^{\pm 1}]$-linear combinations of braids in the thickened torus $T^{2}\times I$ to construct an algebra induced by composing $n$-string braids. Hugh showed, with the help of pictures, that this algebra satisfies the relations of the double affine Hecke algebra $\ddot{H}_{n}$, which was introduced algebraically. The talk finished with a rather speculative plan to include closed curves in our model in an attempt to incorporate earlier work with Peter Samuelson on the Homfly skein of $T^{2}$ into the setting of the algebras $\ddot{H}_{n}$. This was done with an eye on the elliptic Hall algebra and the work of Schiffmann and Vasserot, which was discussed very briefly.

Wednesday 22 May: Lattice Boltzmann Modelling of Two-Dimensional Flow in Micro-Channel by Using Moment Boundary Conditions

• Speaker: Zainab Bu Sinnah (Plymouth)

Abstract: The Lattice Boltzmann Method (LBM) has been developed and used to simulate fluid flow problems for various geometries and boundary conditions. In this seminar, Zainab presented results obtained using a Lattice Boltzmann (LB) model to simulate rarefaction and compressibility effects for two-dimensional flow in a micro-channel. Moment boundary conditions are used to implement Navier-slip boundary conditions on the walls, and pressure boundaries are used to drive the flow. For the simulations, we used a second-order single-relaxation-time model and investigated the convergence behaviour of the model.

## Previous seminars - 2017/18 academic year

Wednesday 17 January: Acoustic-gravity waves, theory and applications

Abstract: Acoustic–gravity waves (AGWs) are compression-type waves generated as a response to a sudden change in the water pressure, e.g. due to nonlinear interaction of surface waves, submarine earthquakes, landslides, falling meteorites and objects impacting the sea surface. AGWs can travel at near the speed of sound in water (ca. 1500 m/s), but can also penetrate through the sea-floor surface amplifying their speed, which turns them into excellent precursors. “Acoustic–gravity waves” is an emerging field that is rapidly gaining popularity among the scientific community, as it finds broad utility in physical oceanography, marine biology, geophysics, water engineering, and quantum analogues. This talk was an overview on AGWs, with emphasis on recent developments, current challenges, and future directions.

Wednesday 31 January: The Dawn of FIMP Dark Matter

• Speaker: Tommi Tenkanen, QMUL

Abstract: Tommi presented an overview of scenarios where the observed Dark Matter (DM) abundance consists of Feebly Interacting Massive Particles (FIMPs), produced non-thermally by the so-called “freeze-in” mechanism. In contrast to the usual freeze-out scenario, frozen-in FIMP DM interacts very weakly with particles in the visible sector and never attained thermal equilibrium with them in the early Universe. This makes frozen-in DM very difficult, but not impossible, to test. In this talk Tommi presented the freeze-in mechanism and its variations previously considered in the literature, compared them to the standard DM freeze-out scenario, discussed several aspects of model building, and paid particular attention to the observational properties of such feebly interacting DM.

Wednesday 21 February: Smoothed Particle Hydrodynamics Modelling in Free-Surface Single-Phase Flows

• Speaker: Ruaa Wana, Plymouth

Abstract: Smoothed Particle Hydrodynamics (SPH) is a meshfree, Lagrangian, particle method. It is particularly well suited to simulating flow problems that have large deformations or contain free surfaces. This talk discussed the SPH single phase simulations of a dam break flow and a tsunami wave generated by a fault rupture.

Wednesday 21 February: Recent results from LHCb

Abstract: The LHCb experiment at CERN has collected an unprecedented data set in beauty and charm decays. The resulting precision measurements are sensitive to physics at mass scales far beyond the “energy frontier”, i.e. the highest collision energies achievable in colliders. Recent results indicate that these data are beginning to challenge the Standard Model (SM) of particle physics. This seminar summarised recent LHCb results, with a focus on precision measurements of Charge-Parity violation and rare decays.

Wednesday 28 February: Charge Propagation

• Speaker: David McMullan, Plymouth

Abstract: Charge propagation is surprisingly difficult to describe in quantum field theory. Even in the abelian theory, where the spectre of confinement is avoided, the persistent interaction of the electron with its environment leads to deep theoretical problems. There is, though, an exactly solvable model of an electron in a plane wave background (the so-called Volkov solution) that provides an attractive arena for testing ideas related to the more general propagation problem. David reviewed some of the recent results we have derived concerning charges propagating in an elliptical class of backgrounds, and outlined how to incorporate loop corrections. Some of the lessons learnt from this were discussed in the context of the more general problems that arise when charges propagate.

Thursday 8 March: Rigidity and global rigidity of graphs

• Speaker: Bernd Schulze, Lancaster

Abstract: Rigidity theory is concerned with the rigidity and flexibility analysis of bar-joint frameworks and related constraint systems of geometric objects. This area has a rich history which can be traced back to classical work of Euler, Cauchy and Maxwell on the rigidity of polyhedra and skeletal frames. Since Laman’s celebrated result from 1970 (which provided the first combinatorial characterisation of generic rigid bar-joint frameworks in the plane), rigidity theory has received steadily increasing attention and it is now a highly diverse and thriving research area with many practical applications. Bernd gave an introduction to rigidity theory, concentrating on results and problems for bar-joint frameworks, but also describing how these have been extended to some other types of frameworks. Moreover, Bernd summarised some recent progress in the rigidity analysis of symmetric frameworks.

Tuesday 20 March: Implementation of a Lattice Boltzmann Method for Multiphase Flows with High Density and Viscosity Ratios

• Speaker: Norjan Jumaa, Plymouth

Abstract: We presented a Lattice Boltzmann Method (LBM) for multiphase flows with high viscosity and density ratios. Following Banari et al. (2014), the motion of the interface between fluids is modelled by solving the Cahn-Hilliard (CH) equation with LBM. Incompressibility of the velocity fields in each phase is imposed by using a pressure correction scheme. We used a unified LBM approach with separate formulations for the phase field, the pressure-less Navier-Stokes (NS) equations and the pressure Poisson equation required for correction of the velocity field. The implementation has been verified for various test cases. Here, we presented results for some complex flow problems, including two-dimensional single and multiple mode Rayleigh-Taylor Instability (RTI). We also presented the evolution of the height of a standing wave for both high and low viscosity and density ratios.

Wednesday 21 March: Multi-state models for observed and latent cognitive function in the older population

• Speaker: Ardo van den Hout, Department of Statistical Science, University College London

Abstract: Due to the ageing population, there is growing interest in the statistical modelling of cognitive function in old age. When analysing longitudinal data on ageing, loss to follow-up due to death cannot be ignored. One option is to model survival and change of cognitive function jointly by specifying submodels for the two processes and linking these models by individual-specific random effects. Another option – and the topic of this seminar – is to use a continuous-time multi-state survival model, where a series of living states is defined by the level of cognitive function and an additional dead state is included. This multi-state approach is quite general and can be used in many other applications in biostatistics, social statistics, and demography.

The seminar started with an introduction to the continuous-time multi-state survival model by discussing model specification and maximum likelihood estimation. The second part presented an extension of current methods: a hidden Markov model for modelling bivariate cognitive function. The methods were illustrated by using longitudinal data from a UK survey of the older population.
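As a sketch of the kind of model discussed (not the speaker's implementation), a continuous-time multi-state survival model is specified by a generator matrix of transition intensities, and transition probabilities over an interval follow from its matrix exponential. The three states and the rates below are purely illustrative:

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_exp(Q, t, terms=40):
    """P(t) = exp(Q t) via a truncated Taylor series (fine for small Q t)."""
    n = len(Q)
    P = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]  # identity
    term = [row[:] for row in P]
    Qt = [[q * t for q in row] for row in Q]
    for k in range(1, terms):
        term = [[v / k for v in row] for row in mat_mul(term, Qt)]  # (Qt)^k / k!
        P = [[P[i][j] + term[i][j] for j in range(n)] for i in range(n)]
    return P

# Illustrative generator for healthy (0) -> cognitively impaired (1) -> dead (2),
# with death possible from either living state; rows sum to zero.
Q = [[-0.15, 0.10, 0.05],
     [ 0.00, -0.20, 0.20],
     [ 0.00,  0.00, 0.00]]   # dead is absorbing

P5 = mat_exp(Q, 5.0)  # five-year transition probability matrix
```

In practice the intensities would be estimated by maximum likelihood from panel-observed transitions, possibly with covariates on each rate, rather than fixed as here.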

Wednesday 21 March: Accuracy and Stability of Virtual Source Method for Numerical Simulations of Nonlinear Water Waves

• Speaker: Omar Al-Tameemi, Plymouth

Abstract: The Virtual Source Method (VSM) is based upon the integral equations derived by using Green’s identity with Laplace’s equation for the velocity potential. The velocity potential within the fluid domain is completely determined by the potential on a virtual boundary located above the fluid, avoiding the need to evaluate the singular integrals normally associated with integral equation methods. This talk presented numerical simulations of non-linear standing waves and sloshing problems using the VSM. We discussed the stability and convergence of the method, as well as global energy and volume conservation.

Wednesday 21 March: Simulating dipole wall collisions with slip boundaries by using moment-based boundary conditions for lattice Boltzmann method

• Speaker: Seemaa Mohammed, Plymouth

Abstract: The accuracy of moment-based boundary conditions for no-slip and slip walls in the lattice Boltzmann method is examined numerically by using dipole-wall collisions for both normal and oblique cases. To do this, hydrodynamic moments are imposed at the boundary instead of the distribution functions. For accurate results, a two-relaxation-time (TRT) model is used with the moment boundary conditions. Excellent agreement with benchmark data is obtained, and the results converge with second-order accuracy.

Wednesday 25 April: Integrability of Relativistic Dynamical Systems

• Speaker: Tom Heinzl, Plymouth

Abstract: Tom gave a brief overview of our recent studies of relativistic dynamical systems. These are described by the Newton-Einstein-Lorentz equation generalising F = ma in the presence of external (electromagnetic) forces. Tom explained how space-time and conformal symmetries may be employed to find conserved quantities that lead to integrability. In quite a few cases, the number of these quantities is surprisingly large implying super-integrability. This feature is somewhat ‘exotic’ due to its rare occurrence. For instance, by Bertrand’s theorem, in classical mechanics it only holds for the Kepler problem (with its Laplace-Runge-Lenz vector) and the harmonic oscillator. Using our generalised setting we have been able to extend this list significantly.

Wednesday 25 April: Double copy on plane wave backgrounds

Abstract: Double copy is a method for perturbatively constructing scattering amplitudes in gravity from scattering amplitudes in gauge theory, and even has a catchy slogan: Gravity = (Gauge theory)^2. Despite its many uses, the status of double copy as a fundamental feature of perturbative field theories is poorly understood. In this talk Tim described how the robustness of double copy can be tested by considering the simplest scattering amplitudes on plane wave backgrounds. This makes use of several tools familiar from the study of QED in strong background fields. Despite various subtleties introduced by the non-trivial scattering background, it is clear that a notion of double copy does indeed exist.

Wednesday 25 April: Trust in Numbers

• Speaker: Professor Sir David Spiegelhalter, President of the Royal Statistical Society for 2017-18

Abstract: Those who value quantitative and scientific evidence are faced with claims both of a reproducibility crisis in scientific publication, and of a post-truth society abounding in fake news and alternative facts. Both issues are of vital importance to statisticians, and both are deeply concerned with trust in expertise. By considering the ‘pipelines’ through which scientific and political evidence is propagated, David considered possible ways of improving both the trustworthiness of the statistical evidence being communicated, and the ability of audiences to assess the quality and reliability of what they are being told. There were also cheap laughs at numerous examples of disastrous communication of statistics in the media.

Wednesday 2 May: Guiding laser-produced fast electrons using super-strong magnetic fields

• Speaker: Kate Lancaster, York

Abstract: Laser-plasma interactions can currently produce some of the most extreme conditions on Earth. When ultra-intense lasers are focused onto solid material, the fields associated with the laser are so strong that electrons can easily escape the atoms in the material. Absorption of the laser pulse results in the generation of a population of relativistic electrons, with currents on the order of mega-amperes. The physics associated with how the electrons are produced and subsequently transported in plasma is complex and proves challenging to diagnose and study. Importantly, these fast electrons drive much of the subsequent physics during these interactions, including the generation of energetic particle/photon sources, unique atomic physics states such as hollow atoms, hydrodynamic phenomena, production of warm/hot dense matter relevant to stellar interiors, heating of matter relevant to alternative laser-driven fusion schemes such as fast ignition, and conditions relevant to the understanding of nuclear astrophysics in the most extreme objects in our universe.

This talk illustrated some of the experiments happening on petawatt-class lasers concerning how to control important fast-electron beam parameters (such as divergence) using novel structured targets. Alex Robinson et al. first proposed using targets incorporating a resistivity gradient to confine fast electrons: at the material interface of a high-resistivity feature, e.g. a wire, surrounded by a lower-resistivity material, a strong magnetic field is generated which confines electrons to areas of higher resistivity and higher current density. In this talk, experiments using targets with novel embedded silicon features, created by Scitech Precision Ltd using MEMS technology, were presented. A novel dual-channel front-surface imaging system was created to enable both pre-shot alignment and on-shot focal spot positioning, information critical for performing these types of complex experiments.

Wednesday 9 May: Searching for dark matter with the LUX and LUX-ZEPLIN direct detection experiments

• Speaker: Jim Dobson, UCL

Abstract: For the past decade, liquid xenon time-projection chambers (TPCs) hidden deep underground have led the race to make a first direct detection of dark matter here on Earth. Jim presented the final results from the recently completed Large Underground Xenon (LUX) experiment, as well as the status and physics reach of its 40-times-more-massive successor, the LUX-ZEPLIN experiment, currently under construction and due to start taking data in 2020.

Wednesday 16 May: Progress on the connection between spectral embedding and network models used by the probability, statistics and machine-learning communities

• Speaker: Patrick Rubin-Delanchy, University of Bristol

Abstract: Patrick gave theoretical and methodological results, based on work spanning Johns Hopkins, the Heilbronn Institute for Mathematical Research, Imperial and Bristol, regarding the connection between various graph spectral methods and commonly used network models which are popular in the probability, statistics and machine-learning communities. An attractive feature of the results is that they lead to very simple take-home messages for network data analysis: a) when using spectral embedding, consider eigenvectors from both ends of the spectrum; b) when implementing spectral clustering, use Gaussian mixture models, not k-means; c) when interpreting spectral embedding, think of “mixtures of behaviour” rather than “distance”. Results are illustrated with cyber-security applications.

Wednesday 23 May: Flexible inference for continuous-time models of wildlife movement

• Speaker: Paul Blackwell, University of Sheffield

Abstract: The majority of statistical models of animal movement are formulated in discrete time, modelling separately each 'step' from one location (e.g. GPS fix) to the next. This can make it difficult to deal with missing or unequally-spaced observations, to compare studies with different time scales, or to interpret results biologically. In reality, animals exist and move in continuous time, and Paul described some switching diffusion models that try to capture some of the complexities of real behaviour in continuous time. Computational cost is an increasingly important issue in fitting movement models, and Paul talked about some algorithms that allow exact inference for such models, even in the presence of spatial heterogeneity, borrowing ideas from Hidden Markov Models and Kalman Filtering. An increasingly important area of application is collective movement, where we model the locations of simultaneously-tracked animals as they interact; Paul discussed some recent developments in modelling and computation for this situation.

Some of this work is joint with Mu Niu (University of Plymouth) and a number of recent or current research students at the University of Sheffield.

Wednesday 23 May: Analytical estimates of proton acceleration in laser-produced turbulent plasma

• Speaker: Konstantin Beyer (Oxford)

Abstract: With the advent of high-power laser facilities, new opportunities for scaled laboratory experiments of astrophysical processes have become available. Here we showed that experiments at the National Ignition Facility (NIF) laser have reached the conditions where second-order Fermi acceleration can be directly investigated with the available diagnostics. This requires measuring the diffusion of 3 MeV protons produced within a turbulent plasma. Since Fermi acceleration is essentially a biased diffusion process, using existing solutions we showed that a significant broadening of the initial proton distribution is expected for those particles exiting the plasma.
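The "biased diffusion" picture can be caricatured in a few lines: treat each stochastic scattering as a small mean energy gain plus a larger random fluctuation, so an initially monoenergetic proton population both drifts upward and broadens. The rates and counts below are illustrative only, not parameters of the NIF experiments:

```python
import random

def fermi_walk(n_protons, n_kicks, gain=0.001, spread=0.02, seed=1):
    """Second-order Fermi acceleration as a biased random walk in energy:
    each stochastic 'collision' multiplies a proton's energy by a factor with
    a small positive mean shift (the bias) and a larger random fluctuation
    (the diffusion), so the distribution both drifts and broadens."""
    rng = random.Random(seed)
    energies = [3.0] * n_protons  # initially monoenergetic 3 MeV protons
    for _ in range(n_kicks):
        energies = [e * (1.0 + gain + rng.gauss(0.0, spread)) for e in energies]
    return energies

final = fermi_walk(n_protons=2000, n_kicks=500)
```

Because the fluctuation per collision dwarfs the mean gain, the spread of the exiting population grows much faster than its mean shifts, which is why the broadening of the proton distribution is the measurable signature.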

Thursday 31 May: Symplectic geometry of the moduli space of framed Higgs bundles

• Speaker: Marina Logares, University of Plymouth

Abstract: Let X be a compact Riemann surface and D an effective divisor on X. The moduli space of D-twisted stable Higgs bundles on X is known to have a holomorphic Poisson structure which is in fact symplectic if and only if D is the zero divisor. We proved that this moduli space admits a natural enhancement to a holomorphic symplectic manifold. This is based on joint work in collaboration with I Biswas and A Peón-Nieto.

Wednesday 6 June: Research students using statistics mini-symposium

The influence of external peer reviewers on funding decisions in grant applications in NIHR

• Speaker: Lexy Sorrell, University of Plymouth

Abstract: The National Institute for Health Research (NIHR) is a large, long-established national funder of health research, aiming to select applications for research funding which are of the highest quality and address important health issues, providing evidence for policy and practice. However, this selection process is time-consuming and costly in terms of human resources: the NIHR staff, external reviewers and board members. This study is part of the wider ‘Push the Pace’ project, which aims to reduce the time for research to get from ‘bench to bedside’, currently averaging ten years in the NIHR research pathway. We used the external reviewers’ scores, along with reviewer and application characteristics, to investigate the influence of external peer reviewers on the funding decisions made by the board; how many reviewers are needed to review an application; and the relative value of peer-review scores from different kinds of reviewers.

Sonification: The Aesthetics of Listening to Data

• Speaker: Nuria Bonet Filella, University of Plymouth

Abstract: Sonification is ‘the use of non-speech audio to convey information’; in other words, the aural display of data (as opposed to visual display). It can be a superior approach for understanding data where visual interpretation is difficult. Sonification has mainly been used for scientific purposes (for example, Geiger counters, EEG monitors or heart rate monitors), but composers are showing increasing interest in the method.

Musical aesthetics are crucial to sonification. In order to be effective in transmitting data, a sonification must be clear, appropriate to the data, and aesthetically pleasing to the listener. Many examples are ‘typically unpleasant and lacking any natural connection to the data represented’. Scientists might not have the musical knowledge to create a good sonification; composers might not have the scientific knowledge to deal with the data musically. Consequently, while the production of sonifications has increased strongly in recent years, their quality has not.

Nuria has approached sonification from a compositional point of view in order to establish an aesthetic framework for aural display. Through a portfolio of sonifications and theoretical work, Nuria has explored the practical aspects of listening to data, whether for scientists or composers.

Friday 8 June: Thick subcategories of discrete derived categories

• Speaker: Nathan Broomhead, University of Plymouth

Abstract: Nathan explained some work in which he describes the lattices of thick subcategories of discrete derived categories. This is done using certain generating collections of exceptional and sphere-like objects related to non-crossing configurations of arcs in a geometric model.

Wednesday 13 June: Probabilistic Regression Analysis of Extreme Events in Energy Sectors with Either Massive or Small Data

• Speaker: Keming Yu, Brunel University

Abstract: Buried pipelines and wind turbines are important devices for converting or transferring energy. Buried pipelines are vulnerable to the threat of corrosion; of interest is an estimate of the probability of when or where an affected pipeline is likely to fail from the extreme growth of a corrosion defect. Wind turbine monitoring uses acoustic emission signals to detect damage processes in the structure; peak signals are of particular concern to the industry, and of interest is an estimate of the probability of a signal exceeding a threshold. Many factors need to be taken into consideration when building a probabilistic model for these extreme events, but classical regression models such as logistic regression, whose response is a binary variable, seem inefficient for an observable continuous response. Furthermore, probit regression may face either massive data to process or only a small sample to work with, depending on the case. In either case, estimation accuracy supported by sound statistical theory and computational algorithms is expected. This talk introduced a novel inference approach for probit regression for extreme events that copes with either massive streaming data or small samples, and showed that this objective may be achievable.

Friday 22 June: The history of inflation measurement

• Speaker: Jeff Ralph (RSS William Guy Lecturer for 2017-2018), Office for National Statistics

Abstract: The monthly consumer price inflation figures from the Office for National Statistics are among the most influential of all Official Statistics. They are used for a wide variety of important purposes, from indicating the health of the economy to the adjustment of pensions and benefits. Behind the numbers sits a sophisticated methodology that has been developed over a long period of time. This talk looked at the development of these measures from their origins at the start of the 18th century to the current day. It identified some of the visionary individuals who contributed to establishing the foundations, the dates when important changes were made, and the social and political factors that drove the developments.
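At the heart of methodologies of this kind sits the textbook fixed-basket (Laspeyres-type) price index: the cost of the base-period basket at current prices, relative to its cost at base-period prices. A toy sketch with made-up basket data (this is not ONS methodology):

```python
def laspeyres_index(base_prices, current_prices, base_quantities):
    """Textbook Laspeyres price index: cost of the base-period basket at
    current prices, relative to its cost at base prices (x100)."""
    cost_now = sum(p * q for p, q in zip(current_prices, base_quantities))
    cost_base = sum(p * q for p, q in zip(base_prices, base_quantities))
    return 100.0 * cost_now / cost_base

# Illustrative three-item basket
p0 = [1.00, 2.50, 0.80]   # base-period prices
p1 = [1.10, 2.50, 0.90]   # current prices
q0 = [10, 4, 20]          # base-period quantities purchased

index = laspeyres_index(p0, p1, q0)  # basket cost rises from 36 to 39
```

Real consumer price indices layer much more on top of this simple formula, such as elementary aggregates, regularly updated expenditure weights, chain-linking and quality adjustment.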

Monday 16 July: Xmeta: a comprehensive tool-box for advanced meta-analysis - meta-analysis, publication bias, and beyond

• Speaker: Yong Chen, University of Pennsylvania

Abstract: Yong introduced a web platform that they have been developing over the last few years, known as ‘xmeta’, which aims to facilitate advanced comprehensive meta-analysis for applied investigators within and beyond the Perelman School of Medicine at the University of Pennsylvania. The functionality of this platform includes implementations of various multivariate meta-analyses for continuous and/or binary outcomes in randomised controlled trials, meta-analysis of diagnostic tests with/without a gold standard, as well as methods to identify and correct for publication bias or small-study effects. Yong also introduced the online-analysis feature, which allows applied investigators to conduct these analyses without programming themselves. At the end, Yong talked about the future direction of incorporating semi-automated text mining to speed up the systematic review process.

Monday 23 July: Methodological advances in evidence synthesis

• Speaker: Orestis Efthimiou, University of Bern

Abstract: Network meta-analysis (NMA) is an extension of the usual (pairwise) meta-analysis. It is a statistical tool for synthesising evidence obtained from studies comparing multiple competing interventions for the same disease. In this lecture, we went through some recent advances in the field. First, we discussed a new model for the NMA of binary outcomes. This model generalises the well-known Mantel-Haenszel method, and can be especially valuable in the case of rare events, e.g. when synthesising data on mortality or serious adverse events. The method has been implemented in R in freely available, easy-to-use routines. Second, we discussed models for including non-randomised studies in NMA. Non-randomised studies can reveal whether or not interventions are effective in real-life clinical practice, and there is growing interest in including such evidence in the decision-making process. Here we presented and compared an array of alternative methods, and applied some of them in previously published clinical examples. Finally, we discussed methods for individual participant data network meta-analysis (IPD-NMA). IPD are considered the gold standard in evidence synthesis, and the inclusion of IPD in NMA offers unique advantages, such as increased precision, decreased heterogeneity, and the capacity to individualise treatment according to a patient’s characteristics. We showcased our methods using an example from depression.
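The pairwise random-effects model that NMA generalises can be written down in a few lines. As a sketch of that building block (the standard DerSimonian-Laird estimator, not the new models from the lecture, and with invented trial data):

```python
def dersimonian_laird(effects, variances):
    """Standard DerSimonian-Laird random-effects pooled estimate for a
    pairwise meta-analysis (the building block that NMA generalises)."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    # Cochran's Q and the method-of-moments between-study variance tau^2
    Q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (Q - df) / c)
    # Re-weight with between-study variance added to each study's variance
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = (1.0 / sum(w_re)) ** 0.5
    return pooled, se, tau2

# Illustrative log-odds-ratios and their variances from five trials
yi = [-0.5, -0.2, -0.8, 0.1, -0.4]
vi = [0.04, 0.09, 0.06, 0.10, 0.05]
pooled, se, tau2 = dersimonian_laird(yi, vi)
```

NMA extends this idea by modelling all pairwise contrasts jointly across a connected network of treatments, with consistency relations linking direct and indirect evidence.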

Note: This was a joint RSS South West Local Group and Exeter Health Statistics event.

Event photography and video
Please be aware that some of the University of Plymouth's public events may be attended by University photographers and videographers, capturing content to be used in University online and offline marketing and promotional materials, for example webpages, brochures or leaflets. If, for whatever reason, you or a member of your group do not wish to be photographed, please make yourself known to the staff working at the event on arrival, or to the photographer.