Stochastic wave modelling for inhomogeneous sea-states
- Funding: £166,508
- Funding agency: EPSRC
- Investigators: Stuhlmeier, R. (PI, Plymouth)
- Duration: Feb 2021–July 2023
- Details of grant
These energy exchanges, together with wave-breaking and wind, are the main inputs into wave-forecasting models. Such models inform the surfer waiting for a big swell as well as the engineer planning offshore operations by providing accurate and timely forecasts of wave conditions. Moreover, equations describing the evolution of the sea-state can be used to predict the likelihood of anomalously high rogue waves, which have been implicated in shipping accidents since antiquity.

Current modelling relies on equations that assume waves to be uncorrelated, implying that the sea is spatially homogeneous. This classical assumption results in an equation with a very long evolution timescale and excludes phenomena of physical and mathematical interest. This project is devoted to studying how correlation between wave modes affects the evolution of wave fields, with the aim of developing novel equations that can be implemented in wave-forecasting systems.
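The homogeneity assumption mentioned above can be illustrated with a minimal sketch (not code from the project; all amplitudes and numbers here are invented): a linear sea surface built as a superposition of wave modes with independent, uniformly distributed random phases, which is the classical statistically homogeneous model that the project aims to generalise.

```python
import math
import random

def sea_surface(x, t, amps, wavenumbers, freqs, phases):
    """Linear free-surface elevation as a sum of independent wave modes.

    Independent, uniformly distributed phases make the wave field
    statistically homogeneous -- the classical assumption discussed above.
    """
    return sum(a * math.cos(k * x - w * t + p)
               for a, k, w, p in zip(amps, wavenumbers, freqs, phases))

random.seed(0)
n = 50
amps = [0.1] * n                      # invented mode amplitudes (metres)
ks = [0.05 * (i + 1) for i in range(n)]
# deep-water dispersion relation: omega^2 = g * k
ws = [math.sqrt(9.81 * k) for k in ks]
phases = [random.uniform(0.0, 2.0 * math.pi) for _ in range(n)]

# surface elevation at one point in space and time
eta = sea_surface(10.0, 5.0, amps, ks, ws, phases)
```

Correlations between the mode phases would break this homogeneity, which is precisely the effect the project sets out to model.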
While studies have been conducted to predict graft failure following kidney transplantation, they did not focus on patient survival and were based on a limited set of variables. There is a clinical need to develop new statistical methods using big data to better predict graft and patient survival in transplant recipients.

This project will use linked registry data from national databases to develop and validate clinical prediction models for survival outcomes. The project will integrate data from multiple sources, develop models to predict the risks of graft failure and death over time, and conduct internal and external validation of the developed prediction models.
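Survival modelling of the kind described rests on estimators that handle censored follow-up. As a minimal illustrative sketch (not the project's methodology, and with invented example data), the Kaplan-Meier estimator of graft survival can be written in a few lines:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate for right-censored data.

    times  : follow-up time for each patient
    events : 1 if the event (e.g. graft failure) occurred, 0 if censored
    Returns a list of (event time, survival probability) pairs.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = 0
        removed = 0
        # group all patients sharing this follow-up time
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            removed += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= removed
    return curve
```

Regression models such as Cox proportional hazards extend this idea by letting the hazard depend on patient covariates, which is what a clinical prediction model requires.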
Recent UK experiments, in which beams of ultra-relativistic electrons were
collided with intense laser pulses, have shown that it is possible not only to
use intense lasers to probe fundamental physics, but also to generate radiation
sources with unique properties, which find applications across the sciences.
Such experiments are extremely challenging, and despite recent successes there
is disagreement over the extent to which quantum effects have been observed.
Discrepancies between experimental results and theoretical predictions have
been attributed to the numerical models of quantum effects employed in
Particle-In-Cell (PIC) codes used to simulate and analyse experiments.
A host of new experiments will begin this year, and will be able to probe the transition from classical to quantum physics in intense electromagnetic fields. It is therefore critical that we improve our understanding of theoretical models, and their implementations, in order to ensure that theoretical predictions and analyses keep up with experimental progress.
To meet this urgent experimental demand, we propose developing existing theory on two fronts.
On one front, we will extend existing models to include currently neglected processes (such as absorption and trident pair production) in a systematic way that can be immediately employed by simulators. On the second front, we will analyse a number of quantum effects which cannot be captured by existing numerical models (but which become relevant in e.g. the overlapping field geometries of future facilities, or in dense electron bunches), assess their importance to experimental campaigns, and develop a methodology to implement them numerically, going beyond current models.
Doing so requires a team of researchers who are not only experts in the theory of quantum effects in intense laser physics, but who also have the experience required to understand numerical implementation and experimental analyses. This is not a case of benchmarking existing codes, which is already well covered in the literature. What is needed, rather, is a "top down" approach which can verify, and improve upon, the models of quantum effects used in the codes.
Plymouth hosts an established, world-leading research group in the area of intense laser-matter interactions. Staff members are research-active and well-known in the community as experts in the theory of quantum effects in intense laser physics. Furthermore, the Investigators attached to this project are actively involved in experimental efforts, being for example part of the team which recently demonstrated radiation reaction in laser-matter collisions in an experiment at the UK's Central Laser Facility.
As such the Investigators have precisely the right skillset to undertake this timely project and deliver new results of import to a wide community of physicists. This will help maintain the UK's world-leading capabilities in the active research area of intense laser-matter interactions.
Lattice Field Theory (LFT) provides the
tools to study the fundamental forces of nature using numerical simulations.
The traditional realm of application of LFT has been Quantum Chromodynamics
(QCD), the theory describing the strong nuclear force within the Standard Model
(SM) of particle physics. These calculations now include electromagnetic
effects and achieve sub-percent accuracy. Other applications span a wide range
of topics, from theories beyond the Standard Model, to low-dimensional strongly
coupled fermionic models, to new cosmological paradigms. At the core of this
scientific endeavour lies the ability to perform sophisticated and demanding
numerical simulations. The Exascale era of High Performance Computing therefore
looks like a time of great opportunities.
The UK LFT community has been at the forefront of the field for more than three decades and has developed a broad portfolio of research areas, with synergetic connections to High-Performance Computing, leading to significant progress in algorithms and code performance.
Highlights of successes include: influencing the design of new hardware (Blue Gene systems); developing algorithms (Hybrid Monte Carlo) that are used widely by many other communities; maximising the benefits from new technologies (lattice QCD practitioners were amongst the first users of new platforms, including GPUs for scientific computing); applying LFT techniques to new problems in Artificial Intelligence.
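The Hybrid Monte Carlo algorithm highlighted above can be sketched in miniature. This is an illustrative toy (a one-dimensional Gaussian "action" rather than a lattice gauge theory, with invented parameters), showing the two ingredients that made HMC so widely adopted: leapfrog integration of Hamiltonian dynamics, and a Metropolis accept/reject step that restores exactness:

```python
import math
import random

def hmc_step(q, eps, n_leap, grad_u, u):
    """One Hybrid Monte Carlo update for a target density exp(-u(q))."""
    p = random.gauss(0.0, 1.0)            # refresh the conjugate momentum
    q_new, p_new = q, p
    # leapfrog integration of Hamiltonian dynamics
    p_new -= 0.5 * eps * grad_u(q_new)
    for _ in range(n_leap - 1):
        q_new += eps * p_new
        p_new -= eps * grad_u(q_new)
    q_new += eps * p_new
    p_new -= 0.5 * eps * grad_u(q_new)
    # Metropolis accept/reject corrects the integration error exactly
    h_old = u(q) + 0.5 * p * p
    h_new = u(q_new) + 0.5 * p_new * p_new
    if random.random() < math.exp(min(0.0, h_old - h_new)):
        return q_new
    return q

random.seed(1)
u = lambda q: 0.5 * q * q                 # toy action: standard Gaussian
grad_u = lambda q: q
q, samples = 0.0, []
for _ in range(5000):
    q = hmc_step(q, 0.3, 10, grad_u, u)
    samples.append(q)
mean = sum(samples) / len(samples)
```

In lattice field theory the scalar `q` becomes the full set of field variables on the lattice and `u` the lattice action, but the structure of the update is the same.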
The research programme in LFT, and its impact, can be expanded in a transformative way with the advent of pre-Exascale and Exascale systems, but only if key challenges are addressed. As the number of floating-point operations per second increases, communication between computing nodes lags behind, and this imbalance will severely affect future LFT simulations across the board.
These challenges are common to all LFT codebases, and more generally to other communities that are large users of HPC resources. The bottlenecks on new architectures need to be carefully identified, and software that minimises communication must be designed in order to make the best use of forthcoming large computers. As we enter an era of heterogeneous architectures, the design of new software must clearly isolate algorithmic progress from the details of the implementation on disparate hardware, so that our software can be deployed efficiently on forthcoming machines with limited effort.
The goal of the EXA-LAT project is to develop a common set of best practices, KPIs and figures of merit that can be used by the whole LFT community in the near future and will inform the design and procurement of future systems. Besides the participation of the LFT community, numerous vendors and computing centres have joined the project, together with scholars from 'neighbouring' disciplines. In this way we aim to create a national and international focal point that will foster the activity of scholars, industrial partners and Research Software Engineers (RSEs). This synergetic environment will host training events for academics, RSEs and students, which will contribute to the creation of a skilled workforce immersed in a network that comprises the leading vendors in the subject.
EXA-LAT will lay the foundations for a long-term effort by the LFT community to benefit fully from Exascale facilities and to transfer some of the skills that characterise our scientific work to a wider group of users across disciplines.
Research in particle physics and cosmology
connects the largest scales, those of the Universe as a whole, with the
smallest, namely those of fundamental particles. By trying to understand how
the Universe evolved after the Big Bang, we may gain insight into which
particles are yet to be discovered, e.g. at the Large Hadron Collider (LHC),
and vice versa.
Concerning the early Universe, it is commonly understood that it underwent a period of rapid expansion, called inflation. However, many open questions remain. For instance, what is the mechanism of cosmological inflation, and, can we link inflation to quantum gravity, a theory that still eludes us? Interestingly, the recent observations of gravitational waves may provide a guide here. Inflation predicts a gravitational-wave background with properties depending on the details of the inflationary model. Hence if this background is observed, it may help us to further uncover details of the inflationary epoch after the Big Bang. Gravitational waves may also shed light on other puzzles, namely those related to dark energy and dark matter. Again, possible alternative theories to Einstein's general theory of gravity, which are designed to solve the dark energy/matter puzzles, may leave their imprint in gravitational waves.
In contrast to this, the LHC probes the smallest length scales by colliding protons and nuclei at very high energies. In order to test the Standard Model (SM), our current highly successful theory of elementary particles, to the extreme, it is necessary to compute SM processes to high precision and make predictions for physics beyond the Standard Model (BSM). The former can be done using advanced techniques which go beyond the usual Feynman diagrams. For the latter, one may take the viewpoint that the SM is an effective field theory (EFT), valid only up to a certain energy scale. Understanding which novel BSM interactions can give rise to the SM at low energies, without conflicting with high-precision data from the LHC, is an outstanding challenge. Two main classes of candidate theories are so-called near-conformal gauge theories and Composite Higgs models, which both give rise to electroweak symmetry breaking and a light Higgs boson. They may even provide dark matter candidates.
These theories have a commonality with the theory of quarks and gluons, Quantum Chromodynamics (QCD), namely that they are strongly interacting. This implies that they cannot be solved easily analytically, but are amenable to numerical simulations on high-performance computing facilities. The study of QCD provides a link between the physics of the early
Universe and elementary particles. Namely, as the Universe cooled down after the Big Bang, it underwent a series of phase transitions. During one of those, quarks and gluons combined into hadrons, i.e. the particles we observe today. The QCD phase transition is currently being explored at the LHC, by colliding heavy ions, motivating quantitative predictions on how the QCD spectrum changes with temperature. In fact, even understanding the QCD spectrum in vacuum is still partly unsolved and may guide toward BSM physics.
Quantum field theories (QFTs) describe physical processes across a vast range of energy scales, from fundamental interactions, as mentioned above, to low-dimensional and condensed-matter systems. Many new phenomena and the detailed structure of QFTs are anticipated to lie beyond the confines of traditional perturbative methods or numerical simulations. Dualities provide links between hitherto unrelated theories, making tractable questions previously considered out of reach. With new dualities being discovered, the richness of QFT is larger than naively expected. Similarly, dynamics out of thermal equilibrium, the process of thermalisation, and the evolution of quantum information, relevant for black hole dynamics, benefit from new approaches, some of which are motivated by quantum information.
The potential of accessing a new physical regime within QED has already spurred interest in approaching the regime experimentally. But the conjectured breakdown of perturbation theory points to flaws in our understanding of theoretical methods at high intensity. Since we currently have no reliable theoretical tools, the results of our research will be significant for theorists working on both particle and laser physics, and for experimentalists and simulators working on laser-matter interactions.
Research in particle physics and cosmology
connects the largest scales, those of the universe as a whole, with the
smallest, namely those of fundamental particles and strings. By trying to
understand how the universe evolved after the Big Bang, we may gain insight
into which particles are yet to be discovered at e.g. the Large Hadron Collider
at CERN, and vice versa, a fascinating prospect!
It is commonly assumed that the early universe went through a period of rapid expansion, dubbed inflation. The mechanisms underlying inflation can be investigated in a number of ways. In the so-called bottom-up approach, one aims to find predictions that are independent of the details of models, depending only on symmetries and the nature of the source of inflation. It is then possible to extract universal features leading to observational predictions that point towards physics beyond our currently known Standard Models of Particle Physics and Cosmology. In the complementary top-down approach, one starts with a given theory, e.g. one that is motivated by string theory, and derives its consequences, which, again, might be testable by observations. These approaches can also be used to study the period of cosmic acceleration our universe is currently going through, i.e., dark energy.
String theory is a theory of gravity (and other forces) operating at very high-energy scales. Besides its possible role as a fundamental theory, it has many intricate aspects which require a level of understanding deeply rooted in symmetries and dualities (a transformation that leads to two 'dual' formulations which are superficially very different but yet equivalent). By studying those, one may not only understand string theory better, but also arrive at dual theories which are relevant for e.g. physics beyond the Standard Model (BSM) probed at the LHC, especially if the BSM model is strongly coupled.
In order to make predictions for the LHC, it is necessary to perform very precise calculations, in BSM models and in the Standard Model itself. Some of these calculations can be done by expanding in a small parameter. This does not mean that the computation is easy though, since many scattering processes may contribute. However, it might be that by re-organising these contributions a new, more efficient, formulation can be found.
When there is no small parameter, a theory has to be solved as it stands. Often this can be attempted numerically, by formulating it on a space-time lattice. Since this involves very many degrees of freedom, one typically has to employ the largest supercomputers in the world. The theory of the strong interaction, Quantum Chromodynamics (QCD), is one of those theories in which a small parameter is absent. Although it is formulated in terms of quarks (as matter particles) and gluons (as force carriers), these are not the particles that appear in the spectrum, which are instead protons, neutrons, pions etc. However, since QCD is so hard to solve, there may be other particles not yet detected and also not yet understood theoretically: examples are so-called glueballs and hybrid mesons. By studying QCD on the lattice, these ideas can be tested quantitatively.
A related question concerns what happens to all these particles when the temperature (as in the early universe) or the matter density (as in neutron stars) is increased. This too can be studied numerically, and a transition to a new phase of matter at high temperature, the quark-gluon plasma, has been observed. Since this phase is currently being explored at the LHC by colliding heavy ions, quantitative predictions of the spectrum and of transport properties, such as how viscous the plasma is, are needed here as well.
Some BSM models also lack a small parameter and hence are studied using similar lattice computing techniques. By scanning models with distinct features, again hints for the LHC may be found, e.g., with regard to unusual spectral features.
The so-called lattice approach is a very successful first-principles method that allows gauge theories to be solved. Calculations rely on large-scale simulations and are typically run on the largest supercomputers. Lattice simulations are a unique tool to explore non-perturbative phenomena in theories which are not well understood. In Nature, non-perturbative phenomena give rise to the mass of the ordinary proton, which comes mostly from the binding energy of its constituents: the quarks. Tremendous efforts are being made to design extensions of the Standard Model of particle physics using similar mechanisms that could, for instance, explain the mass and properties of the Higgs boson. The project aims at making the first prediction of decay rates of resonances relevant for phenomenological analyses beyond the Standard Model using lattice simulations, and will thus provide quantitative results relevant for experiments searching for new physics, such as those performed at the world's largest accelerator: the Large Hadron Collider (LHC).
The quality of life of patients with severe asthma differs from mild and moderate asthma, partly because of a greater burden of symptoms and risk of exacerbations, but also due to differences in treatment, which can have more pronounced side effects. Existing asthma-specific Health-Related Quality of Life (HRQoL) scales are not optimally designed for severe asthma patients.
This project is a collaborative effort between the University of Plymouth’s Faculty of Health: Medicine, Dentistry and Human Sciences, the School of Psychology, and University Hospitals Plymouth NHS Trust. The team consists of Professor Michael Hyland, Professor Rupert Jones, Joseph Lanario, Lucy Cartwright, Dr Yinghui Wei and Dr Matthew Masoli, who is the clinical lead for asthma at the Royal Devon and Exeter Hospital and has established a regional severe/difficult asthma service.
As the intensity frontier is pushed back in
current and next-generation high power laser facilities (currently under
construction), our understanding of how to convert light to higher frequencies
in a controlled and efficient way and how to convert that radiation into matter
and antimatter is increasing. The proposed research will contribute to this
effort by establishing how these processes are generated in high-intensity,
short laser pulses, allowing predictions from the standard model to finally be
verified, or a deviation to be found.
The process of electron-seeded pair-creation, which forms the subject of the proposal, is a central example of a high-intensity quantum phenomenon. Only a single experiment, E-144, which combined a 47 GeV electron beam and a 10^18 W/cm^2 laser pulse, performed two decades ago at the Stanford Linear Accelerator Center, has measured this effect, and only in the multiphoton regime. It reported observation of the sequential process of nonlinear Compton scattering producing high-energy photons and their subsequent decay, via the nonlinear Breit-Wheeler process, into electron-positron pairs. If this experiment could be performed again with the higher laser intensities available today, the process is predicted to be nonperturbative. Such processes are of great interest because they are poorly understood and typically occur in difficult parts of the standard model; for example, confinement in QCD is non-perturbative.
The aim of the proposed programme is to calculate electron-seeded pair-creation in a laser pulse. Although this process has been calculated in a constant and in a monochromatic field, there has been no full calculation in a pulsed field. Inclusion of the pulsed nature is essential for accurate experimental predictions in high power laser experiments. In addition to the sequential process measured at E-144, there is also predicted to be a simultaneous process in which the photon remains virtual (often referred to as the "trident process"). Such virtual processes are currently neglected by QED laser-plasma simulation codes, which are frequently used in the design and analysis of high-intensity experiments.
A main objective of the research is to ascertain to what extent the approximations used in simulation, such as the field being instantaneously constant during the formation of quantum processes, are faithful to the predictions of QED when the duration of the laser pulse is decreased. This will allow for accurate predictions for future experimental campaigns. A further, and related, objective is to establish under what conditions a separation into sequential and simultaneous processes is at all well-defined as the extent of the laser pulse is reduced and quantum interference plays an ever-larger role. Whilst the approximation of lowest-order dressed processes, such as photon decay and nonlinear Compton scattering, is well understood, how to approximate higher-order dressed processes such as electron-seeded pair-creation has yet to be properly investigated.
By working with a project partner who is the principal investigator of an EPSRC-sponsored QED laser-plasma simulation campaign, knowledge-transfer from the research in the form of analytical results and expertise to plasma simulation will be ensured. The final aim of the project is the benchmarking of next-generation numerical codes with analytical results. A main beneficiary will be the high-intensity plasma simulation community and we expect our analysis of approximation to this second-order process to be highly relevant to the simulation of other second-order processes such as double nonlinear Compton scattering in short laser pulses, which become more important as the laser intensity used in experiment increases. In general, the proposed research underpins high power laser science and laser-plasma physics, in line with the UK research portfolio.
The project is a close collaboration between STFC-RAL and two universities with significant experience in research into wave interactions with fixed and floating structures, working together to combine and apply their expertise to model the problem. The aim is to develop integrated parallel code, implemented on a massively multi-processor cluster and multi-core GPUs, providing fast numerical wave tank solutions of the detailed physics of violent hydrodynamic impact loading on rigid and elastic structures. The project is linked to, and part of, a carefully integrated programme of numerical modelling and physical experiments at large scale. Open-source numerical code will be developed to simulate laboratory experiments to be carried out in the new national wave and current facility at the UoP.
It is well known that climate change will lead to sea level rise and increased storm activity (either more severe individual storms or more storms overall, or both) in the offshore marine environment around the UK and north-western Europe. This has critical implications for the safety of personnel on existing offshore structures and for the safe operation of existing and new classes of LNG carrier vessels, whose structures are subject to large and at present unquantified instantaneous loadings due to violent sloshing of transported liquids in severe seas. Offshore oil and gas structures in UK waters are already up to 40 years old, and these ageing structures need to be re-assessed to ensure that they can withstand increased loadings in increasingly adverse seas as a result of climate change, and to confirm that their life can be extended over the next 25 years. The cost of upgrading existing structures and of ensuring the survivability and safe operation of new structures and vessels will depend critically on the reliability of hydrodynamic impact load predictions. These loadings cause severe damage to sea walls, to tanks providing containment to sloshing liquids (such as in LNG carriers), and to FPSOs and other offshore floating structures such as wave energy converters.
Whilst the hydrodynamics of the bulk of a fluid is relatively well understood, the violent motion and break-up of the water surface remain a major challenge to simulate with sufficient accuracy for engineering design. Although free surface elevations and average loadings are often predicted relatively well by analysis techniques, observed instantaneous peak pressures are not reliably predicted in such extreme conditions and are often not repeatable even in carefully controlled laboratory experiments. A number of fundamental open questions remain as to the detailed physics of hydrodynamic impact loading, even for fixed structures and the extremely high pressure impulses that may occur. In particular, uncertainty exists in the understanding of the influence of: the presence of air in the water (both entrapped pockets and entrained bubbles), where the acoustic properties of seawater change, leading to variability of wave impact pressures measured in experiments; flexibility of the structure, leading to hydroelastic response; and steepness and three-dimensionality of the incident wave.
This proposal seeks to improve the current capability to directly attack this fundamentally difficult and safety-critical problem by accelerating state of the art numerical simulations with the aim of providing detailed solutions not currently possible to designers of offshore, marine and coastal structures, both fixed and floating.
Alcohol use disorders (AUDs) and substance use disorders (SUDs) have significant avoidable global human and economic costs. In the UK, AUDs and SUDs cost the economy £21bn (£3.5bn in healthcare) and £15bn (£488m in healthcare), respectively. Pharmacological interventions inherently have complications, and alternative therapies for treatment and prevention are needed. There is growing interest in the possible role that physical activity (PA) may offer in reducing AUDs and SUDs with minimal or no adverse effects. Little is known about how PA can best be promoted to both prevent and reduce AUDs and SUDs, and NICE guidelines currently make minimal and vague reference to its role [1-7].

Aims: To systematically review the evidence to date in order to describe and quantify the effects of PA on AUDs and SUDs and understand how it is best delivered or promoted, in what setting, and among which populations, to encourage the prevention and reduction of, and abstinence from, AUDs and SUDs. In the final phase of the study our aim is to elicit the views of service leads and users on how the findings can be used to guide future funding and interventions for reducing progression, use and post-treatment relapse.

Plan of investigation: A wide selection of electronic databases will be searched using a list of key words to generate a list of published research (including grey literature and qualitative investigations), which will then be screened independently by two researchers according to a predefined checklist. Eligible studies will be rated for quality and risk of bias, and data on study details, participant characteristics, AUD/SUD-related factors, intervention details and setting, control conditions, and outcomes will be extracted by one researcher and checked by another. Data across similar studies will be synthesised in a meta-analysis.
Moreover, we will provide a detailed narrative synthesis using tables, diagrams and narrative text across the studies, interventions, outcomes, populations and settings.

Potential benefits to people and the NHS: The proposed review will present the evidence to date on the role of PA in the prevention, harm reduction and treatment of AUDs and SUDs. After this literature is summarised, service leads and users will have the chance to reflect on and add further guidance on how PA interventions can be designed to have the greatest reach and effectiveness. They will also add to recommendations on how support for PA can be offered within the NHS, other public health services, and appropriate third-sector and charity organisations, with implications for funding. The review will examine the many aspects of PA, including how it is delivered, to whom, by whom, and in what setting, to identify the potentially most effective services. It will provide information for those involved in the treatment and prevention of AUDs and SUDs as to the most effective and cost-effective applications of PA. It will also highlight potential research gaps to allow for future research planning within the NHS. The review will also be presented to the appropriate NICE guidelines review panel with ideas on how the findings could be incorporated into future revisions of guidance on effective interventions. Should there be a need for further research on the effectiveness and cost-effectiveness of PA interventions for specific groups, we will present the findings to the appropriate NIHR prioritisation panel and other funders (e.g., National Lottery awards).
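The meta-analysis step described above typically pools study-level effect estimates by inverse-variance weighting. As a minimal sketch (a fixed-effect model with invented effect sizes, not the review's actual data or chosen model):

```python
import math

def fixed_effect_pool(estimates, std_errors):
    """Inverse-variance (fixed-effect) meta-analysis pooling.

    estimates  : per-study effect sizes
    std_errors : their standard errors
    Returns the pooled effect estimate and its standard error.
    """
    weights = [1.0 / se ** 2 for se in std_errors]       # precision weights
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# three hypothetical study effects (e.g. standardised mean differences)
effects = [0.30, 0.10, 0.25]
ses = [0.10, 0.20, 0.15]
est, se = fixed_effect_pool(effects, ses)
```

A random-effects model, which adds a between-study variance component, is usually preferred when interventions and populations differ as much as they do here.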
The proposal is to establish a new Collaborative Computational Project (CCP) serving the UK research community in the area of wave structure interactions (WSI). The new CCP-WSI will bring together computational scientists, Computational Fluid Dynamics (CFD) specialists and experimentalists to develop a UK national numerical wave tank (NWT) facility fully complementary to existing and future UK experimental laboratory facilities for marine, coastal and offshore engineering, and thus support leading-edge research in an area of high national importance. Substantial progress has been made on a number of past and current EPSRC project grants held by the lead partners in this CCP bid to develop and test the primary elements of a numerical wave tank and to carry out cutting-edge wave impact experiments alongside new open-source CFD code development. We believe it is timely to focus the activities of the community on the development of open-source NWT code held within a central code repository (CCPForge). The code will be professionally software-engineered and maintainable, tested and validated against measurement data provided by the partner experimentalists, while retaining sufficient flexibility to meet the requirements of all members of the WSI community. This model for sharing developments collaboratively within a consortium of partners through a central code repository that is sustainably managed for the future has been developed by the lead partners in related EPSRC-funded research projects. The proposed CCP-WSI would extend the framework and methodology for sharing and future-proofing EPSRC-funded code developments in wave structure interaction to the wider community. This is proposed through a programme of community events and activities designed to foster links between experimentalists and those performing computations, and between industry users, academics and the interested public.
The standard model of particle physics
encodes our current knowledge of the fundamental constituents of atoms and the
nature of matter in the earliest moments following the Big Bang. However, our
understanding of the dynamics of the standard model is limited by our ability
to solve its strongly-interacting sector, quantum chromodynamics (QCD), which
describes the interactions of quarks and gluons. The Swansea and Plymouth
groups are approaching this problem from two complementary perspectives. By
approximating the continuum of spacetime as a discrete lattice of points, it is
possible to simulate QCD on high performance computers. The groups will study
lattice QCD in the extreme conditions of high temperature and density which
existed following the Big Bang and which can now be realised in heavy-ion
collisions at the Large Hadron Collider (LHC) at CERN. These investigations
will be complemented by analytic insights arising from `gauge-gravity duality',
a remarkable principle which relates the theories describing particle physics
with properties of general relativity.
The primary goal of the LHC is, however, to discover the new physics which is responsible for the generation of mass for the elementary particles. This `electroweak symmetry breaking' is the least understood part of the standard model. It may be due to the existence of a background field permeating spacetime, which gives mass to particles as they interact with it. On the other hand, mass generation may be due to the existence of a new strong interaction at the TeV energy scale
probed by the LHC. In both cases, the theories predict the existence of a new spin zero particle, the famous Higgs boson recently discovered at the LHC. Distinguishing these possibilities is a subtle problem and once again we are attempting to resolve the question using both gauge-gravity duality and lattice simulations.
Particle physicists do not, however, believe that the standard model is the ultimate theory of nature. It is an example of a gauge theory, a theoretical framework which unifies quantum mechanics and special relativity together with the fundamental symmetries which physicists have discovered through decades of experiments with particle accelerators. Meanwhile, gravity remains outside this framework, being described by general relativity in terms of the curvature of spacetime. A deeper unification appears possible with superstrings, which contain both gauge theories and gravity together with a new type of spacetime symmetry known as supersymmetry. The Swansea group is therefore complementing its investigations of LHC physics with research into the deeper structure of gauge fields and strings, using fundamental ideas such as gauge-gravity duality and `quantum integrability' in the search for the underlying principles behind our current theories of particle physics.