Previous Projects

Understanding the potential for month-ahead prediction for monsoon systems 

Supervisor: Professor Andy Turner 

Active and break events in monsoon rainfall, lasting a week or so, represent the extremes of intraseasonal variability, with profound implications for water supply and agriculture. The boreal summer intraseasonal oscillation (BSISO), which varies over 30-60 days, provides the known source of predictability at the subseasonal-to-seasonal (S2S) range, but its prediction skill in models is limited. This project will seek to answer the question: can monsoon intraseasonal prediction be extended to a one-month lead time?

The approach will be first to reduce the dimensionality of reanalysis data, e.g. using empirical orthogonal functions (EOFs), and then to use novel methods such as decision trees and causal networks to identify drivers of subseasonal monsoon variability, seeking previously unknown sources of predictability in the tropics and extratropics. The project will also assess the stratosphere, which has known interactions with the El Niño-Southern Oscillation (ENSO) and the Madden-Julian Oscillation (MJO) but whose influence on the monsoon is poorly explored. We will use conditional approaches to understand whether the links between subseasonal drivers and monsoon rains, and thus subseasonal prediction skill, depend upon the state of slowly varying seasonal drivers such as ENSO, the Indian Ocean Dipole (IOD) or the quasi-biennial oscillation (QBO).

Modelling work at climate and convection-permitting scales will be used to understand to what extent subseasonal drivers can be represented, whether the kilometre scale is necessary to simulate such behaviour, and to perform case-study experiments that define the mechanisms involved. We will leverage the new NERC MiLCMOP project (starting 2023), in which we test the role of the extratropics in monsoon onset using modelling and causal analysis techniques.
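The dimensionality-reduction step can be sketched in a few lines. The following is a minimal illustration only, using synthetic random data in place of a reanalysis field (e.g. daily anomalies on a flattened lat-lon grid); all shapes and names are hypothetical, and a real analysis would also apply latitude weighting and detrending.

```python
import numpy as np

# Illustrative EOF decomposition: synthetic data stands in for reanalysis.
rng = np.random.default_rng(0)
n_time, n_grid = 360, 500                # days x flattened grid points
field = rng.standard_normal((n_time, n_grid))

# Remove the time mean at each grid point to form anomalies.
anom = field - field.mean(axis=0)

# EOFs are the right singular vectors of the anomaly matrix;
# principal components (PCs) are the projections onto them.
u, s, vt = np.linalg.svd(anom, full_matrices=False)
n_modes = 10                             # retain only the leading modes
eofs = vt[:n_modes]                      # (n_modes, n_grid) spatial patterns
pcs = anom @ eofs.T                      # (n_time, n_modes) PC time series

explained = s[:n_modes] ** 2 / np.sum(s ** 2)
print(pcs.shape, float(explained.sum()))
```

The retained PC time series, rather than the full gridded field, would then be passed to the decision-tree and causal-network analyses described above.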


Breaking the barriers of predictability in hydrological forecasting

Supervisor: Professor Hannah Cloke OBE 

Global forecasts of upcoming river flows and water resources can now be made from one week to a few months ahead, but the skill of these forecasts varies widely in space and time. Anticipatory humanitarian action has the potential to provide targeted, timely support before a disaster strikes. Humanitarian organisations operating in parts of the world with extremely vulnerable communities, such as those displaced by conflict in South Sudan, have reached out for improved information to support anticipatory action to prepare for floods and droughts. But it is in just such places that hydrological forecasting models have poor skill.

The aim of this PhD is to investigate the current barriers to improving hydrological prediction at the Earth-system scale, and to consider how these interact to limit the lead time of skilful subseasonal hydrological forecasts. The project will explore how ongoing developments in predicting hydrological flows in Earth System models could provide more accurate and earlier forecasts of upcoming floods and droughts. These developments include better representation of soil and snowmelt processes, groundwater dynamics, dams, reservoirs and upstream water management in the river channel network; improvements in the representation and post-processing of precipitation; and new observations such as data from the SWOT satellite.

The PhD researcher will be attached to the INFLOW project (Improved Anticipation of Floods on the White Nile), funded by the Climate Adaptation and Resilience Programme, working alongside an international research team including mandated government agencies, regional forecasting bodies, academics and humanitarian partners.


Discovering the Mechanisms Behind “Forecast Busts”

Supervisor: Professor Robert Plant 

Numerical weather prediction (NWP) occasionally suffers “forecast busts”, in which forecast skill at five- to six-day lead time drops to almost zero across the world’s leading NWP centres. Such failures can be linked via Rossby wave dynamics to an initially poor representation of mesoscale convective systems (MCSs) upstream, which is in turn related to systematic difficulties in simulating moist convection. Our main research question is: why are errors in the representation of an MCS normally benign for predictability but occasionally catastrophic? Answering this will provide insight into how multi-scale error growth in NWP systems depends on flow regimes.


The project will involve making detailed investigations of simulated cases, comparing situations that do and do not lead to busts. A key element will be understanding the development of errors in the forecast model: their propagation downstream and their upscale growth from scales of order 1 km to the synoptic scale. This will involve analyses and interpretations in the framework of potential vorticity, using process-oriented diagnostics to unravel the contributions of different physical mechanisms in the model. We will include simulations with different representations of convection, which are expected to lead to significant changes at the MCS stage that may then alter the coupling to larger-scale weather patterns. Will new treatments and/or higher resolution solve the forecast-bust problem, and why, or might they even make it worse? Depending on the student’s interests, there is also scope to include a machine learning element (specifically, convolutional neural networks) to detect forecast busts from the initial conditions.
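To make the convolutional-network idea concrete, the sketch below builds the smallest possible "CNN" (one convolutional layer, ReLU, global pooling, and a logistic output) applied to a random 2-D field standing in for an initial-condition map. Everything here is hypothetical: the field, the untrained random weights, and the names; a real study would train on labelled bust and non-bust cases.

```python
import numpy as np

def conv2d(field, kernel):
    """Valid 2-D cross-correlation implemented directly in NumPy."""
    kh, kw = kernel.shape
    h, w = field.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(field[i:i + kh, j:j + kw] * kernel)
    return out

def tiny_cnn(field, kernel, w_out, b_out):
    """Conv -> ReLU -> global average pool -> sigmoid 'bust probability'."""
    feature = np.maximum(conv2d(field, kernel), 0.0)   # ReLU activation
    pooled = feature.mean()                            # global average pooling
    return 1.0 / (1.0 + np.exp(-(w_out * pooled + b_out)))

rng = np.random.default_rng(1)
init_field = rng.standard_normal((32, 32))  # e.g. a 500 hPa height anomaly map
kernel = rng.standard_normal((3, 3))        # untrained convolutional filter
p_bust = tiny_cnn(init_field, kernel, w_out=1.0, b_out=0.0)
print(round(float(p_bust), 3))
```

In practice a deep-learning framework would replace this hand-rolled layer, but the forward pass above is the operation such a detector repeats and stacks.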


Developing Artificial Intelligence Approaches to Enhance Representations of Turbulence in Atmospheric Models

Supervisor: Dr Todd Jones 

Turbulence is one of the many processes represented in simulations of the Earth system. A better understanding of the effects of turbulence on dynamics and clouds in weather and climate models would allow its parameterisation in these models to be improved. Explainable Artificial Intelligence (XAI) is concerned with allowing human users to better understand the roles that various variables play in complex models, while Machine Learning (ML) techniques can tune model parameters for better performance in downstream tasks. This project will investigate and contrast three approaches to the parameterisation of turbulence: traditional parameterisations, traditional parameterisations tuned using ML, and emulations of turbulence developed using XAI techniques. The goal of the project is to investigate whether AI/ML techniques can improve the accuracy and computational efficiency of these parameterisations, either by improving scientific outcomes at the same computing cost or by making simulations cheaper. The main outcome will be a determination of whether AI/ML techniques can improve the fidelity of Met Office NERC Cloud model (MONC) simulations and make them more computationally efficient. The “right way” to use AI/ML in any given turbulence situation is not yet well established, but improving turbulence in MONC could lead directly to improvements in future weather and climate models. The project offers a PhD candidate an opportunity to learn, work with, and understand large modern scientific codes running on supercomputers, and to develop relevant research and technical skills in software engineering, data security and ethics alongside University and Met Office researchers.
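The "traditional parameterisation tuned using ML" approach can be illustrated with a toy closure. The sketch below assumes a hypothetical eddy-diffusivity law K = c·l·w with one free parameter c, and recovers c by minimising mean-squared error against synthetic "high-resolution truth" data; a real study would tune MONC parameters against large-eddy simulations or observations, using far more sophisticated optimisers than the dense grid search shown here.

```python
import numpy as np

# Toy tuning problem: all data and the closure K = c * l * w are invented.
rng = np.random.default_rng(2)
l = rng.uniform(10.0, 100.0, size=200)   # mixing length scale (m)
w = rng.uniform(0.1, 2.0, size=200)      # turbulent velocity scale (m/s)
c_true = 0.35                            # "true" coefficient to be recovered
truth = c_true * l * w + 0.5 * rng.standard_normal(200)  # noisy truth K

def loss(c):
    """Mean-squared error of the parameterised K against the truth data."""
    return np.mean((c * l * w - truth) ** 2)

# Simplest possible tuner: dense grid search over the free parameter.
candidates = np.linspace(0.0, 1.0, 1001)
c_best = candidates[np.argmin([loss(c) for c in candidates])]
print(round(float(c_best), 3))
```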


Improving river streamflow forecasts using deep learning techniques

Supervisor: Dr Kieran Hunt

Artificial neural networks – particularly those designed to process sequential data, known as long short-term memory networks (LSTMs) – can improve forecasts in regions where data sparsity or incomplete process knowledge challenges conventional dynamical models. This project will explore how LSTMs can fill such gaps, mitigate biases in input data, and produce accurate river streamflow forecasts. To achieve this, the student will substantially extend a prototype LSTM designed to ingest weather forecasts and produce operational streamflow forecasts over the US.

Research questions include:  

  • What is the optimal setup for the LSTM? How are gridded variables best processed – by statistical preprocessing, or by additional components in the LSTM (e.g. convolutional layers)?  
  • How can km-scale hydrological models and science be improved to develop a hydrological Digital Twin?  
  • How well does the LSTM perform against benchmark dynamical models in both data-rich and data-poor regions? How can performance in data-poor regions be improved?  
  • Can the LSTM effectively forecast extreme flooding events that are beyond the distribution of the training data?  
  • What is the effect of climate change on streamflow and flooding risk in vulnerable catchments?   
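Several of these questions hinge on the LSTM recurrence itself, which can be written out in a few lines. The sketch below runs a single untrained LSTM cell over a synthetic daily sequence of catchment inputs and reads off a streamflow estimate; the shapes, variable names and random weights are illustrative only and do not reflect the prototype's actual configuration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_forward(inputs, W, U, b, n_hidden):
    """Run a single-layer LSTM over a (time, features) sequence."""
    h = np.zeros(n_hidden)                    # short-term (hidden) state
    c = np.zeros(n_hidden)                    # long-term (cell) state
    for x in inputs:
        z = W @ x + U @ h + b                 # all four gates in one product
        i, f, o, g = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)            # forget old memory, add new
        h = o * np.tanh(c)                    # expose gated memory as output
    return h

rng = np.random.default_rng(3)
n_features, n_hidden, n_days = 2, 8, 30
inputs = rng.standard_normal((n_days, n_features))  # e.g. precip, temperature
W = rng.standard_normal((4 * n_hidden, n_features)) * 0.1
U = rng.standard_normal((4 * n_hidden, n_hidden)) * 0.1
b = np.zeros(4 * n_hidden)

h_final = lstm_forward(inputs, W, U, b, n_hidden)
w_head = rng.standard_normal(n_hidden) * 0.1        # linear readout head
streamflow = float(w_head @ h_final)                # untrained estimate
print(h_final.shape, round(streamflow, 3))
```

Questions such as how best to ingest gridded variables amount to choosing what feeds the `inputs` array: statistically preprocessed catchment averages, or features extracted by additional (e.g. convolutional) layers.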

A distinctive opportunity is for the student to gain hands-on experience of working with operational forecast data in real time. The student will have visiting scientist status at ECMWF, enabling them to work closely with the forecast department, and will undertake a full-time placement in order to deploy the fully tested model operationally.


Improving the Efficiency of Weather and Climate Prediction by Developing Mathematical Methods to Take Long Time Steps

Supervisor: Professor Hilary Weller 

State-of-the-art weather and climate prediction models are efficient, and even accurate, when large time steps are taken, thanks to the use of semi-Lagrangian transport schemes. However, semi-Lagrangian transport is not conservative: the total amount of the transported quantities can change purely due to numerical errors. Next-generation models, such as ECMWF’s finite volume model (FVM), are designed for higher resolution and for massively parallel computer architectures, and they avoid semi-Lagrangian transport so as to avoid conservation errors. However, this means that they face time-step restrictions that can be severe, for example in the presence of the strong updrafts associated with severe weather. The supervisors have started to develop implicit time-stepping schemes that are stable and accurate for much longer time steps while retaining exact conservation. These schemes need further development and testing before they can be used operationally. We need to answer questions such as:

  1. Can larger time steps using implicit transport lead to reduced computational cost in comparison to (traditional) explicit transport with a smaller time step? 
  2. Would it be beneficial to use implicit transport only in the vertical direction, where it is most needed? 
  3. If implicit transport is used only locally, where it is most needed, how does this affect load balancing between multiple computing processors? 
  4. With the use of implicit time stepping, can we increase the vertical resolution without reducing the time step and still improve accuracy?
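The trade-off behind question 1 can be demonstrated on a one-dimensional periodic toy problem (not a weather-model scheme): explicit upwind advection blows up when the time step exceeds the CFL limit, while a backward Euler (implicit) upwind step at the same long time step remains stable and conserves the total tracer exactly, at the cost of a linear solve per step.

```python
import numpy as np

n = 100
u0 = np.exp(-0.5 * ((np.arange(n) - n / 2) / 5.0) ** 2)  # Gaussian tracer blob
cfl = 2.0                  # deliberately beyond the explicit stability limit of 1

S = np.roll(np.eye(n), -1, axis=1)   # periodic shift: (S u)_j = u_{j-1}
D = np.eye(n) - S                    # first-order upwind difference u_j - u_{j-1}

def step_explicit(u):
    """Forward Euler upwind step; unstable for cfl > 1."""
    return u - cfl * (D @ u)

A = np.eye(n) + cfl * D              # backward Euler (implicit) system matrix

def step_implicit(u):
    """Backward Euler upwind step; stable for any cfl, needs a linear solve."""
    return np.linalg.solve(A, u)

u_exp, u_imp = u0.copy(), u0.copy()
for _ in range(50):
    u_exp = step_explicit(u_exp)
    u_imp = step_implicit(u_imp)

print(float(np.max(np.abs(u_exp))))          # explicit solution has blown up
print(bool(np.isclose(u_imp.sum(), u0.sum())))  # implicit total is conserved
```

Whether the extra cost of the solve is repaid by the longer step (question 1), and whether the implicit treatment is best confined to the vertical or to local regions (questions 2 and 3), are exactly what the project would quantify in a real dynamical core.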