The Ensemble Club

by Javier Amezcua

Operational meteorological centres have a great responsibility: they produce and revise forecasts which are periodically released to the public. These forecasts (and their accuracy) matter to everyone, from people planning their daily activities to governments allocating resources to regions expecting extreme weather. Modern forecasting requires running a computer model based on the numerical solution of a set of physical equations. The forecast has to be initialised somehow, e.g. by combining a previous forecast with observations, and is then run for a given lead time. This is done routinely. The process of obtaining the best initial conditions and revising them periodically is part of what is known as data assimilation.

In the olden days, operational centres ran a single-trajectory forecast, i.e. they initialised the forecast with a single set of initial conditions. But the initial conditions are never 100% accurate; they always carry some uncertainty. How does this initial uncertainty translate into forecast uncertainty? Is this even important? Actually, it is very important, especially in chaotic dynamical systems (a class to which the atmosphere belongs), where tiny differences in initial conditions can lead to completely different situations after a given time. As for how to quantify the evolution of uncertainty, people realised that a simple way is to run several forecasts starting from different initial conditions, chosen to represent the uncertainty in the estimate of the atmosphere at the forecast start time. There are difficulties in this process: on one hand we have the methodological challenges of how to create the initial perturbations and how to update them with adequate assimilation methods, and on the other the computational cost of running an expensive model more than once.
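To make the picture concrete, here is a minimal Python sketch (an illustration only, nothing like an operational system) of an ensemble forecast: the Lorenz-63 equations stand in for the atmosphere as a simple chaotic model, 20 slightly perturbed copies of the initial state are integrated forward, and the ensemble spread is printed at increasing lead times. The ensemble size, perturbation amplitude and integration settings are illustrative assumptions.

import numpy as np

def lorenz63(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # Classic chaotic toy model standing in for the atmosphere.
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def forecast(state, steps, dt=0.01):
    # Simple fourth-order Runge-Kutta time stepping.
    for _ in range(steps):
        k1 = lorenz63(state)
        k2 = lorenz63(state + 0.5 * dt * k1)
        k3 = lorenz63(state + 0.5 * dt * k2)
        k4 = lorenz63(state + dt * k3)
        state = state + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return state

rng = np.random.default_rng(0)
best_guess = np.array([1.0, 1.0, 1.0])                       # "analysis" state
ensemble = best_guess + 0.01 * rng.standard_normal((20, 3))  # 20 perturbed members

for lead in (500, 1000, 1500):  # increasing lead times, in model steps
    states = np.array([forecast(member, lead) for member in ensemble])
    print(f"lead {lead} steps: ensemble spread = {states.std(axis=0)}")

The spread stays small at short lead times and grows rapidly afterwards, which is exactly the behaviour illustrated by the Sandy ensemble below.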

An example of an ensemble forecast is shown in Figure 1. Back in the autumn of 2012, tropical cyclone Sandy decided to venture into the extra-tropics and make landfall close to New York City, causing havoc and disruption. The figure shows the best-estimate location for the centre of the storm at 00 UTC on 28 October 2012. From a family of 20 slightly different initial conditions, different forecast trajectories are generated. Notice how the 24-hour forecasts do not differ much, but the difference grows considerably in the 48-hour forecasts, and by 72 hours the locations predicted by the different forecasts are quite different.

Figure 1: Ensemble forecasting for the path of tropical cyclone Sandy. Produced by US NCEP with the GFS model.

In February 2020 I was invited to participate in a conference in India organised by the National Centre for Medium Range Weather Forecasting (NCMRWF) in Delhi. The International Conference on Ensemble Methods in Modelling and Data Assimilation (EMMDA) was organised to celebrate the unveiling of a new ensemble prediction system with global coverage and a very decent resolution (about 12 km). This is the result of a massive undertaking that took years of preparation. Furthermore, it is the fruit of collaboration between the NCMRWF and the UK Met Office, since the new system is inspired by and largely modelled on the Met Office Global and Regional Ensemble Prediction System (MOGREPS). With their new ensemble system, India joins a select group of operational centres with forecast/assimilation systems, which includes those of the USA, France, UK, Germany, Canada, Japan, Australia, China, Korea, and Brazil. The European Centre for Medium-Range Weather Forecasts (ECMWF) produced its first operational ensemble forecasts in 1992.

 

Figure 2: Logo for the International Conference on Ensemble Methods in Modelling and Data Assimilation depicting a schematic representation of 4 ensemble members in blue and the ensemble mean in red.

At the meeting, experts from around the globe spoke about the latest advances in ensemble DA methods, including the ensemble Kalman filter and smoother, particle filters, and ensemble-variational methods. A recurring theme was the challenge of handling high-resolution models, which allow us to represent more complex features of the atmosphere. Some interesting applications were also discussed, spanning atmospheric, ocean and Earth-system areas. In particular, I spoke about using the ensemble Kalman filter to estimate middle- and upper-level atmospheric winds from observed infrasound waves generated by ammunition explosions. I have discussed this work in a previous blog post which can be found here. I would like to acknowledge DARE for funding my participation in this event. Kudos to India, and welcome to the club!

Machine Learning for Landmark Detection in Biomedical Applications

I am Rémy Vandaele, a new post-doc who started working on the DARE project in January 2020. I hold a PhD in Computer Science from the University of Liège in Belgium, and my research area is machine learning for computer vision. In this blog post, I will summarize the work I did during my PhD thesis, entitled “Machine Learning for Landmark Detection in Biomedical Applications”.

Figure 1: 25 landmarks annotated on the image of a zebrafish larva

The starting point of my thesis was the development and comparison of automated landmark detection methods. In this work, a landmark corresponds to a specific anatomical location on the image of a body (cf. Fig. 1). It is defined by a type (tip of the nose, left corner of the left eye, …) and has coordinates. Detecting a landmark on a new image means finding its coordinates, given its type. In my work, I focused on landmark detection for two biomedical applications:

– Morphometry: biologists need to annotate landmarks on high-resolution microscopy images of different types of bodies to understand the impact of their experiments (e.g. determining whether injection of a drug has affected the size of a fish). Manual annotation is a long, tedious and error-prone process that limits the size of the experiments. Automation is sorely needed in this area.

– CT-CBCT registration: in radiotherapy, a treatment plan (beam positioning and dosage) for all the sessions to come is computed on an initial, high-resolution CT scan. At each session, the patient is placed on the table and needs to be positioned in accordance with the initial CT scan. This is performed using a CBCT scan that is then registered to the CT scan using the deformation computed between corresponding, manually annotated landmarks (see Fig. 2). Landmark positioning on 3D CT scans is a slow process that could be sped up using automation. (A minimal sketch of the rigid registration case is given below.)
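For the rigid registration case addressed in [7], one standard way to compute the transform from pairs of corresponding landmarks is the least-squares (Kabsch/Procrustes) solution sketched below. This is a generic illustration with made-up coordinates, not the thesis code, and the thesis may use a different estimator.

import numpy as np

def rigid_from_landmarks(src, dst):
    # Rotation R and translation t minimising ||R @ src_i + t - dst_i|| over all landmark pairs.
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)            # 3x3 cross-covariance of centred points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Hypothetical corresponding landmarks in the CBCT (src) and planning CT (dst).
src = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 10.0, 0.0], [0.0, 0.0, 10.0]])
theta = np.deg2rad(30.0)
true_R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
dst = src @ true_R.T + np.array([5.0, -2.0, 1.0])
R, t = rigid_from_landmarks(src, dst)
print(np.allclose(src @ R.T + t, dst))             # True: the transform is recovered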

Figure 2: How to register images using landmark detection

My work on this topic can be divided into three parts:

  1. I developed a 2D method based on binary image patch classification using a Random Forest [1] algorithm. During this work [2], [3], [4], I was able to show (1) the relevance of using multi-resolution patch descriptors to increase the accuracy of the landmark detection algorithm, and (2) that the search zone of the landmark could be reduced, greatly increasing the speed of detection, without losing accuracy (a minimal sketch of the basic patch-classification idea follows this list). Those algorithms were implemented in Cytomine, an open-source client-server application for the analysis and sharing of large (bio)images [5], [6].
  2. I extended this method to 3D in order to show that it could be used for 3D CT to CBCT image registration [7].
  3. I studied the impact of state-of-the-art post-processing methods [8] (methods taking into account the relative positions of the detected landmarks to refine their locations) in the context of biomedical applications, where the number of images and landmarks is significantly lower than in the typical face-analysis applications where landmark detection is most widely used. In this context, I showed that those methods brought little to no increase in detection accuracy, and I proposed a new post-processing method suited to landmark detection in biomedical applications.
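As a rough illustration of part 1, the sketch below casts landmark detection as binary patch classification with extremely randomized trees [1] (ExtraTreesClassifier in scikit-learn). The patch size, the negative-sampling scheme and the synthetic images are illustrative assumptions; the multi-resolution descriptors and the reduced search zone used in the thesis are omitted for brevity.

import numpy as np
from sklearn.ensemble import ExtraTreesClassifier

HALF = 8  # half-width of a 17x17 pixel patch

def extract_patch(image, row, col):
    # Flattened pixel patch centred on (row, col); this is the feature vector.
    return image[row - HALF:row + HALF + 1, col - HALF:col + HALF + 1].ravel()

def training_set(images, landmarks, rng, negatives_per_image=50):
    # Positive patches at the annotated landmark, negative patches elsewhere.
    X, y = [], []
    for img, (r, c) in zip(images, landmarks):
        X.append(extract_patch(img, r, c)); y.append(1)
        for _ in range(negatives_per_image):
            rr = rng.integers(HALF, img.shape[0] - HALF)
            cc = rng.integers(HALF, img.shape[1] - HALF)
            if abs(rr - r) > HALF or abs(cc - c) > HALF:
                X.append(extract_patch(img, rr, cc)); y.append(0)
    return np.array(X), np.array(y)

def detect(model, image, step=2):
    # Score candidate pixels on a grid and return the most landmark-like one.
    coords, patches = [], []
    for r in range(HALF, image.shape[0] - HALF, step):
        for c in range(HALF, image.shape[1] - HALF, step):
            coords.append((r, c))
            patches.append(extract_patch(image, r, c))
    probs = model.predict_proba(np.array(patches))[:, 1]
    return coords[int(np.argmax(probs))]

# Purely synthetic usage example:
rng = np.random.default_rng(0)
images = [rng.random((64, 64)) for _ in range(10)]
landmarks = [(32, 32)] * 10
X, y = training_set(images, landmarks, rng)
model = ExtraTreesClassifier(n_estimators=100, random_state=0).fit(X, y)
print(detect(model, images[0]))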

Future perspectives regarding this work will consider deep learning classification and regression models in combination with data augmentation approaches. This research is currently pursued at the University of Liège.

References

[1] Geurts, Pierre, et al.  “Extremely randomized trees.” Machine learning 63.1 (2006): 3-42.

[2] Vandaele, Rémy, et al. “Automatic cephalometric x-ray landmark detection challenge 2014: A tree-based algorithm.” ISBI (2014).

[3] Wang, Ching-Wei, et al. “Evaluation and comparison of anatomical landmark detection methods for cephalometric x-ray images: a grand challenge.” IEEE transactions on medical imaging 34.9 (2015): 1890-1900.

[4] Vandaele, Rémy, et al. “Landmark detection in 2D bioimages for geometric morphometrics: a multi-resolution tree-based approach.” Scientific reports 8.1 (2018): 1-13.

[5] Marée, Raphaël, et al. “Collaborative analysis of multi-gigapixel imaging data using Cytomine.” Bioinformatics 32.9 (2016): 1395-1401.

[6] Rubens, Ulysse, et al. “BIAFLOWS: A collaborative framework to benchmark bioimage analysis workflows.” bioRxiv (2019): 707489.

[7] Vandaele, Rémy, et al. “Automated landmark detection for rigid registration between the simulation CT and the treatment CBCT.” Acta Stereologica (2015).

[8] Vandaele, Rémy. “Machine Learning for Landmark Detection in Biomedical Applications.” PhD thesis, Université de Liège, Liège, Belgium, 2018.

A new chapter begins!

by Guannan Hu

I am glad to have joined the DARE project team and to start working at the Department of Meteorology at the University of Reading. The working environment is fantastic. People are very friendly. They helped me settle in well and always make me feel welcome.

I completed my PhD at the University of Hamburg last year. My research areas were data assimilation and extreme value theory. I tried to find general ways to improve the performance of data assimilation, and I investigated whether data assimilation can give accurate predictions of extreme events. I wanted to figure out which factors mainly prevent us from achieving accurate predictions: is it the observations, the model forecast, or the data assimilation algorithm?

From my BSc to my PhD, I have always been engaged in meteorology. I could not have imagined this at the very beginning, as I thought that people studying meteorology would later become television weather presenters. My journey continues, and I am ready for the new chapter!

Image taken at the meteorology institute of the University of Hamburg.

Data-assimilation of crowdsourced weather stations for urban heat and water studies in Birmingham as a testbed

DARE pilot project with participants: Professor Lee Chapman, University of Birmingham; Sytse Koopmans and Gert-Jan Steeneveld, Wageningen University & Research (WUR)

The research will be verified against ground-truth observations from the Birmingham Urban Climate Lab (BUCL) observational network.

The aims of this work task are:

  • Setting up a basic WRF model run with meteorological boundary conditions from ECMWF
  • Remapping land use from the CORINE land cover inventory (100 m) to the USGS classification used by WRF
  • Calculating land-use fractions from Landsat 8 satellite imagery
  • Deriving urban morphology indicators with the NUDAPT tool

For the project we completed the model infrastructure and conducted a basic run without data assimilation. The main effort over the last year was to create a geographical dataset that supports the 200-metre resolution. Before applying data assimilation it is important to describe the land use and urban characteristics as well as possible.

  • The default available land-use datasets were too coarse for a 200 m run. We have remapped the 100 m CORINE land-use dataset to a format which can be read by WRF (a minimal sketch of this remapping step is given after this list).
  • The urban morphology has been improved by applying the National Urban Database and Access Portal Tool (NUDAPT) (Ching et al., 2009*). With this step we improved the representation of urban morphology in the WRF model compared with previous studies.
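The remapping step above is essentially a lookup from CORINE class codes to the USGS codes that WRF understands, applied cell by cell to the 100 m raster. The sketch below is illustrative only: the few mapping entries shown are assumptions, not the full correspondence table used in the project.

import numpy as np

# Hypothetical subset of a CORINE -> USGS 24-category lookup (class codes only).
corine_to_usgs = {
    111: 1,   # continuous urban fabric     -> urban and built-up land
    112: 1,   # discontinuous urban fabric  -> urban and built-up land
    211: 2,   # non-irrigated arable land   -> dryland cropland and pasture
    311: 11,  # broad-leaved forest         -> deciduous broadleaf forest
    511: 16,  # water courses               -> water bodies
}

def remap(corine_grid):
    # Apply the lookup to a 2-D array of CORINE class codes.
    out = np.zeros_like(corine_grid)  # classes not in the table stay 0 (unassigned) in this sketch
    for corine_code, usgs_code in corine_to_usgs.items():
        out[corine_grid == corine_code] = usgs_code
    return out

example = np.array([[111, 211], [311, 511]])
print(remap(example))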

NUDAPT has been tested against more elementary representations in WRF in a study for a Chinese city. The performance of NUDAPT looks promising.

Image 1: Mean building height, Birmingham

 

Image 2: Land use fraction, Birmingham

*Ching J, Brown M, McPherson T, Burian S, Chen F, Cionco R, Hanna A, Hultgren T, Sailor D, Taha H, Williams D. 2009. National Urban Database and Access Portal Tool, NUDAPT. Bulletin of the American Meteorological Society 90(8): 1157–1168.

Particle Filters for Flood Forecasting (PFFF)

A collaboration between: Dr Renaud Hostache, Luxembourg Institute of Science and Technology (LIST); Professors Nancy K Nichols and Peter Jan van Leeuwen, University of Reading; and Ms Concetta di Mauro, Luxembourg Institute of Science and Technology (LIST)

The objective of this DARE pilot project is to investigate the application of advanced filters to assimilate high-resolution flood extent information derived from SAR images (75 m spatial resolution) for the purpose of improving near real-time flood forecasts. The forecasting system is composed of a hydrological model loosely coupled to a hydraulic model with uncertain rainfall forcing (from ERA-Interim). The ensemble of model outputs is compared to satellite-derived flood probability maps, taking into account satellite image classification uncertainty. Standard ensemble Kalman filter (EnKF) methods that assume a normal distribution of the observation errors cannot be applied, and therefore new filters need to be developed for the assimilation. From experiments already carried out at LIST, three challenges arise: (i) to prevent ensemble members/particles being given a weight of zero solely due to local mismatch at a few pixels; (ii) to reduce biases due to over-prediction of flood extent (false positives) being penalised more strongly than under-prediction; and (iii) to reduce the risk of particle degeneracy, where the weights of all but a few particles go to zero. The aim of the project is to assess how these challenges can be met using new advanced filters that are being developed at the University of Reading, such as equal-weight particle filters and variational mapping particle filters.

Flood forecasting chains have been set up to enable the evaluation of the proposed data assimilation filters in controlled environments using synthetic (twin) experiments.  Two studies have been carried out using these systems.

  1. We first use a variant of the particle filter (PF), namely a PF with Sequential Importance Sampling (PF-SIS), to assimilate flood extent in near real-time into a hydrological–hydraulic model cascade. To reduce the risk of particle degeneracy, a “tempering” power factor is applied to the conditional probability of the observation given the model prediction (also called the likelihood in a PF). This allows inflation of the variance of the model posterior distribution (a minimal sketch of this tempering step is given after this list). Various values of the “tempering coefficient”, leading to different Effective Ensemble Sizes (EES), are evaluated. The experiment shows that the assimilation framework yields correct results in the sense that the assimilation updates the particle weights so that the updated predictions move towards the synthetic truth. It also shows that the proposed tempering factor helps in reducing degeneracy while inflating the posterior distribution variance. Fig. 1 shows the synthetic truth together with the ensemble expectations (ensemble weighted means) for the open-loop (no assimilation) and assimilation (using various tempering factor values) runs. As shown in this figure, the experiment also demonstrates that the reduction of degeneracy comes at the cost of a slight degradation of the overall performance: the higher the EES, the lower the performance of the assimilation run. This is shown by the black and blue lines moving closer to the synthetic truth than the orange and light blue lines in Figure 1.
  2. We also investigated how innovative satellite earth observations of soil moisture and flood extent can help in reducing errors and uncertainties in conceptual hydro-meteorological modelling, especially in ungauged areas where no, or only limited, runoff records are available. A spatially distributed conceptual hydrological model was developed to allow the prediction of soil moisture and flood extent. Using rainfall and potential evapotranspiration time series derived from the globally and freely available ERA5 database as forcing for this model, long-term simulations of soil moisture, discharge and flood extent were carried out. Time series of soil moisture and flood extent observations derived from freely available satellite image databases were then jointly assimilated into the hydrological model in order to retrieve optimal parameter sets. The performance of the calibrated model was evaluated using the tempered PF in twin experiments. This synthetic experiment shows that the assimilation of a long time series (~10 years) of flood extent observations and soil moisture maps acquired every three days enables a satisfactory calibration of the hydrological model. The Nash-Sutcliffe Efficiency, computed from the comparison of simulated and synthetic discharge time series, reaches high values (above 0.95) both during the calibration period and during a 10-year validation period.
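The role of the tempering factor in the first study can be illustrated in a few lines of Python. A tempering exponent between 0 and 1 flattens the particle-filter likelihood, which spreads the weights more evenly and raises the effective ensemble size (EES). The Gaussian likelihood and the numbers below are illustrative assumptions, not the LIST assimilation code.

import numpy as np

def tempered_weights(log_likelihood, gamma):
    # Normalised PF weights with the likelihood raised to the power gamma (0 < gamma <= 1).
    logw = gamma * log_likelihood
    logw -= logw.max()            # numerical stabilisation before exponentiating
    w = np.exp(logw)
    return w / w.sum()

def effective_ensemble_size(weights):
    # Kish effective sample size: 1 / sum(w_i^2); equals the ensemble size for equal weights.
    return 1.0 / np.sum(weights ** 2)

rng = np.random.default_rng(1)
# Hypothetical misfits between each particle's predicted flood extent and the observation,
# expressed as Gaussian log-likelihoods.
misfit = rng.normal(size=64)
log_likelihood = -0.5 * (misfit / 0.1) ** 2

for gamma in (1.0, 0.5, 0.1):
    w = tempered_weights(log_likelihood, gamma)
    print(f"gamma = {gamma}: EES = {effective_ensemble_size(w):.1f} out of 64 particles")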

Figure 1: Water surface elevation time series at Saxons Lode: synthetic truth (red), open-loop (green) and assimilation experiments using the standard PF-SIS (black), and using various tempering factor values (blue, light blue and orange) enabling various effective ensemble sizes to be reached (indicated between parentheses as percentage of the ensemble size). The vertical dashed lines indicate the assimilation time steps. PF-SIS=Particle Filter with Sequential Importance Sampling. EES=Effective Ensemble Size.

 

A Toolkit for Community-Based Flood Monitoring

by Elizabeth Lewis, Geoff Parkin and Tessa Gough, Newcastle University

This DARE pilot project addresses the Digital Technology/Living with Environmental Change interface by supporting communities at risk of flooding to use emerging technology to play an active role in collecting and sharing data that aid flood response, and by sensitising communities to climatic changes through an enhanced understanding of their environment.

The project is engaging with potential citizen scientists to use Private Automated Weather Stations (PAWS) to gather rain data and share those data on online platforms, where they can be accessed by the public, forecasters, flood managers and researchers. Engagement activities are being run to promote citizen-science participation and to better understand the motivations for, and barriers to, rain data collection.

Workshops have been hosted in Newcastle museums over the school holidays to share information on the risks of flash flooding, natural flood attenuation, checking rain data online, operating a weather station and interpreting rain gauge data. Games illustrating the key concepts were available, along with posters, demonstrations and discussion with participating climate scientists from Newcastle University. Over 800 people interacted with the games and demonstrations over 4 days. The cost of weather stations was cited as a reason for not collecting rain data, along with not knowing it was possible. The workshops raised awareness among people who had not previously considered participating. Participants were directed to a website (https://research.ncl.ac.uk/cspaws/) and support email address that we have developed for this project to support citizen scientists with any issues around installing a weather station and contributing their data to the Met Office Weather Observations Website (https://wow.metoffice.gov.uk/).

A presentation was given to 42 teachers in the North East to encourage schools to get involved with the project. A workshop will be held with 25 children aged 9–13 at a local school to look in detail at how weather stations work and at the impacts of wild weather globally and locally. Further workshops with the schools are planned.

Images in order from left to right: Promotional Poster; Elizabeth Lewis at a workshop in a museum in Newcastle; A game at the workshops

Merging SAR-derived flood footprints with flood hazard maps for improved urban flood mapping

Contributors: David Mason (University of Reading), John Bevington (JBA), Sarah Dance (University of Reading), Beatriz Revilla-Romero (JBA), Richard Smith (JBA), Sanita Vetra-Carvalho (University of Reading), Hannah Cloke (University of Reading).

This DARE pilot project is investigating a method for improving the accuracy of rapid post-event flood mapping in urban areas by merging pre-computed flood return period (FRP) maps with satellite synthetic aperture radar (SAR)-derived flood inundation maps. SAR sensors have the potential to detect flooding through cloud during both day and night. The inputs are JBA’s Flood Foresight dynamic flood inundation extent and depth maps (updated every 3 hours), and a high-resolution SAR image sequence. The SAR returns are used only in rural areas, including those adjacent to the urban areas, so that there is no need to take into account the radar shadow and layover caused by buildings in urban areas. Also, rural SAR water level observations should be able to correct errors in the model water elevations, because the JBA model assumes that all flooding is fluvial. On the other hand, it is an advantage to use the model’s FRP maps in urban areas, because these indicate where low-lying urban areas are protected from flooding.

The project developed a method for detecting flooding in urban areas by merging near real-time SAR flood extents with model-derived FRP maps. The SAR flood detection is based on the fact that water generally appears dark in a SAR image. Urban areas that are protected (e.g. by embankments) have high return periods in the FRP maps, and their effective heights are correspondingly increased. The SAR water levels found in the rural areas are interpolated over the urban areas to take into account the fall-off of levels down the reach. The model waterline heights are similarly interpolated. The interpolated height maps from the SAR and the model are combined into a single map, which is used to estimate whether an urban pixel is flooded. The method was tested on urban flooding in West London in February 2014 (see image 3) and in Tewkesbury in July 2007. It was compared to a previously developed method that used SAR returns in both the rural and urban areas. The present method, using SAR returns solely in rural areas, gave an average flood detection accuracy of 94% and a false positive rate of 9% in the urban areas, and was more accurate than the previous method. A journal paper is in preparation.
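The final flagging step described above can be sketched in a few lines: an urban pixel is marked as flooded where the water-surface elevation interpolated from the rural waterlines exceeds its effective ground height, with the effective height raised where the FRP maps indicate protection. The array names, protection rule and thresholds below are illustrative assumptions, not the project's code.

import numpy as np

def flag_urban_flooding(water_level, ground_height, return_period,
                        protected_rp=100.0, protection_raise=2.0):
    # water_level    : interpolated water-surface elevation over the urban area (m)
    # ground_height  : DEM height of each pixel (m)
    # return_period  : flood return period from the FRP maps (years)
    effective_height = ground_height.copy()
    # Pixels the FRP maps mark as protected (high return period) get their
    # effective height raised, as in the method described in the text.
    effective_height[return_period >= protected_rp] += protection_raise
    return water_level > effective_height

# Illustrative 3x3 example:
water = np.full((3, 3), 10.0)
ground = np.array([[9.0, 9.5, 10.5], [8.0, 9.9, 11.0], [9.2, 9.8, 9.9]])
rp = np.array([[10.0, 200.0, 10.0], [10.0, 10.0, 10.0], [500.0, 10.0, 10.0]])
print(flag_urban_flooding(water, ground, rp))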

 

Images: Urban flooding in West London in February 2014

Controlling and mitigating urban flooding with DA

by Prof Onno Bokhove and Tom Kent, PDRA,  University of Leeds
(The University of Leeds is a collaborator on the DARE project).

Motivated by the Boxing Day 2015 floods in Yorkshire (involving the Aire and Calder Rivers), we aim (i) to explore strategies of dynamic flood control and mitigation, and (ii) to assess and communicate flood-mitigation schemes in a concise and straightforward manner in order to assist decision-making for policy makers and inform the general public. To achieve our objectives, we are developing idealised observing system simulation experiments (OSSEs) using novel numerical models based on the Wetropolis flood demonstrator. Wetropolis is a physical model that provides a scientific testing environment for flood modelling, control and mitigation, and data assimilation, and has inspired numerous discussions with flood practitioners, policy makers and the public. Such discussions led us to revisit and refine a procedure that offers both a complementary diagnostic for classifying flood events (from gauge data and/or simulations) and a protocol to optimise the assessment of mitigation schemes via comprehensible cost-effectiveness analyses.

We have developed a protocol that revisits the concept of flood-excess volume (FEV). It is often difficult to grasp how much water is responsible for the damage caused by an extreme flood event, and how much of this floodwater can be mitigated by certain mitigation measures. Our protocol not only quantifies the magnitude of a flood but also establishes the cost-effectiveness of a suite of ‘grey’ engineering-based measures and ‘green’ nature-based solutions. Using river-level gauge data and mitigation schemes from UK and French rivers, we demonstrate objectively the effectiveness of measures that can help stakeholders make decisions based on both technical and environmental criteria. The protocol should form a preliminary analysis, to be conducted prior to more detailed hydraulic modelling studies. In collaboration with colleagues from Univ. Grenoble, our work has been published in an international journal and further disseminated at numerous meetings and conferences. To date, it has contributed to the EU-funded NAIAD project through our colleagues in France and we are exploring future impact studies internationally. In our recently submitted article, a basic numerical model of Wetropolis is used to determine the relevant time and length scales prior to its construction as a physical model. We are developing the hydrodynamic modelling further, both mathematically and numerically, in order to conduct idealised experiments in flood control and mitigation.
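The FEV itself is a simple quantity: the volume of river flow above a chosen threshold during the event. A minimal sketch, with an invented discharge series and threshold, is given below; the published protocol works from river-level gauge data and a rating curve rather than the discharge series assumed here.

import numpy as np

def flood_excess_volume(discharge, threshold, dt_seconds):
    # FEV = integral of max(Q - Q_T, 0) dt, in cubic metres.
    excess = np.clip(discharge - threshold, 0.0, None)
    return np.sum(excess) * dt_seconds

# Hypothetical 6-hourly discharge (m^3/s) over a three-day event, and a
# threshold discharge Q_T above which flood damage occurs.
q = np.array([100, 150, 300, 450, 500, 420, 350, 280, 200, 150, 120, 100], dtype=float)
fev = flood_excess_volume(q, threshold=250.0, dt_seconds=6 * 3600)
print(f"FEV ~ {fev / 1e6:.2f} million m^3")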

Image: ‘FEV concept’

Presentations

  • Bokhove, O., Kelmanson, M. A., Kent, T., Piton, G., & Tacnet, J. M.: Using flood-excess volume to assess and communicate flood-mitigation schemes. EGU general assembly, Vienna, April 2019 (oral). Available online.
  • Bokhove, O., Kent, T., de Poot, H., & Zweers, W.: Wetropolis: models for education and water-management of floods and droughts. EGU general assembly, Vienna, April 2019 (poster). Available online.
  • Kent, T., Cantarello, L., Inverarity, G., Tobias, S.M., Bokhove, O. (2019): Idealized forecast-assimilation experiments and their relevance for convective-scale Numerical Weather Prediction. EGU general assembly, Vienna, April 2019 (oral). Available online.
  • Bokhove, O., Kelmanson, M. A., Kent, T., Piton, G., & Tacnet, J. M.: Public empowerment in flood mitigation, Flood & Coast conference, Telford, June 2019 (oral).
  • Bokhove participated in the ‘Landscape decisions’ program at the Isaac Newton Institute, Cambridge (July/August 2019). Web: https://www.newton.ac.uk/event/ebc

SeriousGeoGames – Inundation Street – Flood Warning Video

by Chris Skinner – Research Fellow, Energy and Environment Institute, University of Hull & BetaJester Ltd

Inundation Street is a pilot project funded by DARE. It will create an immersive 360 video experience highlighting the impact of flooding on households. The video will demonstrate some simple steps that can be taken to reduce these impacts. The YouTube video will be available for free and will be exhibited using virtual reality headsets at events.

The development of the application is in progress. A timeline and script have been finalised. A voiceover has been recorded. A demo of the application was shown at the Flood and Coast conference, with interest expressed by the Environment Agency. Discussions are in place with Hull City Council and the Living with Water partnership about using the application.


Investigation of the ability of the renewed UK operational weather radar network to provide accurate real-time rainfall estimates for improved flood warnings.

by Dr Rob Thompson and Prof Anthony Illingworth, Dept of Meteorology, University of Reading

The UK operational radar network has the potential to deliver real-time rainfall estimates every five minutes with a resolution of 1 km² over most of the populated areas of the UK. If these rain rates were accurate, such data would have a major impact on the ability to predict short-term ‘flash’ flooding events so that mitigating action can be taken. However, at present, for flood warnings, the accuracy is deemed insufficient, so the radar rainfall estimates from each radar are continuously adjusted using recent observations from ground-based rain gauges.

The UK radar network has recently been renewed and upgraded to dual polarisation, resulting in much improved data quality. In this study we will compare the radar signal obtained every five minutes from the operational Dean Hill radar with the rain rate at the ground some 20 km from the radar and just 400 m below the radar beam, and use this to validate and improve the retrieval algorithms that convert the radar return signal, or ‘reflectivity’, into a rain rate at the ground.

In the preparatory work for this DARE pilot project we have been comparing the radar reflectivity observed every five minutes with the scanning Dean Hill radar, 20 km distant from Chilbolton, where we have five different high-resolution rain gauges. The radar pulse samples a volume 300 m by 300 m by 600 m at a height of 400 m above the gauges (see image). One of these gauges measures the size distribution of the rain drops, so once we know the sizes and number of the drops we can calculate the radar reflectivity we would expect the radar to measure. Over the past two years we find close agreement, but there appears to be a slow drift in the radar calibration of about 60%. In collaboration with the Met Office we are trying to find the source of this drift. However, once we correct for this drift, we find that with the new radar and its improved data quality there is a close correspondence between the rain estimated from the radar and that observed at the ground. This performance appears to be much better than that obtained before the radars were upgraded.
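The link between the measured drop size distribution and the expected radar signal can be sketched as follows: in the Rayleigh regime the reflectivity factor is the sixth-moment sum Z = Σ N(D) D^6 ΔD, and a rain rate can be recovered from Z with the classic Marshall-Palmer relation Z = 200 R^1.6. The exponential drop size distribution below is an illustrative assumption, and the operational dual-polarisation retrievals are considerably more sophisticated than this single relation.

import numpy as np

D = np.arange(0.1, 6.0, 0.1)       # drop diameters (mm)
dD = 0.1                           # bin width (mm)
N0, Lam = 8000.0, 2.0              # exponential DSD: N(D) = N0 * exp(-Lam * D), per m^3 per mm
N = N0 * np.exp(-Lam * D)

Z = np.sum(N * D**6) * dD          # Rayleigh reflectivity factor (mm^6 m^-3)
dBZ = 10.0 * np.log10(Z)
R = (Z / 200.0) ** (1.0 / 1.6)     # Marshall-Palmer rain rate (mm/h)

print(f"Z = {Z:.0f} mm^6/m^3 ({dBZ:.1f} dBZ); Marshall-Palmer rain rate ~ {R:.1f} mm/h")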

 

Image: Multiple rain gauges