wCROWN: Workshop on Crowdsourced data in Numerical Weather Prediction

by Sarah Dance

On 4-5 December 2018, the Danish Meteorological Institute (DMI) hosted a workshop on crowdsourced data in numerical weather prediction (NWP), attended by Joanne Waller and Sarah Dance from the DARE project. DMI hosted this workshop with two aims: 1) to gather experts on crowdsourced data for NWP and start a network of people working on the subject, and 2) to produce a white paper directing the research community towards best practices and guidelines on the subject.

Presenters from the University of Washington (Seattle), the University of Reading and several operational weather centres, including the Met Office (UK), the German Weather Service (DWD), Meteo France, ECMWF, KNMI and EUMETNET, gave us status reports on their research into using crowdsourced data, opportunistic data and citizen science. We discussed the issues arising in the use of such data and agreed to write a workshop report together to feed into EUMETNET activities. We also enjoyed a fascinating tour of the DMI operational forecasting centre.

Flooding from Intense Rainfall

Several members of the DARE team were involved in the NERC Flooding from Intense Rainfall (FFIR) programme open event, held at the Royal Society in London on 27 November 2018.

Dr Linda Speight, FFIR Policy and Impact Officer, wrote this overview of the event.

Over 3 million households are at risk of surface water flooding in the UK, and this number is set to rise in the future. Surface water flood events happen quickly and affect small areas; the surrounding region may not see any rainfall at all. This makes them difficult to forecast.

Through the NERC-funded Flooding from Intense Rainfall (FFIR) programme, meteorologists, hydrologists, scientists, consultants and operational experts are working together to reduce the risks of damage and loss of life caused by surface water and flash floods.

The research includes everything from historic newspaper archives to drones and high-speed computers. It has identified places vulnerable to flash flooding, developed new techniques for monitoring rivers during flood events, improved weather forecasts for intense rainfall and demonstrated the potential for real-time simulation of floods in urban areas. Importantly, the five-year programme has helped improve communication between people in the hydrology and meteorology research communities. This will have lasting benefits into the future.

At the programme showcase event at the Royal Society in November 2018 there was a hands-on opportunity to engage with the challenges of flooding from intense rainfall. Alongside presentations and an expert panel debate, participants could immerse themselves in a virtual reality simulation of a flash flood, watch convective rainfall develop on a giant forecast globe and share their thoughts on the modelling and communication chains that underpin flood forecasting.

A short video about the programme is available here

 

Or you can find out more details at http://blogs.reading.ac.uk/flooding/

Dr Sam Illingworth from Manchester Metropolitan University responded to the event with poetry:

 

After the Flood

 

When I thought of floods

I thought of the heavens breaking forth

In biblical proportions.

Forty days and forty nights of rain.

I thought of Boxing Day 2015;

The pain in my left hand as I scooped

Dirty water out of my in-law’s outhouse

Using nothing

More than a gravy boat and lashings

Of dampened Christmas spirit.

 

When I thought of floods

I thought about days of sustained rainfall.

It never even crossed my mind that surface water flooding

Or thunderstorms could decimate the land

In hours;

Not days.

 

When I thought of floods

I thought about rain gauges and sandbags;

I didn’t think about how convective events form,

How soil moisture could be used to forecast flow,

Or how our future of flood defence

Could ever be bound to our arid past.

To my great shame I did not even consider:

The conditioning of least-squares problems in variational data assimilation.

 

When I thought of floods

I thought of observations;

Of closing the floodgates after the horse had bolted.

Observations that masked an inevitable inability

To adapt to our environments.

I thought of shattered communities,

Broken apart not just by the unrelenting force of the rising waters

But by the isolation and helplessness of being

Told to sit and wait in silence

For the cavalry to arrive.

 

But now….

Now when I think of floods

I think of our improved knowledge of catchment susceptibility,

And how this will help decision makers

Identify locations at risk of flooding.

I think of being able to forecast a flood event in real time,

And how this will enable better decision making and communication.

 

But most of all I think about people.

Of end-to-end forecasting, knowledge sharing, and upstream engagement.

I think about how flood chronologies can

Provide a powerful data set

To develop storylines around flood histories;

Histories which can be used to engage local communities.

And how these communities can not only learn

To be resilient,

But can help to build the resilience

That we need;

To stop us all

From being washed away.

 

 

Our first DARE workshop

by Sarah Dance

Workshop participants

The DARE team organised a workshop on data science for high impact weather and flood prediction, held by the river at the lovely University of Reading Greenlands campus in Henley-on-Thames, 20-22 November 2017. The workshop objectives were to enable discussion and exchange of expertise at the boundary between digital technology, data science and environmental hazard modelling, including:

  • Data assimilation and data science for flood forecasting and risk planning
  • Data assimilation and data science for high impact weather forecasting
  • Smart decision making using environmental data

The meeting was attended by over 30 participants from five different countries. We had some great presentations (to be made available on this webpage) and discussion. We came up with some recommendations to help promote and deliver research and business applications in the digital technology-environmental hazard area. We plan to write a meeting report detailing these recommendations that we hope will be published in a peer-reviewed international journal. Watch this space!

 

What’s in a number?

By Nancy Nichols

Should you care about the numerical accuracy of your computer?  After all, most machines now retain about 16 digits of accuracy, but only about 3–4 significant figures are needed for most applications; so what’s the worry?  As a demonstration of what can go wrong, there have been a number of spectacular disasters due to numerical rounding error.  One of the best known is the failure of a Patriot missile to track and intercept an Iraqi Scud missile in Dhahran, Saudi Arabia, on February 25, 1991, resulting in the deaths of 28 American soldiers.

The failure was ultimately attributable to poor handling of rounding errors.  The computer doing the tracking calculations had an internal clock whose values were truncated when converted to floating-point arithmetic, with an error of about 2^-20 seconds per second of elapsed time.  The clock had run up a time of 100 hours, so the calculated elapsed time was too long by 2^-20 x 100 hours ≈ 0.3433 seconds, during which time a Scud would be expected to travel more than half a kilometre.
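To see how such a tiny per-tick error accumulates, here is a minimal Python sketch. It assumes, as in standard accounts of the failure, that 1/10 was chopped to 23 significant fractional bits in the 24-bit register; that register layout is taken from those accounts, not from this blog post.

    from fractions import Fraction

    # 1/10 chopped to 23 fractional bits (the precision quoted in standard
    # accounts of the bug), i.e. truncated rather than rounded.
    BITS = 23
    true_tenth = Fraction(1, 10)
    stored_tenth = Fraction(2**BITS // 10, 2**BITS)
    per_tick_error = float(true_tenth - stored_tenth)   # ~9.5e-8 s per 0.1 s tick

    ticks = 100 * 60 * 60 * 10     # 100 hours of 0.1 s clock ticks
    print(per_tick_error * ticks)  # ~0.3433 s of clock drift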

 

(See The Patriot Missile Failure)

The same problem arises in other algorithms that accumulate and magnify small round-off errors due to the finite (inexact) representation of numbers in the computer.   Algorithms of this kind are referred to as ‘unstable’ methods.  Many numerical schemes for solving differential equations have been shown to magnify small numerical errors.  It is known, for example, that L.F. Richardson’s original attempts at numerical weather forecasting were essentially scuppered by the unstable methods that were used to compute the atmospheric flow.   Much time and effort have now been invested in developing and carefully coding methods for solving algebraic and differential equations so as to guarantee stability.   Excellent software is publicly available.  Academics and operational weather forecasting centres in the UK have been at the forefront of this research.
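A classic textbook illustration of an unstable method (not Richardson’s scheme, but the same mechanism of error magnification) is the forward recurrence for the integrals I_n = ∫_0^1 x^n e^(x-1) dx, which satisfy I_n = 1 − n I_(n−1). Each step multiplies the inherited error by −n, so the rounding error in I_0 grows like n!:

    import math

    # Unstable forward recurrence: I_n = 1 - n * I_{n-1}, starting from
    # I_0 = 1 - 1/e. The rounding error in I_0 (~1e-16) is multiplied by
    # -n at each step, i.e. magnified by n! overall.
    I = 1 - 1 / math.e
    for n in range(1, 21):
        I = 1 - n * I
    print(I)  # wildly wrong: the true I_20 is about 0.0455

Running the recurrence backwards from a rough guess at a large n instead divides the error by n at each step, and is stable.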

Even with stable algorithms, however, it may not be possible to compute an accurate solution to a given problem.   The reason is that the solution may be sensitive to small errors  –  that is, a small error in the data describing the problem causes large changes in the solution.  Such problems are called ‘ill-conditioned’.   Even entering the data of a problem into a computer  –  for example, the initial conditions for a differential equation or the matrix elements of an eigenvalue problem  –   must introduce small numerical errors in the data.  If the problem is ill-conditioned, these then lead to large changes in the computed solution, which no method can prevent.

So how do you know if your problem is sensitive to small perturbations in the data?  Careful analysis can reveal the issue, but for some classes of problems there are measures of the sensitivity, or the ‘conditioning’, of the problem that can be used.   For example, it can be shown that small perturbations in a matrix can lead to large relative changes in the inverse of the matrix if the ‘condition number’ of the matrix is large.  The condition number is measured as the product of the norm of the matrix and the norm of its inverse.  Similarly, small changes in the elements of a matrix will cause its eigenvalues to have large errors if the condition number of the matrix of eigenvectors is large.   Of course, determining a condition number exactly is itself a computational problem (it involves the inverse), but accurate computational methods for estimating condition numbers are available.
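As a concrete illustration, the condition number defined above is easy to compute (a small sketch with NumPy; the nearly singular matrix is an arbitrary example, not from any system discussed here):

    import numpy as np

    # kappa(A) = ||A|| * ||A^-1||, here in the 2-norm; numpy.linalg.cond
    # computes the same quantity.
    A = np.array([[1.0, 1.0],
                  [1.0, 1.0001]])   # nearly singular, hence ill-conditioned
    kappa = np.linalg.norm(A, 2) * np.linalg.norm(np.linalg.inv(A), 2)
    print(kappa, np.linalg.cond(A, 2))   # both ~ 4e4

As a rule of thumb, log10 of the condition number is roughly the number of digits of accuracy that can be lost when solving a linear system with that matrix.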

An example of an ill-conditioned matrix is the covariance matrix associated with a Gaussian distribution.   The following figure shows the condition number of a covariance matrix obtained by taking samples from a Gaussian correlation function at 500 points, using a step size of 0.1, for varying length-scales [1]. The condition number increases rapidly to 10^7 for length-scales of only L = 0.2 and, for length-scales larger than 0.28, the condition number exceeds the limits of the computer precision and cannot even be calculated accurately.
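The experiment is easy to reproduce in outline. The sketch below is a rough paraphrase of the set-up in [1]; it assumes the Gaussian correlation function has the form c(r) = exp(-r^2/(2L^2)), and the exact convention used in [1] may differ:

    import numpy as np

    # Condition number of a Gaussian correlation matrix sampled at 500 points
    # with grid spacing 0.1, for a range of length-scales L.
    n, dx = 500, 0.1
    x = np.arange(n) * dx
    r = np.abs(x[:, None] - x[None, :])    # pairwise distances
    for L in (0.1, 0.2, 0.28, 0.5):
        C = np.exp(-r**2 / (2 * L**2))
        print(L, np.linalg.cond(C))        # grows explosively with L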

This result is surprising and very significant for numerical weather prediction (NWP), as the inverses of covariance matrices are used to weight the uncertainty in the model forecast and in the observations used in the analysis phase of weather prediction.  The analysis is achieved by the process of data assimilation, which combines a forecast from a computational model of the atmosphere with physical observations obtained from in situ and remote-sensing instruments.  If the weighting matrices are ill-conditioned, then the assimilation problem becomes ill-conditioned too, making it difficult to get an accurate analysis and subsequently a good forecast [2].  Furthermore, the worse the conditioning of the assimilation problem, the more time it takes to do the analysis. This matters because the forecast needs to be produced in ‘real’ time, so the analysis needs to be done as quickly as possible.
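In its simplest scalar form, the weighting works like this (a minimal sketch with made-up numbers; operational systems solve the same kind of problem with state vectors of enormous dimension):

    # Scalar analogue of the data assimilation update: combine a forecast and
    # an observation, weighted by their error variances.
    x_b, var_b = 2.0, 0.5   # background (model forecast) and its error variance
    y, var_o = 2.6, 0.25    # observation and its error variance
    K = var_b / (var_b + var_o)   # gain: how far to move towards the observation
    x_a = x_b + K * (y - x_b)     # analysis
    var_a = (1 - K) * var_b       # analysis error variance: smaller than either input
    print(x_a, var_a)             # 2.4 and ~0.167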

One way to deal with an ill-conditioned system is to rearrange the problem so as to reduce the conditioning whilst retaining the same solution.  A technique for achieving this is to ‘precondition’ the problem using a transformation of variables.  This is used regularly in NWP operational centres with the aim of ensuring that the uncertainties in the transformed variables all have a variance of one [1][2].  In this table we can see the effect of the length-scale of the error correlations in a data assimilation system on the number of iterations it takes to solve the problem, with and without preconditioning [1].  The conditioning of the problem is improved and the work needed to solve it is significantly reduced.  So checking and controlling the conditioning of a computational problem is always important!
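The following sketch illustrates the idea on a toy system; the sizes, observation pattern and variances are invented, and this is not the operational transform. Substituting dx = B^(1/2) v turns the Hessian B^(-1) + H^T R^(-1) H into I + B^(1/2) H^T R^(-1) H B^(1/2), whose eigenvalues are bounded below by one:

    import numpy as np

    # Toy variational DA Hessian, before and after the control variable
    # transform dx = B^(1/2) v.
    n, dx, L = 200, 0.1, 0.2
    r = np.abs(np.subtract.outer(np.arange(n), np.arange(n))) * dx
    B = np.exp(-r**2 / (2 * L**2)) + 1e-10 * np.eye(n)  # jitter keeps B invertible
    H = np.eye(n)[::5]                  # observe every 5th grid point
    Rinv = np.eye(H.shape[0]) / 0.01    # observation error variance 0.01

    hess = np.linalg.inv(B) + H.T @ Rinv @ H            # unpreconditioned
    w, V = np.linalg.eigh(B)
    Bh = (V * np.sqrt(np.clip(w, 0, None))) @ V.T       # symmetric square root of B
    hess_pc = np.eye(n) + Bh @ H.T @ Rinv @ H @ Bh      # preconditioned
    print(np.linalg.cond(hess), np.linalg.cond(hess_pc))

The preconditioned condition number is orders of magnitude smaller, so an iterative solver such as conjugate gradients needs far fewer iterations, which is the effect summarised in the table from [1].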

[1] S.A. Haben. 2011. Conditioning and Preconditioning of the Minimisation Problem in Variational Data Assimilation. PhD thesis, University of Reading, Department of Mathematics and Statistics.

[2]  S.A. Haben, A.S. Lawless and N.K. Nichols.  2011. Conditioning of incremental variational data assimilation, with application to the Met Office system, Tellus, 63A, 782–792. (doi:10.1111/j.1600-0870.2011.00527.x)

WMO Symposium on Data Assimilation

by Amos Lawless

In the middle of September, scientists from all around the world converged on a holiday resort in Florianopolis, Brazil, for the Seventh World Meteorological Organization Symposium on Data Assimilation. This symposium takes place roughly every four years and brings together data assimilation scientists from operational weather and ocean forecasting centres, research institutes and universities. With 75 talks and four poster sessions, there was a lot of science to fit into the four and a half days spent there.

 

The first day began with presentations of current plans by various operational centres, and both similarities and differences became apparent. It is clear that many centres are moving towards data assimilation schemes that are a mixture of variational and ensemble methods, but the best way of doing this is far from certain. This was apparent from just the first two talks, in which the Met Office and Meteo-France set out their different strategies for the next few years. For anyone who thought that science always provides clear-cut answers, here was an example of where the jury is still out! Many other talks covered similar topics, including the best way to get information from small samples of ensemble forecasts in large systems.

 

In a short blog post such as this, it is impossible to do justice to the wide range of topics discussed in the rest of the week, ranging from theoretical aspects of data assimilation to practical implementations. Subjects included challenges for data assimilation at convective scales in the atmosphere, ocean data assimilation, assimilation of new observation types (including winds from radar observations of insects, lightning and radiances from new satellite instruments) and measuring the impact of observations. Several talks proposed new, advanced data assimilation methods – particle filters, Gaussian filtering and a hierarchical Bayes filter were all covered. Of particular interest was a presentation on data assimilation using neural networks, which achieved results comparable to an ensemble Kalman filter at a small fraction of the computational cost. This led to a long discussion at the end of the day as to whether neural networks may be a way forward for data assimilation. The final session on the last day covered a variety of applications of data assimilation, including assimilation of soil moisture, atmospheric composition measurements and volcanic ash concentration, as well as application to coupled atmosphere-ocean models and to carbon flux inversion.

 

Outside the scientific programme, the coffee breaks (with mountains of Brazilian cheese bread provided!) and the social events, such as the caipirinha tasting evening and the conference dinner, as well as having all meals together, provided ample opportunity for discussion with old acquaintances and new. I came home excited about the wide range of work being done on data assimilation throughout the world and enthusiastic to continue tackling some of the challenges in our research in Reading.

The full programme with abstracts is available at the conference web site, where presentation slides will also eventually be uploaded:

http://www.cptec.inpe.br/das2017/

Serving society with better weather and climate information.

by Sarah Dance

I have just come back from the European Meteorological Society 2017 conference in Dublin, where I was co-convenor of a session on data assimilation. The conference theme was Serving Society with Better Weather and Climate Information. A key challenge for the meteorological communities is how best to harness the wealth of data now available – both observational and modelled – to generate and effectively communicate relevant, tailored and timely information, ensuring the highest-quality support for users’ decision-making.  The conference produced some highlight videos that sum up the activities better than I could!

Can cars provide high quality temperature observations for use in weather forecasting?

By Diego de Pablos

I am an undergraduate student at the University of Reading and have recently finished a UROP (Undergraduate Research Opportunities Programme) placement; the project was funded by the University and carried out in partnership with the Met Office. Since I am currently taking the Environmental Physics course in the Meteorology department, this project was of interest to me for two reasons: first, I plan on doing a PhD at Reading and wanted to get a feel for that experience; and secondly, the research topic seemed to have the potential to improve weather forecasting and road safety overall. The project consisted of taking a first look at the temperature observations from the built-in thermometer of a car, and comparing them with UKV model surface temperatures and observations from nearby WOW sites [1].

Even though the use of vehicles in weather forecasting has been studied before [2], in most cases advanced thermometers were installed on the vehicles to obtain the observations, or other parameters were used (e.g. the state of the antilock brakes or windshield wipers). This project aimed to assess the potential of the native ambient air temperature sensor that most modern cars (less than ten years old) already have, making these observations available for predicting the road state in the near future.

A series of days of temperature observations registered by a car’s built-in thermometer was studied. The observed temperatures were extracted with an OBD dongle, connected to the car’s engine management system via the standard OBD port that cars have installed behind the steering wheel. The dongle sends this information to the driver’s phone via Bluetooth. A phone app decodes the observations and the other parameters available from the dongle, and then sends them to a selected URL over a 3G/4G connection. The data are then stored in metdb, the database used by the Met Office in the UK, and made available for forecasting.
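As an illustration of the decoding step, here is a hedged Python sketch. In the OBD-II standard, mode 01, PID 0x46 is the ambient air temperature, encoded as a single byte A with temperature = A − 40 °C. The upload endpoint below is hypothetical; the post does not describe the actual ingest route into metdb.

    import json, time, urllib.request

    def decode_ambient_temp(response_hex: str) -> int:
        # A response such as "41 46 23" echoes the mode (41) and PID (46),
        # followed by the data byte A = 0x23 = 35, i.e. 35 - 40 = -5 C.
        mode, pid, a = response_hex.split()
        assert (mode, pid) == ("41", "46")
        return int(a, 16) - 40

    obs = {"time": time.time(), "lat": 51.44, "lon": -0.94,
           "air_temperature_c": decode_ambient_temp("41 46 23")}
    req = urllib.request.Request("https://example.invalid/car-obs",  # hypothetical URL
                                 data=json.dumps(obs).encode(),
                                 headers={"Content-Type": "application/json"})
    # urllib.request.urlopen(req)  # not sent: the endpoint is a placeholder
    print(obs)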

 

The trial showed a need for further testing of the thermometers, as it was suggested that the sensor readings could be biased with height and speed. However, the potential availability of data, by sheer quantity alone, is outstanding: around 20 million cars would be available to take part in the data collection in the UK.

All in all, using car sensors for weather forecasting seems to have potential and will be studied thoroughly in the near future, hopefully tying its advances to those of car technologies.

References:

[1] Weather Observations Website – Met Office. https://wow.metoffice.gov.uk/. Accessed: 10th of August 2017.

[2] William P. Mahoney III and James M. O’Sullivan. “Realizing the Potential of Vehicle-Based Observations”. In: Bulletin of the American Meteorological Society 94.7 (2013), pp. 1007– 1018. doi: 10.1175/BAMS-D-12-00044.1. url: https://doi.org/10.1175/BAMS-D-12-00044.1

UFMRM WG Webinar: “DARE to use CCTV images to improve urban flood forecasts”

It is difficult to predict urban floods accurately; there are many sources of error in urban flood forecasts, due to unknown model physics, computational limits, input data accuracy, etc. However, many sources of model and input error can be reduced through the use of data assimilation methods – mathematical techniques that combine model predictions with observations to produce a more accurate forecast.

In this talk I will motivate and introduce the idea of using CCTV images as a new and valuable additional source of information in cities for improving the urban flood predictions through data assimilation methods. This work is part of the Data Assimilation for REsilient City (DARE) project.

You can see the whole presentation on YouTube here or view slides here.

Wetropolis flood demonstrator

By Onno Bokhove, School of Mathematics, University of Leeds, Leeds.

  1. What is Wetropolis?

The Wetropolis flood demonstrator is a conceptual, live installation showcasing what an extreme rainfall event is and how such an event can lead to extreme flooding of a city; see Fig. 1 below. A Wetropolis day is chosen to be 10s, and on average every 5.5min there is an extreme event in which it rains for 90% of a Wetropolis day, i.e., 9s, in two locations: an upstream reservoir and a porous moor in the middle of the catchment. This is extreme rainfall and it causes extreme flooding in the city. It can rain for either 10%, 20%, 40% or 90% of a day, and either nowhere, only in the reservoir, only on the porous moor, or in both locations. Rainfall amount and rainfall location are drawn randomly via two asymmetric Galton boards, each with four outcomes; see Fig. 2. Every Wetropolis day, so every 10s, a steel ball falls down each Galton board and determines the outcome, which we can follow visually: at the first split there is a 50% chance of the ball going to the left and 50% to the right; at the next two splits, one route can only go right (with 100% chance) while the other splits evenly 50%-50% again; subsequent splits are even again. An extreme event occurs with probability 7/256, so about 3% of the time; in 100 wd's, or 1000s, this amounts to about one event every 5.5min on average. When a steel ball rolls through one of the four channels of a Galton board it optically triggers a switch, and via Arduino electronics each Galton board steers pump actions of (1,2,4,9)s, causing it to rain in the reservoir and/or on the porous moor.

Fig. 1. Overview of the Wetropolis flood demonstrator with its winding river channel of circa 5.2m and the slanted flood plains on one side, a reservoir, the porous moor, the (constant) upstream inflow of water, the canal with weirs, the higher city plain, and the outflow into the water tank/bucket with its three pumps. Two of these pumps switch on randomly for (1,2,4,9)s of the 10s ‘Wetropolis day’ (unit: wd). Photo compilation: Luke Barber.

 

Wetropolis’ construction is based on my mathematical design, with a simplified one-dimensional kinematic model representing the winding river, a one-dimensional nonlinear advection-diffusion equation for the rainfall dynamics in the porous moor, and simple time-dependent box models for the canal sections and the reservoir, all coupled together with weir relations. The resulting numerical calculations were approximate but led to the design by providing estimates of the strength of the pumps (1-2l in total for the three aquarium pumps), the length and hence the size of the design (with the river water residence time typically being 15-20s), and the size of the porous moor. The moor visually shows the dynamics of the groundwater level during no, weak or strong rainfall, and how it can delay the through-flow by circa 2-3wd (20-30s) when conditions are dry prior to the rainfall. When the rainfall is strong, e.g., for two consecutive days of extreme Boxing Day rainfall (see movie in [2]), the moor displays surface water overflow and thus drains nearly instantly into the river channel.
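For readers who want a feel for what such a model looks like, here is a purely illustrative Python sketch of a 1-D kinematic river channel solved with a first-order upwind scheme. The wave-speed law and every parameter value are invented for illustration; this is not the actual Wetropolis design calculation:

    import numpy as np

    # Toy 1-D kinematic wave model: h_t + c(h) h_x = rain, upwind in space.
    L_chan, n, T = 5.2, 260, 60.0     # channel length (m), cells, run time (s)
    dx = L_chan / n
    h = np.full(n, 0.01)              # initial water depth (m)
    t, cfl = 0.0, 0.9
    while t < T:
        c = 0.5 * h ** (2.0 / 3.0)    # Manning-like kinematic wave speed
        dt = cfl * dx / max(c.max(), 1e-6)
        rain = np.zeros(n)
        rain[: n // 4] = 1e-3 if t % 10 < 9 else 0.0  # rain 9s of each 10s 'day'
        h[1:] -= dt / dx * c[1:] * (h[1:] - h[:-1])   # upwind advection
        h += dt * rain
        t += dt
    print(h.max(), h.mean())          # water depth after a minute of simulated rain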

Fig. 2. Asymmetric Galton board. Every Wetropolis day, i.e. every 10s, a steel ball is released at the top (mechanism not shown here). The 4×4 possible outcomes across the two boards, registered in each by 4 electronic eyes (not shown here either), determine the rainfall amount and location in Wetropolis, respectively. Photo: Wout Zweers.

Wetropolis’ development and design was funded as an outreach project in the Maths Foresees EPSRC Living with Environmental Change network [1].

  2. What are its purposes?

Wetropolis was first designed as a flood demonstrator for outreach to the general public. It can fit in the back half of a car and can be transported. Comments from everyone, including the public, have been positive. Remarks from scientists and flood practitioners, such as people from the Environment Agency, however, made us realise that Wetropolis can also be used and extended to test models and explore concepts in the science of flooding.

 

  3. Where has Wetropolis been showcased hitherto?

The mathematical design and modelling were done and presented in early June 2016 at a seminar for the Imperial College/University of Reading Mathematics of Planet Earth Doctoral Training Centre. Designer Wout Zweers and I started Wetropolis’ construction a week later. One attempt failed (see the June 2016 posts in [2]) because I made an error in using the Manning coefficient in the calculations, necessitating an increase of the channel length to 5m to achieve sufficient residence time of water in the 1:100 sloped river channel. Over the summer progress was made, with a strong finish late August 2016 so that we could showcase Wetropolis at the Maths Foresees General Assembly in Edinburgh [1]. It was subsequently shown at the Armley Mills Museum, Leeds, public Boxing Day exhibit on December 8th, 2016 and again in March 2017. I gave a presentation on the science of flooding, including Wetropolis, for 140 flood victims at the Churchtown Flood Action Group Workshop in Churchtown in late January 2017. We showcased it further at: the Be Curious public science festival, University of Leeds; the Maths Foresees Study Group (see Fig. 3) at the Turing Gateway to Mathematics, Cambridge; and a workshop of the River and Canal Trust in Liverpool.


Fig. 3. Wetropolis at the Turing Gateway to Mathematics. Photo: TGM. Duncan Livesey and Robert Long (Fluid Dynamics CDT, Leeds) are explaining matters.

  4. What are its strengths and weaknesses?

The strength of Wetropolis is that it is a live visualisation of probability for combined rainfall and flooding in extreme events, river hydraulics, groundwater flow, and flow control (the reservoir has valves so that we can store and release water interactively). It is a conceptual model of flooding rather than a literal scale model. This is both a weakness and a strength, because one needs to explain the translation of an extreme flooding and rainfall event with a 1:200-year return period to one with a 1:5.5min return period, and to explain that the moor and reservoir are conceptual valleys where all the rain falls, since rain cannot fall everywhere. This scaling and translation is part of the conceptualisation, which the audience, whether public or scientific, needs to grasp. The visualisations of flooding in the city and of the groundwater level changes will be improved.

  5. Where does Wetropolis go from here?

A revisited Wetropolis is under design to illustrate aspects of Natural Flood Management, such as slowing the flow by inserting or taking out roughness features and leaky dams, the great number of such dams needed to create significant storage volume for flood waters, and the risk of their failure. Wetropolis will (likely) be shown alongside my presentation at the DARE international workshop on high impact weather and flood prediction in Reading, November 20-22, 2017. Finally, analysis of river level gauges combined with the peak discharge of the Boxing Day 2015 floods of the River Aire, which led to the extreme, massive flooding in Kirkstall, Leeds, reveals that the estimated flood excess volume is about 1 mile by 1 mile by 1.8m deep (see [3] and Fig. 4). Storing all of this excess flood volume in 4 to 5 artificially induced and actively controlled flood plains upstream of Leeds seems possible. Moreover, it could possibly have prevented the floods. Active control of flood plains via moveable weirs is now being considered, also in a research project with Wetropolis featuring as a conceptual yet real test environment. (PhD and/or DARE postdoc posts will be available soon.)

Fig. 4. Leeds’ flood levels at Armley Mills Museum: 1866: bottom, 2015: top, 5.21m. Photo: O.B., with Craig Duguid (Fluid Dynamics CDT, Leeds) showcasing Wetropolis.

References and links

[1] Maths Foresees UK EPSRC LWEC network.
[2] Resurging Flows, public page with movies of experiments, river flows and the Boxing Day 2015 floods in Leeds and Bradford, plus photos and comments on fluid dynamics. Two movies from 31-08-2016 show Wetropolis in action; in one case two consecutive extreme rainfall events led to a Boxing Day 2015 type of flood. (What is the chance of this happening in Wetropolis?) Recall that record rainfall over 48hrs in Bingley and Bradford, Yorkshire, contributed in large part to the Boxing Day floods in 2015.
[3] ‘Inconvenient Truths’ about flooding. My introduction at the 2017 Study Group.