## What’s in a number?

By Nancy Nichols

Should you care about the numerical accuracy of your computer?  After all, most machines now retain about 16 digits of accuracy, while usually only about 3–4 figures of accuracy are needed for most applications; so what’s the worry?  The worry is that there have been a number of spectacular disasters caused by numerical rounding error.  One of the best known is the failure of a Patriot missile to track and intercept an Iraqi Scud missile in Dhahran, Saudi Arabia, on February 25, 1991, resulting in the deaths of 28 American soldiers.

The failure was ultimately attributable to poor handling of rounding errors.  The computer doing the tracking calculations had an internal clock whose values were truncated when converted to a 24-bit fixed-point representation, with a relative error of about 2⁻²⁰.  The clock had run up a time of 100 hours, so the calculated elapsed time was too long by about 2⁻²⁰ × 100 hours ≈ 0.3433 seconds, during which time a Scud would be expected to travel more than half a kilometer.

(See The Patriot Missile Failure)
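The arithmetic behind that figure can be reproduced in a few lines. A minimal sketch, assuming the commonly cited details of the fault (a register with 23 bits after the binary point, a clock ticking in tenths of a second, and a nominal Scud speed of roughly 1,676 m/s — all illustrative assumptions, not values from the official report):

```python
# The clock counted tenths of a second as an integer; elapsed seconds were
# obtained by multiplying the count by a truncated binary approximation of 0.1.
BITS = 23  # fractional bits (assumption: 24-bit register, one bit for sign)
tenth_chopped = int(0.1 * 2**BITS) / 2**BITS  # 0.1 chopped to 23 binary digits

err_per_tick = 0.1 - tenth_chopped            # ~9.5e-8 s lost per tick
ticks = 100 * 3600 * 10                       # tenth-second ticks in 100 hours
drift = ticks * err_per_tick                  # accumulated clock error

scud_speed = 1676.0                           # m/s, illustrative value
print(f"error per tick:    {err_per_tick:.2e} s")
print(f"drift after 100 h: {drift:.4f} s")
print(f"Scud travel:       {drift * scud_speed:.0f} m")
```

The truncation loses less than one part in ten million per tick, yet after 100 hours of ticks the tracking gate is off by hundreds of metres.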

The same problem arises in other algorithms that accumulate and magnify small round-off errors due to the finite (inexact) representation of numbers in the computer.  Algorithms of this kind are referred to as ‘unstable’ methods.  Many numerical schemes for solving differential equations have been shown to magnify small numerical errors.  It is known, for example, that L.F. Richardson’s original attempts at numerical weather forecasting were essentially scuppered by the unstable methods used to compute the atmospheric flow.  Much time and effort have now been invested in developing and carefully coding methods for solving algebraic and differential equations so as to guarantee stability, and excellent software is publicly available.  Academics and operational weather forecasting centres in the UK have been at the forefront of this research.
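A tiny example shows how an unstable method destroys an answer. The recurrence below computes xₙ = (1/3)ⁿ exactly in theory, but its general solution also contains a 4ⁿ mode, so the rounding error made in representing 1/3 is amplified at every step (a standard textbook illustration, not one of Richardson’s schemes):

```python
# x_{n+1} = (13/3) x_n - (4/3) x_{n-1}, with x_0 = 1 and x_1 = 1/3, has the
# exact solution x_n = (1/3)^n.  The rounding error in storing 1/3 excites
# the recurrence's hidden 4^n mode, which eventually swamps the true answer.
x = [1.0, 1.0 / 3.0]
for n in range(1, 30):
    x.append((13.0 / 3.0) * x[n] - (4.0 / 3.0) * x[n - 1])

for n in (5, 15, 30):
    exact = (1.0 / 3.0) ** n
    rel = abs(x[n] - exact) / exact
    print(f"n = {n:2d}: computed {x[n]: .6e}, exact {exact:.6e}, rel. error {rel:.1e}")
```

By n = 30 the computed value has no correct digits at all, even though every individual arithmetic operation was accurate to 16 figures.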

Even with stable algorithms, however, it may not be possible to compute an accurate solution to a given problem.  The reason is that the solution may be sensitive to small errors – that is, a small error in the data describing the problem causes large changes in the solution.  Such problems are called ‘ill-conditioned’.  Even entering the data of a problem into a computer – for example, the initial conditions for a differential equation or the matrix elements of an eigenvalue problem – must introduce small numerical errors in the data.  If the problem is ill-conditioned, these then lead to large changes in the computed solution, which no method can prevent.
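To see ill-conditioning in action, take the notoriously ill-conditioned Hilbert matrix and perturb the right-hand side of a linear system by a relative 10⁻¹⁰, far below any realistic measurement accuracy (the matrix and perturbation size are chosen purely for demonstration):

```python
import numpy as np

n = 10
# The n x n Hilbert matrix, H[i, j] = 1 / (i + j + 1): symmetric, innocuous
# looking, and severely ill-conditioned already for n = 10.
H = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])

x_true = np.ones(n)
b = H @ x_true

# Perturb the data by a relative 1e-10 and solve both systems.
rng = np.random.default_rng(0)
b_pert = b * (1.0 + 1e-10 * rng.standard_normal(n))

x = np.linalg.solve(H, b)
x_pert = np.linalg.solve(H, b_pert)

rel_change = np.linalg.norm(x_pert - x) / np.linalg.norm(x)
print("relative change in data:     1e-10")
print(f"relative change in solution: {rel_change:.2e}")
```

The tiny data perturbation is amplified by many orders of magnitude in the solution, and no choice of solver can do anything about it.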

So how do you know if your problem is sensitive to small perturbations in the data?  Careful analysis can reveal the issue, but for some classes of problems there are measures of the sensitivity, or the ‘conditioning’, of the problem that can be used.  For example, it can be shown that small perturbations in a matrix can lead to large relative changes in the inverse of the matrix if the ‘condition number’ of the matrix is large.  The condition number is the product of the norm of the matrix and the norm of its inverse.  Similarly, small changes in the elements of a matrix will cause its eigenvalues to have large errors if the condition number of the matrix of eigenvectors is large.  Of course, determining a condition number is itself a computational problem, but accurate methods for estimating condition numbers are available.
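For a matrix A the definition reads κ(A) = ‖A‖·‖A⁻¹‖; in the 2-norm this equals the ratio of the largest to smallest singular value, which is what `np.linalg.cond` computes:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])  # nearly singular: the rows are almost parallel

# kappa(A) = ||A|| * ||A^{-1}||, here in the spectral (2-) norm.
kappa_direct = np.linalg.norm(A, 2) * np.linalg.norm(np.linalg.inv(A), 2)
kappa = np.linalg.cond(A, 2)  # equivalently, sigma_max / sigma_min

print(f"condition number: {kappa:.4e}")
```

A condition number around 10⁴ means that roughly four of your 16 digits of accuracy can be lost when inverting this matrix or solving a system with it.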

An example of an ill-conditioned matrix is the covariance matrix associated with a Gaussian distribution.  The following figure shows the condition number of a covariance matrix obtained by taking samples from a Gaussian correlation function at 500 points, using a step size of 0.1, for varying length-scales [1].  The condition number increases rapidly, reaching 10⁷ for a length-scale of only L = 0.2, and for length-scales larger than 0.28 the condition number exceeds the range of machine precision and cannot even be calculated accurately.
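This behaviour is easy to reproduce. The sketch below builds the covariance matrix from a Gaussian correlation function (taken here as exp(−r²/2L²), one common convention; the exact constants used in [1] may differ) on 500 points with step 0.1, and computes its condition number for a few length-scales:

```python
import numpy as np

dx, n = 0.1, 500
x = dx * np.arange(n)
r = x[:, None] - x[None, :]  # matrix of pairwise separations

def gaussian_cov(L):
    """Covariance matrix from a Gaussian correlation with length-scale L."""
    return np.exp(-r**2 / (2.0 * L**2))

for L in (0.05, 0.1, 0.2):
    print(f"L = {L:4.2f}: condition number = {np.linalg.cond(gaussian_cov(L)):.2e}")
```

Doubling the length-scale multiplies the condition number by orders of magnitude, which is exactly the rapid blow-up described above.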

This result is surprising and very significant for numerical weather prediction (NWP), as the inverses of covariance matrices are used to weight the uncertainty in the model forecast and in the observations used in the analysis phase of weather prediction.  The analysis is achieved by the process of data assimilation, which combines a forecast from a computational model of the atmosphere with physical observations obtained from in situ and remote sensing instruments.  If the weighting matrices are ill-conditioned, then the assimilation problem also becomes ill-conditioned, making it difficult to obtain an accurate analysis and subsequently a good forecast [2].  Furthermore, the worse the conditioning of the assimilation problem, the more time it takes to do the analysis.  This matters because the forecast needs to be produced in real time, so the analysis needs to be done as quickly as possible.

One way to deal with an ill-conditioned system is to rearrange the problem so as to reduce the conditioning whilst retaining the same solution.  A technique for achieving this is to ‘precondition’ the problem using a transformation of the variables.  This is used regularly in NWP operational centres, with the aim of ensuring that the uncertainties in the transformed variables all have a variance of one [1][2].  In this table we can see the effects of the length-scale of the error correlations in a data assimilation system on the number of iterations it takes to solve the problem, with and without preconditioning [1].  The conditioning of the problem is improved and the work needed to solve the problem is significantly reduced.  So checking and controlling the conditioning of a computational problem is always important!
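The effect can be sketched with a toy weighting matrix whose error variances span several orders of magnitude. Rescaling each variable by its standard deviation, so that the transformed variances are all one, slashes both the condition number and the number of conjugate-gradient iterations (the matrix, sizes and tolerances here are illustrative, not those of any operational system):

```python
import numpy as np

def cg_iterations(A, b, tol=1e-8, max_iter=5000):
    """Plain conjugate gradients; returns the iteration count to reach tol."""
    x = np.zeros_like(b)
    r = b.copy()
    p = r.copy()
    rs = r @ r
    for k in range(1, max_iter + 1):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) <= tol * np.linalg.norm(b):
            return k
        p = r + (rs_new / rs) * p
        rs = rs_new
    return max_iter

rng = np.random.default_rng(1)
n = 200
# A well-conditioned correlation matrix C with unit diagonal...
M = rng.standard_normal((n, 2 * n))
C = M @ M.T / (2 * n)
C /= np.sqrt(np.outer(np.diag(C), np.diag(C)))
# ...scaled by standard deviations spanning four orders of magnitude.
sd = np.sqrt(np.logspace(0, 4, n))
A = C * np.outer(sd, sd)
b = rng.standard_normal(n)

# Preconditioning: substituting x = sd * v turns A x = b into C v = b / sd,
# i.e. a system in variables whose variances are all one.
print("condition numbers:", f"{np.linalg.cond(A):.1e}", f"{np.linalg.cond(C):.1e}")
print("CG iterations:    ", cg_iterations(A, b), "vs", cg_iterations(C, b / sd))
```

The preconditioned system has the same solution (after undoing the change of variables) but converges in far fewer iterations, which is the whole point of the operational control-variable transform.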

[1]  S.A. Haben. 2011. Conditioning and Preconditioning of the Minimisation Problem in Variational Data Assimilation. PhD thesis, Department of Mathematics and Statistics, University of Reading.

[2]  S.A. Haben, A.S. Lawless and N.K. Nichols.  2011. Conditioning of incremental variational data assimilation, with application to the Met Office system, Tellus, 63A, 782–792. (doi:10.1111/j.1600-0870.2011.00527.x)

## First recording of surface flooding in London using CCTV cameras

On Friday 2nd June 2017 the Met Office issued a yellow warning of heavy rain, with possible hail and lightning, over London. The Environment Agency also issued a number of flood alerts for London covering the same period. This allowed us to test our newly set-up system for recording open-data CCTV images from Transport for London traffic cameras (aka JamCams).

Following the flood alerts, we set up recording for all Transport for London (TfL) cameras that were within the main flood alert areas; there were four such areas in London.

This resulted in downloading images from just over 110 CCTV cameras across the marked areas in Figure 1. Downloading started on many cameras at 2:30pm on 2nd June 2017 and continued for 24 hours, with an image downloaded every 5 minutes.

Many of these images showed heavy rain as it passed over London on the afternoon of 2nd June 2017; some cameras even captured images of the lightning seen over North London. However, we did not capture any images of flooding in the four coloured areas in Figure 1.

However, following a flood alert on the Transport for London site allowed us to capture surface flooding that happened on the North Circular Road between 4pm and 7pm, resulting in traffic jams in the area.

The surface flooding was very localised and only one camera captured it: the one just below the blue circle in Figure 4. We recorded both still and video images from this camera.

We are currently setting up similar systems to download live traffic CCTV images from Leeds, Bristol, Exeter, Newcastle, Glasgow, and Tewkesbury.

## 2017 Annual European Geosciences Union (EGU) Conference

by Liz Cooper

The 2017 Annual European Geosciences Union (EGU) conference was held at the International Centre in Vienna from 23rd to 28th April.  During that time over 14,000 scientists from 107 countries shared ideas and results in the form of talks, posters and PICOs. The PICO (Presenting Interactive COntent) format is a relatively new idea for presenting work, where participants prepare an interactive presentation. In each PICO session the presenters first take turns to give a two-minute summary of their work to a large audience. The PICOs are then each displayed on an interactive touch screen, and conference delegates can chat to the presenters and get further details on the research, with the PICO for illustration. This format has features of both traditional poster and oral presentations and provides great scope for audience participation. I saw several which took advantage of this, including a very popular flood forecasting adventure game by fellow Reading PhD student Louise Arnal.

I was delighted to be able to present some of my own recent results at EGU, in a talk titled ‘The effect of domain length and parameter estimation on observation impact in data assimilation for inundation forecasting.’ (see photo)

Presenting at an international conference was a really valuable and enjoyable experience, if a little daunting beforehand. I found it a really useful opportunity to get feedback from experts in the field and find out more about work by people with related interests.

The EGU conference has many participants and covers a huge range of topics from atmospheric and space science to soil science and geomorphology. My research deals with data assimilation for inundation forecasting, so I was most interested in sessions within the Hydrological Sciences and Nonlinear Processes in Science programmes. Even within those disciplines there was a huge breadth of research on display and I saw some really interesting work on synchronization in data assimilation, approaches to detection of floods from satellite data and various methods for measuring and characterizing floods.

As well as subject-specific programmes, there was also a very good Early Career Scientist (ECS) programme at EGU, with networking events, discussion sessions and a dedicated ECS lounge with much appreciated free coffee!

EGU was a hugely enjoyable experience and Vienna is a beautiful city with excellent transport links. With so many parallel sessions it’s really essential to plan which talks and posters are a priority in advance but I would heartily recommend it to anyone involved in geosciences research.

## Sewer network challenge at MathsForesees study group 2017

by Sanita Vetra-Carvalho

The second Maths Foresees study group was held on 3rd-6th April 2017, hosted by the Turing Gateway to Mathematics at the Isaac Newton Institute, Cambridge. The Maths Foresees network was established in May 2015 under the EPSRC Living with Environmental Change (LWEC) umbrella to forge strong links between researchers in the applied mathematics and environmental science communities and end-users of environmental research. Maths Foresees events take a collaborative approach to industry problem solving: over the course of four days, mathematical and environmental scientists explore real challenges posed by companies operating in the environmental sector.

In this second event, five industry challenges were presented to the participants (around 50 in total) by three companies: JBA, Sweco and the Environment Agency. All of the challenges this year were linked to flooding issues.

I joined the group interested in solving the sewer modelling challenge proposed by Sweco and presented by James Franklin. The urban flood model used by Sweco, InfoWorks ICM (Integrated Catchment Modelling) by Innovyze, comprises a subsurface sewer network and a street-level road surface model. The two are coupled via manholes, but smaller drains/gullies are not included, since the exact locations of gullies and drains are not known (it would be very costly in manpower to locate them) and, more importantly, it would be computationally infeasible to model gullies directly in InfoWorks. As a consequence, the model does not represent floodwater drainage correctly: in a typical simulation, floodwater stays on the road surface and does not drain away as it should. This results in inaccurate flood extents, particularly in urban environments (see the image below of a typical simulation of a storm).

The challenge for the group was to see how we could improve the model representation of the collection network; that is, how to represent gullies in the model so as to simulate a more realistic exchange (sinks and sources) of surface water between the sewer network and the surface model.

Our group had two and a half days to propose a solution. Our initial idea, to couple a 2D surface shallow water model to a 1D sewer network model (also a shallow water model) to model realistic fluid exchange between the two, turned out to be too difficult to accomplish in the limited time period. Hence we concentrated our efforts on the main problem at hand: how to represent realistic sinks in the model without directly resolving the gullies. To this end, our group produced two 2D surface models: a 2D shallow water model and a 2D diffusive wave model. The second model was developed in parallel, as in future it would be easier to couple to a 1D drainage network. Our group ran both models on an idealised road setting: a 100 m straight road with three manholes (one every 30 m) and 20 gullies (one every 10 m), which were directly resolved (see image below).

We compared runs where we resolved the gullies directly on the mesh, every 10 m on both sides of the road (which is computationally infeasible for Sweco to run, but is the most realistic case), with line-sink runs where we averaged the effect of the gullies along the road and removed the surface water from the model at each gridpoint adjacent to the pavement. Both of our 2D surface models showed that the line-sink representation of the gullies removed approximately the same volume of surface water as directly resolving each gully, making the line sink a realistic and computationally affordable way to represent the effect of gullies in the model. While our solution lacked the two-way flow exchange between the surface model and the sewer network, we proposed that, if implemented in the InfoWorks model, the volume of water removed through the line sinks would become a source in the sewer network through the nearest manhole. Our findings and proposed solution to the Sweco challenge were positively received by James Franklin. A full report of our solution will be published on the Turing Gateway to Mathematics site over the next two months.
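The bookkeeping behind the line sink is simple: spread the total capacity of the resolved gullies uniformly over the kerb-side grid cells, so the surface model removes the same volume of water without meshing each gully. A minimal sketch, with the gully capacity and grid spacing as illustrative assumptions (not Sweco/InfoWorks values):

```python
# Idealised road from the study: 100 m long, a gully every 10 m on each side.
road_length = 100.0                                # m
gully_spacing = 10.0                               # m
n_gullies = 2 * int(road_length / gully_spacing)   # 20 gullies in total
q_gully = 0.01                                     # m^3/s per gully (assumed)

dx = 2.0                                           # grid spacing (assumed)
n_kerb_cells = 2 * int(road_length / dx)           # kerb cells, both sides

total_capacity = n_gullies * q_gully               # m^3/s into the sewer
line_sink = total_capacity / n_kerb_cells          # m^3/s removed per kerb cell

print(f"resolved gullies remove {total_capacity:.2f} m^3/s in total")
print(f"line sink removes {line_sink:.4f} m^3/s per cell "
      f"({n_kerb_cells * line_sink:.2f} m^3/s in total)")
```

By construction the two representations have the same total capacity; what the group's 2D runs verified is that this equivalence also holds for the volume actually removed once the water has to flow across the surface to reach the sinks.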

I very much enjoyed being part of the Maths Foresees study group 2017 and am very thankful to all the organisers at the Maths Foresees network and the Turing Gateway to Mathematics for organising this event, as well as the Isaac Newton Institute for hosting it. It was very refreshing to be ‘locked’ into the Isaac Newton Institute alongside the other participants to solve these challenges in a mentally rich and inspiring environment. The event naturally offered very fruitful ground for networking too. I would encourage any mathematician interested in solving environmental problems to take part in future Maths Foresees events!

## Welcome

by Sarah Dance

Data Assimilation for the REsilient City (DARE) is a research project and network funded by an EPSRC Senior Fellowship in Digital Technology for Living with Environmental Change.

Data assimilation is an emerging mathematical technique for improving predictions from large and complex forecasting models by combining uncertain model predictions with a diverse set of observational data in a dynamic feedback loop. The project will use advanced data assimilation to combine a range of advanced sensors with state-of-the-art computational models, producing a step-change in the skill of forecasts of urban natural hazards such as floods, snow, ice and heat stress. For more information about the research programme click here.
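For a single scalar quantity, the combination at the heart of data assimilation reduces to a weighted average: the forecast and the observation are blended with weights inversely proportional to their error variances. A minimal sketch with made-up numbers:

```python
def analyse(x_b, var_b, y, var_o):
    """Combine background forecast x_b and observation y, weighting each by
    the inverse of its error variance (the scalar minimum-variance update)."""
    k = var_b / (var_b + var_o)      # gain: how much to trust the observation
    x_a = x_b + k * (y - x_b)        # analysis state
    var_a = (1.0 - k) * var_b        # analysis error variance
    return x_a, var_a

# A forecast of 2.0 (variance 1.0) meets an observation of 2.8 (variance 0.25):
x_a, var_a = analyse(2.0, 1.0, 2.8, 0.25)
print(f"analysis = {x_a:.2f}, variance = {var_a:.2f}")
```

The analysis lands closer to the more accurate of the two inputs, and its variance is smaller than either, which is why cycling this update in a feedback loop steadily improves the forecast.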

The Fellowship is held by Dr Sarah L. Dance at the University of Reading and she is working together with a team of other researchers and stakeholders. The Fellowship will influence the future research agenda for how digital technologies can be applied in new and transformative ways to help the human and natural environment be more resilient and adaptable to climate change. In addition to an innovative research programme,  Dr Dance is acting as a Champion for this area, developing outreach activities to other researchers, policy makers and industry through workshops, networks and other mechanisms.