A good set of match-ups is core to the harmonisation process. In FIDUCEO we pinpointed as many sources of error in space-based instruments as possible and quantified the uncertainties associated with the measurements.
Whilst tackling the problem from the analytical side using mathematical models, we also needed to compare the modelled results with data from other sensors to assess the quality of the derived mathematics.
In the FIDUCEO context, we call a match-up a “point” measurement that is matched by another “point” measurement sufficiently close in space and time – excluding, of course, the trivial case of neighbouring measurements from the same acquisition. In other words, we required match-up pixels that covered the same place on Earth and were acquired at almost the same time. The smaller the target time delta, the more challenging the detection algorithms become, especially once it drops below the acquisition duration of a satellite data product (around 45 minutes for polar-orbiting instruments).
In most cases, we extended this definition to include the neighbouring pixels of a sensor acquisition, covering a symmetrical window of n by m pixels around the match-up point. This allowed us to perform some calculations to characterise the data (e.g. homogeneity checks, cloud-shadow detection, etc.).
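A window extraction with a simple homogeneity screen might look like the following sketch; the array layout, function names, and the 2 % threshold are illustrative assumptions, not the FIDUCEO implementation:

```python
import numpy as np

def extract_window(band, line, pixel, n=5, m=5):
    """Extract a symmetrical n-by-m window centred on a match-up pixel.

    `band` is a hypothetical 2-D array of sensor data; positions outside
    the swath are filled with NaN so edge match-ups keep a fixed shape.
    """
    half_n, half_m = n // 2, m // 2
    window = np.full((n, m), np.nan)
    y0, y1 = max(0, line - half_n), min(band.shape[0], line + half_n + 1)
    x0, x1 = max(0, pixel - half_m), min(band.shape[1], pixel + half_m + 1)
    window[(y0 - (line - half_n)):(y1 - (line - half_n)),
           (x0 - (pixel - half_m)):(x1 - (pixel - half_m))] = band[y0:y1, x0:x1]
    return window

def is_homogeneous(window, max_rel_std=0.02):
    """Simple homogeneity screen: keep only windows whose relative
    standard deviation stays below a threshold (2 % here)."""
    values = window[np.isfinite(window)]
    return values.size > 0 and np.std(values) / np.mean(values) <= max_rel_std
```

Keeping the window shape fixed (with NaN padding at swath edges) simplifies downstream statistics, since every match-up carries the same array layout.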
We wanted to compare acquisitions of different spaceborne sensors that were spatially and temporally close, so that we could assume similar atmospheric conditions. We also wanted to compare satellite sensor data with in situ measurements, like AERONET data, in order to have measurements close to the measured entity.
To do this we developed a flexible software framework that allowed us to run various match-up tasks using several satellite- and ground-based instruments. The framework had an open design with various plug-in points, allowing easy algorithmic updates and the addition of new sensor data.
One challenge was the appropriate handling of a large input dataset that contained multiple complete sensor mission datasets, spanning long global time series. At the core of the operation was a high-performance database that contained the metadata (acquisition time, geo-boundary, and more …) of every single satellite product and ground measurement.
This database was used to detect possible match-up candidate products by intersecting the bounding geometries of the sensor acquisitions and comparing their sensing time ranges.
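The coarse metadata screening can be sketched as follows; the dictionary keys and the flat bounding-box test stand in for the database query and the real polygon intersection, and are not the FIDUCEO schema:

```python
def boxes_intersect(box_a, box_b):
    """Axis-aligned (lon_min, lat_min, lon_max, lat_max) intersection test."""
    return not (box_a[2] < box_b[0] or box_b[2] < box_a[0] or
                box_a[3] < box_b[1] or box_b[3] < box_a[1])

def candidate_pairs(products_a, products_b, max_time_delta=300.0):
    """Coarse candidate screening over product metadata.

    Each product is assumed to be a dict with a bounding box ('box') and
    sensing start/stop times in seconds ('start', 'stop').
    """
    pairs = []
    for a in products_a:
        for b in products_b:
            # Sensing time ranges must come within the allowed delta ...
            if a['start'] - max_time_delta > b['stop'] or \
               b['start'] - max_time_delta > a['stop']:
                continue
            # ... and the bounding geometries must intersect on ground.
            if boxes_intersect(a['box'], b['box']):
                pairs.append((a, b))
    return pairs
```

In a production setting the pairwise loop would be replaced by a spatial index in the database, but the screening logic is the same.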
In the next step, we applied a fine-grained analysis that calculated the overflight timing for the intersected area. This information allowed us to reject false match-ups where one satellite passes the common area much earlier or later than the other, relative to the target time delta. We did this by applying a time coding to the geometries, derived from the satellite orbit parameters. We could have retrieved the same information by opening the files and reading the data, but doing it without opening the files speeds up processing considerably.
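A minimal stand-in for such a time coding is to interpolate the acquisition time linearly along-track between the product start and stop times; the field names and the linear model below are hypothetical simplifications of the orbit-parameter approach:

```python
def overflight_time(start, stop, line, num_lines):
    """Estimate the acquisition time of a given scan line by linear
    interpolation between product start and stop times."""
    return start + (stop - start) * line / (num_lines - 1)

def within_time_delta(prod_a, line_a, prod_b, line_b, max_delta=300.0):
    """Reject false match-ups: keep the pair only if the two estimated
    overflight times of the common area are within `max_delta` seconds."""
    t_a = overflight_time(prod_a['start'], prod_a['stop'],
                          line_a, prod_a['num_lines'])
    t_b = overflight_time(prod_b['start'], prod_b['stop'],
                          line_b, prod_b['num_lines'])
    return abs(t_a - t_b) <= max_delta
```

Because this uses only metadata (start, stop, line counts), no product file needs to be opened to apply the check.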
At this point, we had the match-up locations and times – and the associated data files. In the following steps we extracted the data at the match-up locations and checked additional constraints on the data, such as cloud detection, data homogeneity, or ground-surface constraints.
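Such constraint checks fit naturally into a chain of boolean screening predicates. This generic sketch assumes each match-up is represented as a dict and each check as a callable; the representation is ours, not FIDUCEO's:

```python
def screen(matchups, checks):
    """Apply a chain of boolean screening checks (e.g. a cloud flag test
    or a homogeneity test) and keep only match-ups that pass all of them."""
    return [m for m in matchups if all(check(m) for check in checks)]
```

New constraints then become one-line additions to the `checks` list, which matches the plug-in style of the framework described above.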
When all this was done, the software wrote the remaining data into match-up files (usually one per month) that could be used for scientific analysis.
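The monthly partitioning can be sketched as a simple grouping step; the `'time'` key (a POSIX timestamp in seconds) is an assumed representation:

```python
from collections import defaultdict
from datetime import datetime, timezone

def group_by_month(matchups):
    """Group match-ups by (year, month) of their acquisition time, as a
    stand-in for writing one output file per month."""
    groups = defaultdict(list)
    for m in matchups:
        t = datetime.fromtimestamp(m['time'], tz=timezone.utc)
        groups[(t.year, t.month)].append(m)
    return dict(groups)
```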
While there is still no “true value” associated with spaceborne measurements, comparing data from different sensors measuring the same geophysical situation allows us to verify error hypotheses and modelling mathematics, giving much stronger confidence in the results.
Establishing correlation structures for the match-up process
It is important to ensure that the match-up process recognises the error correlation between the radiance observations in different match-ups. The file format definition provides a common input file format for all the FIDUCEO harmonisation processes and also shows how error correlation structures can be built into the input files:
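As a toy illustration of such a structure, consider a systematic effect that enters each match-up as a moving average over neighbouring calibration cycles; a W-matrix then maps a vector of independent underlying errors onto the correlated errors of the match-up series. This is only a sketch of the idea, not the file-format layout described in the linked document:

```python
import numpy as np

def w_matrix_moving_average(num_matchups, window=5):
    """Sketch of a W-matrix for errors correlated by a moving-average
    calibration: each match-up's systematic-effect error is the mean of
    `window` underlying independent errors shared with its neighbours.
    """
    num_independent = num_matchups + window - 1
    w = np.zeros((num_matchups, num_independent))
    for i in range(num_matchups):
        # Row i averages the `window` underlying errors starting at i.
        w[i, i:i + window] = 1.0 / window
    return w
```

The implied error covariance of the effect is then proportional to `w @ w.T`, which is band-diagonal: match-ups closer together than `window` share underlying errors and are correlated, while distant ones are not.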
See this document for information about setting up the “W-matrix” that holds the error correlation information: FIDUCEO-SH-W Matrix in Harmonisation.docx