SAR (Synthetic Aperture Radar) data acquired today by remote sensing satellites require a particularly large storage capacity: 1 GB for a pair of ERS scenes (100 km by 100 km), more than 20 GB of raw data for a time-series analysis (> 40 images), expanding to more than 140 GB during processing. A complete and systematic exploitation of this huge mass of archived SAR data requires the development of specific processing techniques that extract geological, geomorphological, or tectonic knowledge. The availability of large data sets, together with uncertainties on physical model parameters (ground truth and/or acquisition systems), calls for data mining techniques and information fusion. The objective of the EFIDIR project is to develop an open platform for archival storage and processing, suited on the one hand to the specificities of SAR data and, on the other hand, to the large time series of SAR images from which ground-motion measurements are extracted.

In the nineties, data acquired by the European (ERS-1 and ERS-2), Japanese (JERS), and Canadian (Radarsat-1) SAR satellites showed that judicious handling of time series over a given site can detect, over large areas, ground motions on the order of a fraction of a wavelength. In particular, in examples of urban subsidence, differential interferometry (DInSAR) on a network of stable scatterers (also called Permanent Scatterers, or PS) has been claimed to reach millimetre-per-year accuracy on displacement rates; however, it requires large time series (more than 40 images). Today, a new generation of SAR satellites is in orbit or soon to be launched: ENVISAT, ALOS, Radarsat-2, TerraSAR-X, and COSMO-SkyMed.
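The DInSAR principle invoked above can be sketched in a few lines of numpy: the phase of the product of one complex sample with the conjugate of the co-registered second sample encodes the line-of-sight displacement d through phi = -4*pi*d/lambda, which is how sub-wavelength motion becomes measurable. The wavelength and displacement values below are illustrative assumptions, not project data.

```python
import numpy as np

# Illustrative sketch (not project code): two co-registered SLC samples
# of the same scatterer, the second shifted by a small line-of-sight
# displacement d (assumed value, 5 mm).
wavelength = 0.0566  # ERS C-band wavelength in metres (~5.66 cm)
d = 0.005            # line-of-sight displacement in metres

# Master and slave complex samples with identical backscatter; the
# two-way path delay adds 4*pi*d/wavelength to the slave phase.
master = 1.0 * np.exp(1j * 0.3)
slave = 1.0 * np.exp(1j * (0.3 + 4 * np.pi * d / wavelength))

# Differential interferometric phase: angle of master * conj(slave).
phi = np.angle(master * np.conj(slave))

# Invert the path-delay relation: d = -phi * lambda / (4*pi).
d_est = -phi * wavelength / (4 * np.pi)
print(f"{d_est * 1000:.2f} mm")  # → 5.00 mm
```

The sketch works only because the displacement stays below half a fringe; larger motions wrap the phase, which is why the full chain described later needs a phase-unwrapping stage.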
Often combined with new acquisition modes (such as polarimetry), these sensors deliver new data that need even more storage capacity (being better resolved), open new research fields, and also require modification, or even a complete reformulation, of the processing chains used until now. The need for a "Masse de Données et Connaissances" project covering the broad spectrum of the knowledge-processing chain stems on the one hand from the type and volume of the data (raw data or multi-variate Single Look Complex data provided by space agencies) and, on the other hand, from the various phenomena that perturb and hide the information sought: speckle, decorrelation, atmospheric artefacts, etc. To extract thematic information from multi-temporal interferometric and polarimetric data, a complex processing chain must be undertaken (SAR synthesis, interferogram computation, PolSAR/PolInSAR decomposition, phase unwrapping, geocoding, artefact correction, geophysical model inversion, etc.). Today, this chain exists only partially, in expensive, closed commercial software that is very difficult to adapt to new geophysical contexts or new data types (high resolution, polarimetry, etc.) for which it was not initially designed. For this specific data base, therefore, the design, realisation, and validation of specific codes, complementary to tools already available as free software, are the ultimate goal of this project. The proposed application expresses the expectations of "end-user" geophysicists towards a "signal processing" community: to design and make operational original approaches able to overcome current bottlenecks, thanks to the exploitation of the data mass and by taking geophysical expertise into account. The project relies on databases linked to original phenomena such as:
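One step of the processing chain listed above, phase unwrapping, can be illustrated in one dimension with numpy's `np.unwrap`. This is a minimal toy under the assumption of a smooth, noise-free phase ramp; real interferograms require 2-D unwrapping algorithms robust to noise and decorrelation, which is precisely one of the chain's difficulties.

```python
import numpy as np

# A linear deformation ramp spanning three fringes (6*pi radians).
true_phase = np.linspace(0.0, 6 * np.pi, 200)

# Interferometric measurement wraps the phase into (-pi, pi].
wrapped = np.angle(np.exp(1j * true_phase))

# 1-D unwrapping: remove the 2*pi jumps between neighbouring samples.
# Valid here because the sample-to-sample step stays below pi.
unwrapped = np.unwrap(wrapped)

# Up to a constant offset, the original ramp is restored.
print(np.allclose(unwrapped, true_phase))  # → True
```

The toy succeeds only because adjacent samples differ by less than pi; under-sampling or noise violates that assumption, which is why unwrapping is treated as a full processing stage rather than a utility call.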
The project is divided into three subprojects: