State of the art

Due to the specificity of the data, SAR processing requires both dedicated methodological tools and thematic ones in order to provide pertinent information to Earth scientists. The sensors acquire complex data (each pixel is described by a complex number), and a specific step (SAR processing) is needed to obtain an image. Moreover, the size of some archives calls for dedicated platforms combining archival storage and data processing. At present, the processing steps are rather well understood, so that several commercial packages (EarthView from Vexcel [Earthview], Diapason V4 from Altamira [Diapason], the GAMMA RS platform [Gamma], ...) and open source packages (ROI_PAC [ROI_PAC], RAT [RAT], ...) are available. These general-purpose platforms, dealing mainly with interferometric processing, provide convincing results overall. Yet, for thematic applications, the existing tools have shortcomings. The commercial packages are "black boxes" that prevent specialists from fine-tuning the parameters; it is impossible to improve the processing by adding steps dedicated to their application. With open source software, complementary tools have to be written to handle the specific applications. EFIDIR aims to create these complementary tools in a single, harmonized open source platform. Within the project, these developments will serve three natural phenomena which are, in most cases, not yet properly imaged: small-amplitude landslide ground motion, glacier evolution, and surface displacements on volcanoes. All three are typical cases for which extracting useful information requires processing larger data sets.

In the nineties, conventional synthetic aperture radar interferometry was widely used to map ground deformation induced by large events, such as earthquakes (co-seismic and post-seismic deformation, Massonnet et al., 1993, 1994) or volcanic eruptions (Massonnet et al. 1995). Efforts were then directed at improving data inversion methods and geophysical modelling to better recover the displacement field at the source of the events: slip distribution on a fault during an earthquake (Pedersen et al 20...), or non-uniform opening of a magmatic fissure (Amelung et al. 2000, Fukushima et al. 2005). However, most past studies focused on the measurement and analysis of large displacements (several tens of centimetres or more). To make significant progress in our knowledge of lithosphere behaviour, there is a crucial need to detect small and slow signals (a few mm per year over tens of years), even at large spatial wavelengths (typically the deformation induced by silent earthquakes, Lowry et al. 2001), and to follow their spatial and temporal variations. In volcanic areas, characterizing the temporal evolution of the deformation is the only way to constrain the pressure conditions inside the magmatic system and to obtain predictive information on future eruptions. In a tectonic context, it constrains stress loading models. Despite the huge archive of SAR data, pre-eruptive deformation studies remain mainly based on sparse GPS data (Lowry et al 2001, Sturkell et al 2006), and interseismic studies along faults are most often limited to characterizing a "mean behaviour" of the faults (Socquet et al., 2007).
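To fix orders of magnitude, the minimal sketch below (an illustration, not taken from any of the packages cited above; the wavelength value, sign convention and array names are assumptions) converts an unwrapped differential phase into a line-of-sight displacement and averages a stack of interferograms, which is the simplest way to push the detection threshold from centimetre-level single measurements towards mm-per-year rates.

```python
import numpy as np

# Illustrative sketch: unwrapped differential phase -> LOS displacement,
# then stacking of interferograms to lower the uncorrelated noise floor.
WAVELENGTH = 0.056  # metres, roughly C-band (ERS/ENVISAT class), assumed here

def phase_to_los(unwrapped_phase):
    """Unwrapped differential phase (radians) -> LOS displacement (metres).
    Motion away from the sensor is taken as positive phase (assumed convention)."""
    return -WAVELENGTH / (4.0 * np.pi) * unwrapped_phase

def mean_los_rate(phase_stack, spans_in_years):
    """phase_stack: (N, rows, cols) unwrapped phases; spans_in_years: (N,).
    Averaging N independent interferograms reduces uncorrelated noise by ~1/sqrt(N)."""
    rates = phase_to_los(phase_stack) / spans_in_years[:, None, None]
    return rates.mean(axis=0)
```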
The two main limitations to overcome when detecting small signals with SAR data are spatial and temporal decorrelation, and the tricky step of removing the atmospheric phase screen. To remove the atmospheric phase screen, Ferretti [Ferretti et al. 1999] proposed an original approach that examines a stack of differential interferograms and follows the temporal evolution of phase-stable pixels. This technique was called the "Permanent Scatterer" (PS) technique; in the PS analysis, the selection of the PS is based on an analysis of the pixel amplitude (a selection criterion of this kind is sketched below). A related approach selects stable pixels from coherence-based criteria: this technique, proposed by Berardino [Berardino et al. 2002], defines "Coherent Scatterers". Yet, these techniques have to be adapted to the EFIDIR applications, for instance glacier monitoring in a specific context: high mountainous areas.

Another aim of the EFIDIR project is to deal with polarimetric SAR data. This rather new SAR modality increases the volume of the initial data sets but, if correctly processed, provides complementary information useful for reaching the sought-after knowledge. Polarimetric SAR techniques have been widely addressed over the last decade. Several approaches were derived to relate basic characteristics of the targets directly to elements of the polarimetric covariance matrix [Boer-98] [Vanz-89]. More recently, polarimetric decomposition theorems were introduced to investigate the intrinsic physical properties of a natural medium by evaluating the underlying scattering mechanisms [Free-98] [Clou-97] [Clou-95] (an eigendecomposition-based example is sketched below). All these approaches interpret the polarization of the backscattered wave and establish a relation between the physical properties of the medium and polarimetric transformations. This tight relation between the physical properties of natural media and their polarimetric features leads to highly descriptive classification results that can be interpreted by analyzing the underlying scattering mechanisms. Interferometric data provide information on the coherence of the scattering mechanisms and can be used to retrieve the structure and complexity of the observed media [Papa-99] [Clou-98] [Papa-01] [Ferr-05]. Recently, the first PolInSAR measurements over an alpine glacier were acquired at L and P bands [Steb 05]. To facilitate the accessibility and exploitation of multi-polarised SAR datasets, the European Space Agency has funded a dedicated software tool: POLSARPRO (Polarimetric SAR Data Processing and Educational Tool), developed under contract to ESA by a consortium comprising IETR at the University of Rennes 1, the Microwaves and Radar Institute (HR) of DLR and AEL Consultants, together with Dr Mark Williams. Yet, the specificity of wave propagation and interaction inside the ice of temperate glaciers requires both dedicated experiments and the development of new processing tools.
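As a concrete illustration of the amplitude-based PS selection mentioned above, the following minimal sketch computes the amplitude dispersion index over a stack of calibrated SAR amplitudes and thresholds it. The threshold value and function names are illustrative assumptions, not the exact procedure of [Ferretti et al. 1999].

```python
import numpy as np

# Sketch of PS candidate selection: the amplitude dispersion index
# D_A = sigma_A / mu_A is a proxy for phase stability at high SNR.
def ps_candidates(amplitude_stack, threshold=0.25):
    """amplitude_stack: calibrated amplitudes, shape (N_images, rows, cols).
    Returns a boolean mask of PS candidates (True where D_A is below threshold)."""
    mean_amp = amplitude_stack.mean(axis=0)
    std_amp = amplitude_stack.std(axis=0)
    dispersion = np.where(mean_amp > 0, std_amp / mean_amp, np.inf)
    return dispersion < threshold
```

A coherence-based selection in the spirit of the "Coherent Scatterers" would follow the same pattern, with the interferometric coherence replacing the amplitude dispersion as the stability criterion.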
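In the same spirit, the eigendecomposition-based polarimetric analysis referred to above can be sketched as a simplified computation of the entropy and mean alpha angle from per-pixel 3x3 coherency matrices, in the spirit of [Clou-97]. This is an illustration under stated assumptions, not the reference implementation found in tools such as POLSARPRO.

```python
import numpy as np

# Sketch of an entropy / mean-alpha computation from Hermitian 3x3
# coherency matrices T (one per pixel); variable names are illustrative.
def entropy_alpha(T):
    """T: array of shape (..., 3, 3) of coherency matrices.
    Returns (normalised entropy H, probability-weighted mean alpha in degrees)."""
    eigval, eigvec = np.linalg.eigh(T)                 # real eigenvalues, ascending
    eigval = np.clip(eigval, 1e-12, None)              # guard against numerical negatives
    p = eigval / eigval.sum(axis=-1, keepdims=True)    # pseudo-probabilities
    H = -(p * np.log(p)).sum(axis=-1) / np.log(3.0)    # entropy, normalised to [0, 1]
    # alpha_i: angle of the first component of each eigenvector
    alpha_i = np.degrees(np.arccos(np.abs(eigvec[..., 0, :])))
    alpha = (p * alpha_i).sum(axis=-1)                 # weighted mean alpha angle
    return H, alpha
```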
A last important step deals with the data fusion methods required to combine the different information sources. Indeed, the uncertainty of a source (in a broad sense) includes three main aspects: the measurement variability due to the influence of randomly varying variables, the measurement imprecision due to systematic biases of parameters involved in the measurement derivation, and the measurement reliability related to the working conditions of the measurement devices. Probability theory is adequate for representing variability, standard interval analysis is suited to handling imprecision, and confidence measures deal with reliability. None of these approaches can cover the full uncertainty picture. However, fuzzy/possibility theory provides an interesting framework for a unified representation of uncertainty [Dubois 2006]. Indeed, fuzzy interval calculus generalizes interval calculus and thus allows systematic unknown effects, such as the offset in the differential interferometry measurement, to be modelled, while also handling the gradual imprecision that often characterizes expert knowledge (a small sketch of fuzzy interval propagation is given at the end of this section). Moreover, a possibility distribution can represent a family of probability distributions, e.g. the family of continuous unimodal distributions with a specified mode and support [Mauris et al. 2001, Dubois et al. 2004]. This is useful when, as is often the case, only a rather poor statistical model of the measurements is available (e.g. only the standard deviation is known and not the shape of the probability distribution). Finally, thanks to the dual measure of possibility, i.e. the necessity measure, it is possible to model the reliability of a source depending on its operating conditions. For example, the correlation level can be used as a confidence measure to assess the reliability of the measured displacements [Trouvé et al 1999].

In the thematic domain, data fusion methods need to be developed at least at two levels. The first concerns the correction of tropospheric effects in differential interferometry. Nowadays, data from dense permanent GPS networks have, in theory, a strong potential for correcting atmospheric artefacts in the interferograms. Reciprocally, the detection of anomalies in interferograms, for example along shorelines, is potentially a strong indicator of areas where GPS data need to be analyzed with more care, possibly including new strategies for taking lateral heterogeneities of the troposphere into account. The same question of fusing space and ground data to correct tropospheric effects arises in mountainous areas, where local climatic anomalies are frequent. The second domain of application of fusion is the comparison/validation of ground motion measurements (assuming the troposphere is properly corrected). Although the natural complementarity of interferometry and GPS is well understood and recognized, very few papers present reliable and convincing comparisons yet. This is partly due to the lack of a global strategy for acquiring both categories of data in the most efficient way. For small objects like volcanoes, glaciers and landslides, the limited size makes it somewhat easier to plan a global observational strategy, despite the problems due to the topography, in particular the limited access to the field. Another important element to be taken into account in future fusion work is the routine and quick delivery of results, which is important for monitoring hazardous areas during a crisis. The EFIDIR platform will take this last step into account, thereby achieving an effective processing of the SAR data and providing selected useful information to the Earth scientists.
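To illustrate how fuzzy interval calculus generalizes classical interval calculus, the short sketch below propagates a measurement and an imprecise systematic offset through addition, level by level, using alpha-cuts of triangular possibility distributions. All numbers and names are purely illustrative assumptions.

```python
import numpy as np

# Sketch of fuzzy interval propagation via alpha-cuts: each fuzzy interval
# is handled as a family of nested crisp intervals, and the addition is
# classical interval addition performed at each alpha level.
def triangular_cuts(low, mode, high, alphas):
    """Alpha-cuts [low_a, high_a] of a triangular fuzzy interval."""
    return np.stack([low + alphas * (mode - low),
                     high - alphas * (high - mode)], axis=-1)

alphas = np.linspace(0.0, 1.0, 5)
measurement = triangular_cuts(-2.0, 0.0, 2.0, alphas)   # e.g. mm of LOS motion (illustrative)
offset = triangular_cuts(-0.5, 0.1, 0.7, alphas)        # imprecise systematic offset (illustrative)

# Interval addition at each alpha level: lower bounds add, upper bounds add.
corrected = measurement + offset
for a, (lo, hi) in zip(alphas, corrected):
    print(f"alpha={a:.2f}: [{lo:+.2f}, {hi:+.2f}]")
```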