Airborne gravity field modelling
Ahmed Hamdi Hemida Mahmoud Mansi
Abstract
Regional gravity field modelling by means of the remove-restore procedure is nowadays widely applied in different contexts by geodesists and geophysicists: for instance, it is the most widely used technique for regional gravimetric geoid determination, and it is used in exploration geophysics to predict grids of gravity anomalies. In the present work, in addition to a review of the basic concepts of the classical remove-restore procedure, new algorithms have been studied and implemented to compute the so-called terrain correction (required to reduce the observed gravitational signal) and to model the stochastic properties, in terms of covariance functions, of the gravitational signal, required to grid sparse observations. Geodesists and geophysicists have long been concerned with the computation of the vertical attraction due to the topographic masses, the so-called terrain correction, both for high-precision geoid estimation and to isolate the gravitational effect of anomalous masses in geophysical exploration. The increasing resolution of recently developed digital terrain models, the increasing number of observation points due to the extensive use of airborne gravimetry in geophysical exploration, and the increasing accuracy of gravity data introduce major challenges for the terrain correction computation. Classical methods such as prism or point-mass approximations are too slow, while Fourier-based techniques are usually too approximate for the required accuracy. A new hybrid prism- and FFT-based software, called GTE and designed explicitly for geophysical applications, was therefore developed to compute terrain corrections as accurately as prism-based methods and as quickly as Fourier-based ones. GTE considers not only the effects of topography and bathymetry but also those due to sedimentary layers and to the Earth crust-mantle discontinuity (the so-called Moho). After recalling the main classical algorithms for the computation of the terrain correction, the basic mathematical theory of the software and its practical implementation are explained. GTE computes accurate terrain corrections in a very short time compared with GRAVSOFT and Tesseroids: even the slowest GTE profiler outperforms both, in terms of computational time, when computing terrain effects on grids with constant height, on sparse points, and on the surface of the provided digital elevation model, while the fast profiler gives an overview with a standard deviation of the errors below the measurement accuracy in a time at least one order of magnitude shorter than that required by the other software. A filtering procedure for raw airborne gravity data, based on a Wiener filter in the frequency domain and able to exploit the information coming from all the collected data, has also been developed and tested. During this filtering, biases and systematic errors potentially present in the airborne data are also corrected by means of GOCE satellite observations. A remove-like step, removing the low and high frequencies of the observations, is performed to reduce the magnitude of the signal to be filtered; these components are restored afterwards in a restore-like step, once the data have been filtered. The filtering step required almost 7 minutes to filter about 440,000 observations when the residual terrain correction required to reduce the data is already available, and about 30 minutes when it has to be computed.
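The frequency-domain Wiener step described above can be illustrated with a minimal sketch. The code below assumes FFTW3 is available and that the signal and noise power spectral densities of a single residual track have already been estimated; the function name and array layout are hypothetical and are not taken from the thesis software.

```c
/* Minimal sketch of a frequency-domain Wiener filter applied to one residual
 * airborne gravity track.  Assumes FFTW3 and pre-estimated signal/noise power
 * spectral densities (psd_signal, psd_noise) at each discrete frequency.
 * Names and structure are illustrative, not the thesis implementation. */
#include <fftw3.h>

void wiener_filter_track(double *residual, int n,
                         const double *psd_signal, const double *psd_noise)
{
    int nfreq = n / 2 + 1;                       /* size of real-to-complex output */
    fftw_complex *spec = fftw_malloc(sizeof(fftw_complex) * nfreq);

    fftw_plan fwd = fftw_plan_dft_r2c_1d(n, residual, spec, FFTW_ESTIMATE);
    fftw_plan inv = fftw_plan_dft_c2r_1d(n, spec, residual, FFTW_ESTIMATE);

    fftw_execute(fwd);                           /* residual track -> spectrum */

    for (int k = 0; k < nfreq; k++) {
        /* Wiener gain at each frequency: W(f) = S_s(f) / (S_s(f) + S_n(f)) */
        double w = psd_signal[k] / (psd_signal[k] + psd_noise[k]);
        spec[k][0] *= w;                         /* real part */
        spec[k][1] *= w;                         /* imaginary part */
    }

    fftw_execute(inv);                           /* back to the space domain */
    for (int i = 0; i < n; i++)
        residual[i] /= n;                        /* FFTW leaves the 1/n factor un-applied */

    fftw_destroy_plan(fwd);
    fftw_destroy_plan(inv);
    fftw_free(spec);
}
```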
The filtered data are then gridded by applying classical least-squares collocation. An innovative procedure automates the estimation of the covariance matrix: an empirical two-dimensional power spectral density is fitted with a series of Bessel functions of the first kind and order zero, which guarantees a positive definite covariance function. Finally, the along-track noise remaining after filtering is estimated by means of a cross-over analysis. The study of the expected noise allows the estimation of a covariance function of the noise itself, providing valuable information to be used (in future work) in the subsequent gridding step. In fact, integrating the cross-over analysis into an iterative procedure of filtering and gridding would yield a better grid estimation and noise prediction for airborne gravimetric data. All the above algorithms have been implemented in a suite of software modules, developed in C and able to exploit parallel computation, and tested on a real airborne survey. The results of these tests, as well as the computational times required, are also reported and discussed.
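The positive-definite covariance model obtained from the Bessel-series fit can be sketched as follows. This is a minimal illustration assuming the POSIX j0() routine from <math.h> and purely hypothetical coefficients; in practice the coefficients come from the fit of the empirical 2D power spectral density, and their non-negativity is what guarantees positive definiteness.

```c
/* Minimal sketch: evaluating an isotropic planar covariance model written as
 * a finite series of Bessel functions of the first kind and order zero,
 *     C(r) = sum_k a2[k] * J0(lambda[k] * r),   with a2[k] >= 0,
 * which is positive definite as long as the fitted coefficients a2[k] are
 * non-negative.  Coefficients and wavenumbers below are placeholders. */
#include <math.h>   /* j0(): POSIX Bessel function of the first kind, order zero; link with -lm */
#include <stdio.h>

double covariance(double r, const double *a2, const double *lambda, int nterms)
{
    double c = 0.0;
    for (int k = 0; k < nterms; k++)
        c += a2[k] * j0(lambda[k] * r);   /* each J0 term is itself positive definite in 2D */
    return c;
}

int main(void)
{
    /* Hypothetical fitted variances (mGal^2) and wavenumbers (1/km). */
    double a2[]     = { 4.0, 2.5, 1.0 };
    double lambda[] = { 0.05, 0.15, 0.40 };

    for (double r = 0.0; r <= 100.0; r += 20.0)          /* planar distance in km */
        printf("C(%5.1f km) = %8.4f mGal^2\n", r, covariance(r, a2, lambda, 3));
    return 0;
}
```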
File: AIRBORNE_GRAVITY_FIELD_MODELLING.pdf
Open access since 27/01/2017
Description: PhD dissertation titled "Airborne Gravity Field Modelling", by Ahmed Hamdi Hemida Mahmoud Mansi, Cycle 28, School of Environmental and Infrastructural Engineering, Geomatics Area, Politecnico di Milano
Size: 10.2 MB
Format: Adobe PDF
Documents in POLITesi are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/10589/116547