Theories of Synthetic Aperture Radar
Published in Maged Marghany, Automatic Detection Algorithms of Oil Spill in Radar Images, 2019
Imaging radar is categorized further into Real Aperture Radar (RAR) and Synthetic Aperture Radar (SAR). RAR transmits a narrow beam of pulsed radio waves in the range direction, at right angles to the flight direction, which is known as the azimuth direction (Fig. 6.5). RAR receives the backscatter from objects on the ground and transforms the received signals into a radar image. Figure 6.5 shows that the strip of terrain to be imaged extends from point A to point B. Point A, being nearest to the nadir point, is said to lie at near range, and point B, being the furthest, at far range. The distance between A and B defines the swath width. The distance between any point within the swath and the radar is called its slant range. The ground range for any point within the swath is its distance from the nadir point (the point on the ground directly beneath the radar) [89].
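Under a flat-earth approximation, the geometric terms above follow from the platform altitude and the off-nadir angles to the near- and far-range points. The sketch below uses hypothetical values (altitude and angles are illustrative, not taken from the text):

```python
import math

# Flat-earth sketch of the side-looking geometry: a radar at altitude h
# views point A at off-nadir angle theta_near and point B at theta_far.
h = 700e3                      # platform altitude in metres (hypothetical)
theta_near = math.radians(20)  # off-nadir angle to point A (near range)
theta_far = math.radians(45)   # off-nadir angle to point B (far range)

# Slant range: line-of-sight distance from the radar to the ground point.
slant_near = h / math.cos(theta_near)
slant_far = h / math.cos(theta_far)

# Ground range: distance from the nadir point to the ground point.
ground_near = h * math.tan(theta_near)
ground_far = h * math.tan(theta_far)

# Swath width: ground distance between points A and B.
swath = ground_far - ground_near

print(f"slant range A: {slant_near/1e3:.1f} km, B: {slant_far/1e3:.1f} km")
print(f"swath width: {swath/1e3:.1f} km")
```

Note that slant range always exceeds the altitude and grows toward far range, which is why raw radar imagery is compressed at near range relative to ground coordinates.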
Sensors and Instruments
Published in Julio Sanchez, Maria P. Canton, William Perrizo, Space Image Processing, 2018
Figure 4.13 describes some of the terminology used in radar imaging. The aircraft or spacecraft moves above the terrain in the azimuth direction. The radar pulse spreads outward in range, in what is called the look direction. The line-of-sight from the radar to a ground point defines the slant range to that point. The distance between the point directly below the aircraft or spacecraft (called the nadir) and the ground target point is called the ground range. The point on the target closest to the flight line is the near range limit, while the point farthest from the flight path is called the far range. The angle between a horizontal plane and a given slant-range direction is the depression angle for points along that line. The complementary angle, measured from a vertical plane, is referred to as the look angle.
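The complementary relation between depression angle and look angle can be checked numerically. This small sketch uses made-up altitude and ground-range values to recover both angles and the slant range:

```python
import math

# Illustrative geometry: platform altitude h and ground range to a target.
# Depression angle (down from horizontal) and look angle (out from vertical)
# are complements, as the text states.
h = 5000.0             # platform altitude in metres (hypothetical)
ground_range = 8000.0  # distance from nadir to the target, in metres

# Depression angle: from the horizontal plane down to the slant line.
depression = math.degrees(math.atan2(h, ground_range))
# Look angle: from the vertical plane out to the slant line.
look = math.degrees(math.atan2(ground_range, h))

slant_range = math.hypot(h, ground_range)

print(f"depression = {depression:.2f} deg, look = {look:.2f} deg")
print(f"sum = {depression + look:.1f} deg, slant range = {slant_range:.1f} m")
```

The two angles always sum to 90 degrees, so sensor documentation may quote either one; converting between them is a simple subtraction.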
Using Sentinel-1 and Google Earth Engine cloud computing for detecting historical flood hazards in tropical urban regions: a case of Dar es Salaam
Published in Geomatics, Natural Hazards and Risk, 2023
Biadgilgn Demissie, Sabine Vanhuysse, Tais Grippa, Charlotte Flasse, Eleonore Wolff
Terrain correction is required to compensate for topographic effects on the SAR backscatter (Veloso et al. 2017). It is needed because of the side-looking geometry of SAR systems: every target on the observed terrain is mapped onto the slant-range domain (Adiri et al. 2017; Filgueiras et al. 2019). Even though the Sentinel-1 images available in GEE are corrected for geometric distortions, radiometric terrain normalization was still required. Radiometric terrain normalization masks pixels in active layover and shadow areas of the image (Veloso et al. 2017; Vollrath et al. 2020; Mullissa et al. 2021). Two angular-based correction methods are available for radiometric terrain normalization, based on volume and surface scattering (Vollrath et al. 2020). The volume scattering model is suited to applications related to vegetation mapping. For urban applications, as in our case, the angular-based surface scattering model is most appropriate (Vollrath et al. 2020). We therefore implemented the surface model for radiometric terrain normalization in GEE.
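The workflow above can be sketched outside GEE with a generic cosine-ratio normalization and a layover/shadow mask. This is only an illustration: the exact surface-scattering model of Vollrath et al. (2020), its reference angle, and its GEE implementation differ in detail, and the pixel values here are invented:

```python
import numpy as np

def normalize_surface(gamma0_db, local_inc_deg, ref_inc_deg=39.0):
    """Scale gamma nought (dB) from the local incidence angle to a
    reference angle via a generic cosine-ratio correction (illustrative
    form only; see Vollrath et al. 2020 for the actual model)."""
    lia = np.radians(local_inc_deg)
    ref = np.radians(ref_inc_deg)
    correction_db = 10.0 * np.log10(np.cos(lia) / np.cos(ref))
    return gamma0_db - correction_db

def layover_shadow_mask(local_inc_deg):
    """Keep pixels outside active layover (LIA <= 0) and shadow (LIA >= 90)."""
    lia = np.asarray(local_inc_deg)
    return (lia > 0.0) & (lia < 90.0)

# Toy scene: three pixels - one in layover, one valid, one in shadow.
gamma0 = np.array([-8.0, -11.5, -14.0])  # backscatter in dB (hypothetical)
lia = np.array([-5.0, 33.0, 95.0])       # local incidence angles in degrees

valid = layover_shadow_mask(lia)
flat = np.full_like(gamma0, np.nan)      # masked pixels stay NaN
flat[valid] = normalize_surface(gamma0[valid], lia[valid])
print(valid)
print(flat)
```

Masking before normalizing avoids taking the logarithm of non-positive cosines in layover and shadow geometry, which is why the invalid pixels are left as NaN rather than corrected.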
Integration of SAR and multi-spectral imagery in flood inundation mapping – a case study on Kerala floods 2018
Published in ISH Journal of Hydraulic Engineering, 2022
Jesudasan Jacinth Jennifer, Subbarayan Saravanan, Devanantham Abijith
SAR calibration allows the pixel values of the image to be interpreted directly as the radar backscatter of the scene. However, radiometric correction must be applied first to avoid radiometric bias. A calibration vector is included as an annotation in the product, allowing simple conversion of image intensity values into sigma, beta, or gamma nought values. Sigma nought is known as the scattering coefficient, a conventional measure of the strength of radar signals reflected by a distributed scatterer. Beta nought, also known as the radar brightness coefficient, is the reflectivity per unit area in slant range. Gamma nought is obtained by normalizing the sigma nought values with respect to the incidence angle, removing some of the range dependency. Here, the intensity image of the VV polarisation band of the SAR data product was converted into sigma nought values.
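Applying the annotated calibration vector reduces to an element-wise division: for Sentinel-1 products, ESA's published relation is sigma0 = DN^2 / A_sigma^2, with the calibration LUT interpolated to every pixel. The values below are illustrative, not from a real scene:

```python
import numpy as np

# Hedged sketch of DN-to-sigma-nought conversion for a Sentinel-1-style
# product. dn holds raw VV intensity digital numbers; a_sigma is the
# sigma-nought calibration vector already interpolated to each pixel.
dn = np.array([[120.0, 340.0],
               [560.0,  75.0]])       # hypothetical digital numbers
a_sigma = np.array([[590.0, 600.0],
                    [610.0, 620.0]])  # hypothetical calibration values

sigma0 = dn**2 / a_sigma**2           # backscatter in linear power units
sigma0_db = 10.0 * np.log10(sigma0)   # conventional dB scale

print(np.round(sigma0_db, 2))
```

Substituting the beta- or gamma-nought calibration vector in place of `a_sigma` yields the corresponding brightness or normalized backscatter values from the same expression.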
Subsidence of sinkholes in Wink, Texas from 2007 to 2011 detected by time-series InSAR analysis
Published in Geomatics, Natural Hazards and Risk, 2019
Yun Shi, Yaming Tang, Zhong Lu, Jin-Woo Kim, Junhuan Peng
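The excerpt opens at the symbol definitions for an interferometric phase equation that did not survive extraction. A standard form of that decomposition, consistent with the symbols defined below but offered as a reconstruction rather than the authors' exact expression, is:

```latex
\varphi \;=\; \frac{4\pi\, B \cos(\theta - \alpha)}{\lambda\, r \sin\theta}\, h_{err}
\;+\; \varphi_{def} \;+\; \varphi_{atm} \;+\; \varphi_{noise}
```

Here the first term is the phase contribution of the topographic height error through the perpendicular baseline B cos(θ − α), and the remaining terms are defined in the text that follows.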
where B is the length of the baseline, α is the baseline orientation angle, λ is the radar wavelength, θ is the look angle, r is the slant range from the target to the reference satellite, herr is the topographic height error, and φdef, φatm, and φnoise are, respectively, the deformation, atmospheric, and noise components of the interferometric phase. InSAR has been widely applied to studies of geohazards such as landslides, earthquakes, volcanoes, and land subsidence (e.g. Massonnet et al. 1993; Amelung et al. 1999; Bawden et al. 2001; Lu and Danskin 2001; Lu et al. 2010; Lu and Dzurisin 2014). Decorrelation in space and time between two SAR acquisitions can, however, affect the robustness of the results (Zebker and Villasenor 1992), and artefacts due to atmospheric and orbital errors can significantly degrade measurement accuracy (e.g. Ferretti et al. 2001; Li et al. 2005). Multi-interferogram techniques, including SBAS (Small Baseline Subset) InSAR, PSInSAR (Persistent Scatterers InSAR), and SqueeSAR, have been proposed to overcome these problems and retrieve time-series deformation histories (e.g. Ferretti et al. 2001, 2011; Hooper et al. 2004, 2007; Hooper and Zebker 2007; Hooper 2008; Lu and Zhang 2014; Qu et al. 2015).