Extracting Water Bodies using Remote Sensing
Arpit Shah
Mapping Workflows for Supply Chain and Operations Improvement - Mapmyops.com | Intelloc Mapping Services, Kolkata
This article was originally published on Mapmyops' website and updates, if any, are made there. Hence, I'd recommend reading this article on the website itself: https://www.mapmyops.com/mapping-water-bodies-using-optical-and-radar-satellite-imagery
________________________________________________________________
Those who have read my pre-2022 articles on Imagery Analytics, or similar articles online, will have noticed that they tend to present the final output and the writer's interpretation of it. The procedure involved in deriving that output from raw imagery is generally hidden from view. After all, it makes more sense to look at a finished piece of art and admire its beauty or spot its flaws than to watch an artist painstakingly prepare it. However, if you have seen those soothing, fast-forwarded art videos on social media, you'll know that there is a lot of joy to be derived from watching the process of creating art as well.
There is much to like about analyzing satellite imagery. Much of the appeal lies in its 'explorative' nature - like an archaeological expedition, one keeps digging and uncovers layers of artefacts along the way.
In this article, we will walk through the process involved in satellite imagery analytics using the case of water body mapping. We will analyze two types of satellite imagery - optical and radar. Along this exhaustive journey, you will get an idea of what geotechnology has to offer in this space and of the various methods used to extract meaningful information from satellite imagery.
________________________________________________________________
Optical satellite imagery looks much like a normal photograph. Information is captured by the satellite's sensor from visible light emitted by a passive source (mainly sunlight), which strikes the earth's surface and is reflected back. Radar satellite imagery, on the other hand, is formed from the reflection of radio waves (not visible to the human eye) emitted by an active source - the transmitter onboard the satellite itself.
Notice from the infographic in Figure 1 above that radio waves can penetrate the earth's atmosphere just like visible light.
Both optical and radar imagery have their own set of features, advantages and disadvantages.
In this exercise, we will map the water bodies of Masuria in northern Poland, an area famous for its Lake District (more than 2,000 lakes). Mapping water bodies serves multiple purposes and forms part of numerous geo-workflows such as flood mapping, water-level monitoring, studying tidal effects, tracking changes in river course, monitoring glacial movement, etc.
Just imagine how difficult it would be to manually plot the contours of a water body with coordinate-level precision from just an image or via a cartographic survey. Satellite imagery analytics, in comparison, saves a lot of time and effort.
Before we analyze satellite images, we need to acquire them. Imagery acquisition requires a little know-how. Without getting into technicalities (see the first few sections of this video to know more), one needs to know the desired imagery features, resolution and timeline, and where and how to acquire the images; the cost of acquisition plays an important role too. In this exercise I have used optical and radar imagery acquired by the EU's Copernicus Earth Observation satellites, which is free and openly accessible to registered users.
________________________________________________________________
Mapping Water Bodies using Optical (Satellite) Imagery
________________________________________________________________
The Normalized Difference Water Index (NDWI), developed by McFeeters in 1996, is the analysis method we will use to detect, amplify and demarcate water bodies.
The crux of the formula above is simple to understand, although the technical aspect - how different types of electromagnetic waves interact with different types of objects - is indeed complex.
Sunlight, the primary source of illumination in optical imagery, is composed of visible light (which we can see) as well as near infrared (NIR) and short-wave infrared (SWIR) waves (which we can't see).
Water appears relatively bright in visible light but very dark in NIR and SWIR light. Soil and vegetation, in contrast, appear darker in visible light but brighten up significantly when exposed to NIR and SWIR light.
This is because different objects have different reflectivity values when exposed to different types of electromagnetic waves (see image below). In our analysis, we use the NDWI formula because it exploits this principle to good effect and helps us ascertain whether the pixel under review is a water pixel or not.
The NDWI formula checks whether the near-infrared value of a pixel exceeds its visible-light (green) value. If it does, both the numerator and the resulting output turn negative, implying a land pixel. If, on the other hand, the visible-light value exceeds the near-infrared value, both the numerator and the output remain positive, implying a water pixel. The fascinating aspect is that a pixel with a low visible-light value will invariably have a high NIR value, because water has contrasting reflectance characteristics compared to soil and vegetation. Thus, this ratio formula amplifies the dominant property of the surface captured in the pixel.
NDWI has its flaws - for example, urban infrastructure can be misclassified as water. More sophisticated techniques are available to detect water bodies with greater precision.
We will save this mask / layer of information and use it to visualize and compare against the radar imagery output towards the end of this article.
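For readers who prefer code, here is a minimal Python sketch of the same idea: computing McFeeters' NDWI, (Green - NIR) / (Green + NIR), and thresholding it into a binary water mask. The file names, the use of the rasterio library and the zero threshold are illustrative assumptions, not the exact workflow followed in this article.

```python
# Illustrative sketch (not the exact workflow used in the article):
# compute NDWI = (Green - NIR) / (Green + NIR) and threshold it into a water mask.
import numpy as np
import rasterio

# Hypothetical single-band exports of the green and NIR bands (e.g. Sentinel-2 B03 and B08)
with rasterio.open("B03_green.tif") as src:
    green = src.read(1).astype("float32")
    profile = src.profile
with rasterio.open("B08_nir.tif") as src:
    nir = src.read(1).astype("float32")

ndwi = (green - nir) / (green + nir + 1e-10)   # small epsilon avoids division by zero

water_mask = (ndwi > 0).astype("uint8")        # positive NDWI -> water, otherwise land

profile.update(count=1, dtype="uint8", nodata=0)
with rasterio.open("water_mask_optical.tif", "w", **profile) as dst:
    dst.write(water_mask, 1)
```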
________________________________________________________________
Mapping Water Bodies using Radar (Satellite) Imagery
One of the most important aspects of radar imagery is that it is neither impacted by the earth's atmosphere nor dependent on an external source of illumination (sunlight) to capture information. The radar signal generated by the satellite's transmitter can penetrate clouds and capture surface property information with precision even at night. This makes it more useful in certain situations where optical imagery is insufficient or ineffective. The way radar imagery is processed is also significantly different. Let's begin.
While we were able to extract the water mask from just one optical image, we will use five radar images (acquired between October and December 2020) to extract the water mask. We will delve into the reason for this later in the article.
This is how radar imagery over the same area (the Masurian Lake District) looks -
________________________________________________________________
The terrain-corrected image resembles what we see on a regular 2D map in terms of a) the distances between objects as well as b) its orientation, with true north at the top. Originally, the image appeared the way the satellite captured the information along its orbit, i.e. in radar geometry.
________________________________________________________________
Each of the processing steps above is time-consuming. Doing the same steps on all five images would take considerable time and increase the chance of human error. To automate the process and reduce the time taken, the imagery analytics software offers two highly useful features: Graph Builder and Batch Processing. Essentially, we chain the processing steps in order, set the parameters, and run them on all five imagery datasets together with a click of a button. (For this to work, all the images need to share common processing parameters, which in our case they do.)
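Graph Builder and Batch Processing are features of ESA's SNAP toolbox; assuming that is the software in use here, a roughly equivalent batch run could also be scripted against its gpt command-line tool instead of the GUI, provided the processing chain has been exported from Graph Builder as graph.xml and declares ${input} and ${output} variables. All paths and names below are placeholders, not the actual project files.

```python
# Hypothetical sketch: apply an exported SNAP graph (graph.xml) to all five
# Sentinel-1 scenes via the gpt command-line tool instead of the GUI.
import subprocess
from pathlib import Path

scenes = sorted(Path("s1_scenes").glob("*.zip"))   # the five radar acquisitions
Path("processed").mkdir(exist_ok=True)

for scene in scenes:
    target = Path("processed") / (scene.stem + "_preprocessed.dim")
    subprocess.run(
        ["gpt", "graph.xml",
         f"-Pinput={scene}",       # assumes the graph reads a ${input} variable
         f"-Poutput={target}"],    # assumes the graph writes to a ${output} variable
        check=True,
    )
```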
________________________________________________________________
The last preprocessing step is coregistration. For the software, each image is a separate entity, so it can't compare and analyze pixel values at the same coordinates across different images. Therefore, we coregister the images, i.e. stack the five images into a single product, so that the software knows they all share the same geographic extent and the same pixel location can be analyzed across images in an 'apples to apples' manner.
________________________________________________________________
Albert Einstein is said to have famously remarked: 'If I had an hour to solve a problem I'd spend 55 minutes thinking about the problem and 5 minutes thinking about the solution.'
I find this applicable to satellite imagery workflows as well. The actual analysis takes little time, but the processing methodology and steps involved are time-consuming and need to be carried out error-free to ensure the data is set up correctly for analysis and interpretation to be effective.
Isn't this useful?
These kinds of temporal images are useful for seeing where, for example, agricultural harvesting has happened, as the water bodies tend to remain the same across the timeline (although important changes like flooding, course changes and water-level changes can be captured using this method as well).
________________________________________________________________
Why do we do that? It is akin to balancing out the sour taste in a dish by adding sugar. The base images have inherent noise which leads to variation in the data values. By mean averaging, we balance out these variations so that the noise is tempered and we get a smoother image. You may not be able to discern the smoothness above, but in full-screen mode in the software the effect can be observed very clearly. This is why we opted for multi-temporal analysis using five radar images - it helps us balance out noise and outliers in the pixel values, which aids correct analysis and interpretation.
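As a rough numerical illustration of what the mean averaging does, here is a short sketch assuming the coregistered stack has been exported as a five-band raster (one band per acquisition date; file names are placeholders):

```python
# Illustrative sketch: per-pixel temporal mean of the five coregistered
# backscatter bands, which tempers the noise in the averaged image.
import rasterio

with rasterio.open("coregistered_stack.tif") as src:    # hypothetical 5-band export
    stack = src.read().astype("float32")                 # shape: (bands, rows, cols)
    profile = src.profile

mean_backscatter = stack.mean(axis=0)                     # average across the five dates

profile.update(count=1, dtype="float32")
with rasterio.open("mean_sigma0.tif", "w", **profile) as dst:
    dst.write(mean_backscatter, 1)
```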
________________________________________________________________
Just see the histogram of pixel intensity (image to your left) - there is very little difference between the values of a black pixel and a white pixel in our mean-averaged image; the values are tiny and clustered tightly together.
How can we threshold / classify the data, then?
________________________________________________________________
Essentially, we have greatly amplified the contrast between pixel values by converting the linear pixel data to a logarithmic (decibel) scale.
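A minimal sketch of that conversion and the subsequent thresholding, assuming the averaged backscatter from the previous step is still in linear scale; the -20 dB cut-off is an illustrative value, not the threshold actually used here.

```python
# Illustrative sketch: convert linear backscatter to decibels and classify the
# very dark pixels as water (smooth water reflects the radar signal away from
# the sensor). File names and the -20 dB threshold are placeholders.
import numpy as np
import rasterio

with rasterio.open("mean_sigma0.tif") as src:
    sigma0 = src.read(1).astype("float32")
    profile = src.profile

sigma0_db = 10.0 * np.log10(np.maximum(sigma0, 1e-6))   # linear -> dB, guarding against log(0)

water_mask = (sigma0_db < -20.0).astype("uint8")        # dark pixels -> water

profile.update(count=1, dtype="uint8", nodata=0)
with rasterio.open("water_mask_radar.tif", "w", **profile) as dst:
    dst.write(water_mask, 1)
```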
________________________________________________________________
Logarithms are useful after all!
________________________________________________________________
________________________________________________________________
The final step is to compare both the water masks in a Geographic Information System.
The optical imagery over our study area was acquired in spring, whereas the radar imagery was acquired in the winter months. Therefore, there are bound to be certain changes in the landscape which account for some of the difference. Also, since the methods of data acquisition and processing differ between optical and radar, small deviations in the output are expected as well.
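Beyond a visual comparison in the GIS, the agreement between the two masks can also be quantified. Here is a quick sketch, assuming both masks have been resampled to a common grid and saved as 0/1 rasters (file names are placeholders):

```python
# Illustrative sketch: intersection-over-union between the optical and radar
# water masks as a simple agreement score (1.0 means the masks are identical).
import numpy as np
import rasterio

with rasterio.open("water_mask_optical.tif") as src:
    optical = src.read(1) > 0
with rasterio.open("water_mask_radar.tif") as src:
    radar = src.read(1) > 0

intersection = np.logical_and(optical, radar).sum()
union = np.logical_or(optical, radar).sum()
print("Water-mask agreement (IoU):", intersection / union)
```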
A PowerPoint presentation comparing both imagery types (optical and radar) can be found here.
________________________________________________________________
Final output: In the slider above, you can compare our water mask output overlaid on Google basemap imagery. (The slider is not visible on LinkedIn Articles - view it from this hyperlink)
Isn't our water body detection accurate? As you review this output, I hope you had fun getting to know the process involved in satellite imagery analytics, as I had indicated in the first paragraph of this article.
Thank you for reading!
________________________________________________________________
ABOUT US
Intelloc Mapping Services | Mapmyops is engaged in providing mapping solutions to organizations which facilitate operations improvement, planning & monitoring workflows. These include but are not limited to Supply Chain Design Consulting, Drone Solutions, Location Analytics & GIS Applications, Site Characterization, Remote Sensing, Security & Intelligence Infrastructure, & Polluted Water Treatment. Projects can be conducted pan-India and overseas.
Several demonstrations for these workflows are documented on our website. For your business requirements, reach out to us via email - [email protected] or book a paid consultation (video meet) from the hyperlink placed at the footer of the website's landing page.
Regards, Arpit Shah
Many thanks to EU Copernicus, RUS Copernicus & QGIS for the training material and software. A video output using older imagery can be viewed from here.