Monday, December 8, 2014

Lab 12: Hyperspectral Remote Sensing

Goals

The goal of this lab is to develop knowledge of how to properly process and identify features within hyperspectral remotely sensed data. Specifically, this lab taught me how to detect noise in hyperspectral data, remove the spectral channels that have excess noise, and detect target features within hyperspectral images using reflectance spectra.

Methods

In the first portion of the lab I used the Spectral Analysis Workstation in ERDAS Imagine. Before opening this tool, however, I performed anomaly detection on the hyperspectral data provided for this lab using the anomaly detection function in ERDAS Imagine. Anomaly detection is the process of searching an input image for pixels whose spectral signatures differ greatly from those of most other pixels in the image. Put simply, it asks the question "is there anything unusual in the image?" The first step after opening the anomaly detection wizard was to select the input image using the "image only" option in the dialog. Next, I kept the threshold at its default value and ran the wizard, selecting the option to create an output file and proceed to the Spectral Analysis Workstation. Once the anomaly detection finished processing, the workstation opened and displayed the anomaly mask (Fig. 1). Some white areas appear against the black background; to examine these I selected the "swipe" tool from the menu and moved the swipe position to see its effect on the image. The main point of conducting anomaly detection is to help identify bad bands that should be removed from the analysis. Identifying bad bands is very important because of the very large number of bands used in hyperspectral remote sensing.
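The idea behind the anomaly mask can be sketched with the classic RX detector, which scores each pixel by its Mahalanobis distance from the global image statistics and thresholds the scores. This is only a minimal illustration of the concept, not ERDAS Imagine's exact algorithm; the array shapes and threshold value are assumptions for the example.

```python
import numpy as np

def rx_anomaly_mask(cube, threshold):
    """cube: (rows, cols, bands) hyperspectral image -> boolean anomaly mask."""
    rows, cols, bands = cube.shape
    pixels = cube.reshape(-1, bands).astype(float)
    mean = pixels.mean(axis=0)
    # Regularize the covariance slightly so it is always invertible.
    cov = np.cov(pixels, rowvar=False) + 1e-6 * np.eye(bands)
    cov_inv = np.linalg.inv(cov)
    diff = pixels - mean
    # Squared Mahalanobis distance of every pixel from the image mean.
    scores = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)
    return (scores > threshold).reshape(rows, cols)

# Tiny synthetic example: uniform background with one planted anomaly.
rng = np.random.default_rng(0)
img = rng.normal(100, 1, size=(10, 10, 5))
img[3, 7] += 50          # anomalous spectrum at row 3, column 7
mask = rx_anomaly_mask(img, threshold=30.0)
```

Pixels whose score exceeds the threshold show up as the white regions of an anomaly mask like Fig. 1.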


(Fig. 1) The output image of the anomaly detection is shown above. The regions shown in white are where anomalies are present within the input image.

Because so many bands are collected, some could be corrupted by absorption at particular wavelengths, whether from issues with the sensor or from atmospheric distortion. If these bad bands are included in the metric algorithms, the resulting calculations can be incorrect. To determine which bands were bad, I ran the anomaly detection wizard again, selecting the "bad band selection tool" option, which opened a display in the Spectral Analysis Workstation. The bad band selection tool shows a preview of the image, the data histogram, and the mean plot of the selected bands. I selected the bands provided by my professor and classified them as "bad bands" (Fig. 2). Once all the bands were selected, I ran the program and opened the new output image in the Spectral Analysis Workstation (Fig. 3).
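Once the bad bands are identified, excluding them amounts to dropping those channels from the data cube before any further analysis. A minimal sketch, assuming the noisy band indices (0-based here) are already known; the band list below is purely illustrative, not the list used in the lab:

```python
import numpy as np

def drop_bad_bands(cube, bad_bands):
    """cube: (rows, cols, bands); bad_bands: iterable of band indices to remove."""
    bad = set(bad_bands)
    keep = [b for b in range(cube.shape[2]) if b not in bad]
    return cube[:, :, keep]

# A 224-band cube like the one in this lab, with a few hypothetical bad bands.
cube = np.zeros((5, 5, 224))
clean = drop_bad_bands(cube, bad_bands=[0, 1, 107, 108, 109, 153, 154])
print(clean.shape)  # (5, 5, 217)
```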



(Fig. 2) The bad band selection tool allows the user to select which of the 224 bands in this particular hyperspectral image should not be used in the output image analysis.



(Fig. 3) The output image of the anomaly detection after removing the bad bands from the original image shows a greater number of anomalies compared to Fig. 1.

The next process I conducted on the hyperspectral images was target detection. I first used the simple target detection method and then target detection using spectral libraries. Target detection is a process that searches a hyperspectral image for a specific material (or target) thought to be present only in small amounts. Using the target detection wizard, I created a new project and then selected the target spectrum selection process. For the simple target detection method I input a spectral library provided with this lab. For the spectral-library method I used data from the USGS spectral library. This method was somewhat more complex, however, as I needed to consult the sensor information tool to make sure the spectral library data matched that of my image. I also excluded the bad bands from my final output using the same process shown above.
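One common way to match pixels against a library spectrum is the spectral angle, which compares the shape of each pixel's spectrum to a reference spectrum regardless of overall brightness. ERDAS Imagine's target detection wizard has its own metrics; this is only a sketch of the matching idea, and the angle threshold is an assumption.

```python
import numpy as np

def spectral_angle_map(cube, target, max_angle_rad):
    """Flag pixels whose spectral angle to the target spectrum is small."""
    rows, cols, bands = cube.shape
    pixels = cube.reshape(-1, bands).astype(float)
    t = np.asarray(target, dtype=float)
    cos = pixels @ t / (np.linalg.norm(pixels, axis=1) * np.linalg.norm(t) + 1e-12)
    angles = np.arccos(np.clip(cos, -1.0, 1.0))
    return (angles < max_angle_rad).reshape(rows, cols)

# Toy 3-band cube: one pixel matches the hypothetical target spectrum.
cube = np.ones((4, 4, 3))
cube[2, 1] = [0.1, 0.9, 0.2]
mask = spectral_angle_map(cube, target=[0.1, 0.9, 0.2], max_angle_rad=0.1)
```

Because the angle ignores magnitude, the same material in shadow or full sun still matches, which is why this family of metrics suits reflectance-library comparisons.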

Results

Using these methods in hyperspectral remote sensing produced different results. For instance, anomaly detection with the bad bands excluded was much more effective at detecting anomalies within the hyperspectral image than anomaly detection using all the bands. As for the target detection methods, both the simple and spectral library approaches yielded equally accurate results.

Sources

All the data used in this lab exercise was from ERDAS Imagine 2010.

Tuesday, December 2, 2014

Lab 11: Lidar Remote Sensing

Goals

Lidar is one of the most rapidly expanding areas of remote sensing, and it is driving a great deal of growth in the job market. The main goal of this lab exercise was to use Lidar data for various remote sensing tasks. The specific objectives include processing surface and terrain models, creating intensity images and other derivative products from point clouds, and using Lidar derivative products as ancillary data to improve the classification of optical remotely sensed imagery.

Methods

For this particular lab exercise I was placed in a real-world scenario in order to apply my conceptual knowledge of Lidar data to a portion of the City of Eau Claire. In this scenario I act as a GIS manager working on a project for the City of Eau Claire and have acquired a Lidar point cloud in LAS format for a portion of the city. I first need to perform a quality check on the data by viewing its coverage and extent while also studying the current classification of the Lidar data. My tasks are as follows: create an LAS dataset, explore the properties of LAS datasets, and visualize the LAS dataset as point clouds in both 2D and 3D formats. For the majority of this lab I used ArcMap rather than ERDAS Imagine.

To start, I created a new LAS dataset within my Lab 11 folder. After the dataset was created I opened its properties in order to add files to it. After adding all the files provided for this lab, I selected the "statistics" tab within the properties and clicked "calculate," which builds statistics for the LAS dataset. Once the statistics were computed, I could look at the statistics for each individual LAS file. These statistics can be used for QA/QC (quality assurance/quality control) of the individual LAS files as well as the dataset as a whole. An easy way to perform this check is to compare the Max Z and Min Z values against the known elevation range of the Eau Claire study area. The next step was to assign coordinate information to the LAS dataset. To do this I clicked on the "XY Coordinate System" tab. Since the data had no assigned coordinate system, I had to look at the metadata to determine the horizontal and vertical coordinate systems for the data (Fig. 2). Once I applied the coordinate system to the LAS dataset, I opened it in ArcMap. I then added a shapefile of Eau Claire County in order to make sure the data was spatially located correctly. Next I zoomed in to the tiles to visualize the point clouds colored by elevation (Fig. 3).
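The Min Z / Max Z check described above can be sketched as a simple range test: any tile whose elevation statistics fall outside the plausible range for the study area is flagged for inspection. The file names and elevation range below are illustrative assumptions, not the lab's actual values.

```python
import numpy as np

def flag_suspect_tiles(tiles, valid_min, valid_max):
    """tiles: dict of file name -> 1-D array of Z values.
    Returns the names whose Min Z or Max Z fall outside the valid range."""
    suspect = []
    for name, z in tiles.items():
        z = np.asarray(z, dtype=float)
        if z.min() < valid_min or z.max() > valid_max:
            suspect.append(name)
    return suspect

tiles = {
    "tile_a.las": [245.0, 260.5, 310.2],
    "tile_b.las": [-12.0, 250.0, 305.0],   # bogus low return
}
print(flag_suspect_tiles(tiles, valid_min=200.0, valid_max=400.0))  # ['tile_b.las']
```

Flagged tiles usually contain noise returns (birds, multipath, atmospheric hits) rather than genuine terrain, which is why the statistics pass matters before any products are derived.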



(Fig. 1) The red tiled area is the region of Eau Claire County where I will be working with Lidar data.


(Fig. 2) The metadata for the various LAS data files can be used to determine the horizontal and vertical coordinate systems used for these images.


(Fig. 3) This image shows a zoomed-in view of the red tiled area shown in Fig. 1 and displays the Lidar point cloud.

Digital surface models (DSMs) derived from Lidar data can be used as ancillary data to improve classification within an expert system classifier. I could then add contour lines to the data by selecting the symbology tab within the layer properties, and change the index factor in order to experiment with how the values affected the contours in the display.

I could then explore the point clouds by class, return, and profile. To do this I zoomed out to the full extent of the study area, set the points to "elevation," and set the filter to "first return." Using this method I could drag a line over a bridge feature on the map and see a 2D illustration of the feature's elevation profile.

The next objective for this lab was to generate Lidar derivative products. The first step in this process was to derive DSM and DTM products from the point clouds. In order to determine the spatial resolution at which the derivative products should be produced, I had to estimate the average NPS (nominal pulse spacing) at which the point clouds were initially collected. This information can be found in the LAS dataset properties menu under the "point spacing" section of the LAS file information.
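The link between NPS and raster resolution can be sketched as a rule of thumb: make the cell size a small multiple of the average point spacing so that most cells receive at least one return. The multiplier here is an illustrative assumption; the appropriate factor depends on the dataset.

```python
import math

def cell_size_from_nps(nps_meters, factor=2.0):
    """Scale the nominal pulse spacing and round up to a whole-meter cell size."""
    return math.ceil(nps_meters * factor)

print(cell_size_from_nps(0.9))  # point spacing ~0.9 m -> 2 m cells
```

A point spacing around 0.9 m is consistent with the 2-meter resolution used for the derivative products in the next step.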

I then set up a geoprocessing workspace in order to create raster derivative products at a spatial resolution of 2 meters. The next step was to open the toolbox and select "Conversion Tools > To Raster > LAS Dataset to Raster." After inputting the LAS dataset I set the value field to "elevation." I then used the binning interpolation method, set the cell assignment type to maximum, and set the void fill method to natural neighbor. Once the tool finished running I opened the DSM result in ArcMap. The DSM can then be used as ancillary data to classify buildings and forest, both of which are features above the ground surface. Using the 3D Analyst hillshade tool, a hillshade of the derived raster was added to my map.
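The binning interpolation with a maximum cell assignment can be sketched as follows: each point is dropped into the raster cell it falls in, and the cell keeps the highest Z it receives, so treetops and rooftops survive (which is what makes it a surface model rather than a terrain model). Void filling is omitted here, so empty cells stay NaN; the coordinates and elevations are illustrative.

```python
import numpy as np

def bin_max(x, y, z, cell, nrows, ncols):
    """Grid points into cells of size `cell`, keeping the maximum Z per cell."""
    grid = np.full((nrows, ncols), np.nan)
    rows = (np.asarray(y) // cell).astype(int)
    cols = (np.asarray(x) // cell).astype(int)
    for r, c, v in zip(rows, cols, z):
        if np.isnan(grid[r, c]) or v > grid[r, c]:
            grid[r, c] = v
    return grid

x = np.array([0.5, 1.5, 1.6, 3.1])
y = np.array([0.5, 0.4, 0.6, 1.2])
z = np.array([250.0, 262.0, 259.0, 255.0])
dsm = bin_max(x, y, z, cell=2.0, nrows=2, ncols=2)
```

Swapping the comparison from maximum to minimum, and filtering to ground-classified returns, gives the DTM variant used in the next step.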


(Fig. 4) The output image shown above is the DSM derivative product.

Next I derived a DTM (digital terrain model) from the Lidar point cloud. I used the LAS dataset toolbar, setting the filter to "ground" in order to make sure the point tool showed only ground points colored by elevation. I set the interpolation to binning, the cell assignment type to minimum, the void fill method to natural neighbor, and the sampling type to cell size. After the tool ran I opened the result in ArcMap to view the derivative product.

I then derived a Lidar intensity image from the point cloud, which requires a similar process to creating the DSMs and DTMs explained above. This time, however, the value field was set to intensity, the binning cell assignment type to average, and the void fill method to natural neighbor. Once the tool finished running I opened the output image in ERDAS Imagine.
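The "average" cell assignment differs from the max/min binning used for the DSM/DTM in that each cell takes the mean of the return values that land in it, which is appropriate for intensity since no single return is more meaningful than the others. A minimal sketch with illustrative values:

```python
import numpy as np

def bin_mean(x, y, values, cell, nrows, ncols):
    """Grid points into cells of size `cell`, averaging the values per cell."""
    total = np.zeros((nrows, ncols))
    count = np.zeros((nrows, ncols))
    for xi, yi, v in zip(x, y, values):
        r, c = int(yi // cell), int(xi // cell)
        total[r, c] += v
        count[r, c] += 1
    # Empty cells become NaN instead of dividing by zero.
    return np.where(count > 0, total / np.maximum(count, 1), np.nan)

intensity = bin_mean([0.2, 0.8, 2.5], [0.3, 0.7, 0.4], [10.0, 30.0, 50.0],
                     cell=2.0, nrows=1, ncols=2)
print(intensity)  # [[20. 50.]]
```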


(Fig. 5) Lidar intensity image produced from the original point cloud image.


Results

Throughout this lab exercise I learned how to utilize Lidar data in remote sensing. The output images produced in this lab can be seen in the Methods section above. I processed both surface and terrain models and used Lidar derivative products as ancillary data to improve the classification of optical remotely sensed imagery.

Sources

The Lidar point cloud and Tile Index data are from Eau Claire County (2013), and the Eau Claire County shapefile is from the Mastering ArcGIS 6th Edition data by Margaret Price (2014). All data was provided by Dr. Cyril Wilson of the University of Wisconsin-Eau Claire.