
Classification of Hyperspectral Remote Earth Sensing Data using Combined 3D--2D Convolutional Neural Networks

Authors: Nyan L.T., Gavrilov A.I., Do M.T. Published: 30.03.2022
Published in issue: #1(138)/2022  
DOI: 10.18698/0236-3933-2022-1-100-118

Category: Informatics, Computer Engineering and Control | Chapter: System Analysis, Control, and Information Processing  
Keywords: deep learning, convolutional neural networks, hyperspectral image classification

Abstract

Hyperspectral image classification is a core task in analyzing remote Earth sensing data, and convolutional neural networks are among the most widely used deep learning methods for processing visual data. The article proposes a hybrid 3D--2D spectral convolutional neural network for hyperspectral image classification. At the initial stage, a simple combined deep learning model was constructed by joining 3D and 2D convolutional neural networks, extracting deeper spatial-spectral features with fewer 3D--2D convolutions. The 3D network builds a joint spatial-spectral representation of objects from a stack of spectral bands, and the features of the 3D--2D convolutional neural network are used to classify hyperspectral images. Principal component analysis is applied to reduce the spectral dimension. The first feature-map layer serves as input to the subsequent layers that predict the final label of each hyperspectral pixel. Classification experiments were performed on the Indian Pines, University of Pavia, and Salinas Scene remote sensing datasets. The proposed method not only retains the advanced feature extraction of convolutional neural networks but also makes full use of spectral and spatial information. Its effectiveness was tested on these three reference datasets; the results show that a multifunctional learning system based on such networks significantly improves classification accuracy (more than 99 %).
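As an illustration of the preprocessing step mentioned in the abstract (this is not the authors' published code), the principal-component reduction of a hyperspectral cube can be sketched in NumPy as follows; the cube dimensions (16 x 16 pixels, 100 bands) and the number of retained components are hypothetical:

```python
import numpy as np

def pca_reduce(cube, k):
    """Reduce the spectral dimension of a (H, W, B) hyperspectral cube
    to k principal components, returning a (H, W, k) cube."""
    H, W, B = cube.shape
    # treat every pixel as a B-dimensional spectral sample
    X = cube.reshape(-1, B).astype(float)
    X -= X.mean(axis=0)                      # center each band
    # eigendecomposition of the band-by-band covariance matrix
    cov = X.T @ X / (X.shape[0] - 1)
    vals, vecs = np.linalg.eigh(cov)         # eigh returns ascending order
    comps = vecs[:, np.argsort(vals)[::-1][:k]]  # top-k eigenvectors
    return (X @ comps).reshape(H, W, k)

rng = np.random.default_rng(0)
cube = rng.normal(size=(16, 16, 100))        # synthetic stand-in cube
reduced = pca_reduce(cube, 10)
print(reduced.shape)                         # (16, 16, 10)
```

In a hybrid pipeline of the kind the article describes (cf. HybridSN [11]), spatial patches of this reduced cube would then pass through 3D convolutions over the joint spatial-spectral axes, after which the remaining spectral depth is folded into the channel dimension and processed by 2D convolutions.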

Please cite this article in English as:

Nyan L.T., Gavrilov A.I., Do M.T. Classification of hyperspectral remote Earth sensing data using combined 3D--2D convolutional neural networks. Herald of the Bauman Moscow State Technical University, Series Instrument Engineering, 2022, no. 1 (138), pp. 100--118 (in Russ.). DOI: https://doi.org/10.18698/0236-3933-2022-1-100-118

References

[1] Camps-Valls G., Tuia D., Bruzzone L., et al. Advances in hyperspectral image classification: Earth monitoring with statistical learning methods. IEEE Signal Process. Mag., 2014, vol. 31, iss. 1, pp. 45--54. DOI: https://doi.org/10.1109/MSP.2013.2279179

[2] Tun N.L., Gavrilov A., Tun N.M., et al. Hyperspectral remote sensing images classification using fully convolutional neural network. IEEE ElConRus, 2021. DOI: https://doi.org/10.1109/ElConRus51938.2021.9396673

[3] Liang H., Li Q. Hyperspectral imagery classification using sparse representations of convolutional neural network features. Remote Sens., 2016, vol. 8, no. 2, art. 99. DOI: https://doi.org/10.3390/rs8020099

[4] Tun N.L., Gavrilov A., Tun N.M., et al. Remote sensing data classification using a hybrid pre-trained VGG16 CNN-SVM classifier. IEEE ElConRus, 2021. DOI: https://doi.org/10.1109/ElConRus51938.2021.9396706

[5] Tun N.L., Gavrilov A., Tun N.M. Multi-classification of satellite imagery using fully convolutional neural network. IEEE ICIEAM, 2020. DOI: https://doi.org/10.1109/ICIEAM48468.2020.9111928

[6] Liu S., Luo H., Tu Y., et al. Wide contextual residual network with active learning for remote sensing image classification. IGARSS, 2018. DOI: https://doi.org/10.1109/IGARSS.2018.8517855

[7] Tun N.L., Gavrilov A., Tun N.M. Facial image denoising using convolutional autoencoder network. IEEE ICIEAM, 2020. DOI: https://doi.org/10.1109/ICIEAM48468.2020.9112080

[8] Zhang H., Meng L., Wei X., et al. 1D-convolutional capsule network for hyperspectral image classification. arXiv preprint. URL: https://arxiv.org/abs/1903.09834

[9] Gao Q., Lim S., Jia X. Hyperspectral image classification using convolutional neural networks and multiple feature learning. Remote Sens., 2018, vol. 10, no. 2, art. 299. DOI: https://doi.org/10.3390/rs10020299

[10] Ahmad M. A fast 3D CNN for hyperspectral image classification. arXiv preprint. URL: https://arxiv.org/abs/2004.14152

[11] Roy S.K., Krishna G., Dubey S.R., et al. HybridSN: exploring 3-D--2-D CNN feature hierarchy for hyperspectral image classification. IEEE Geosci. Remote Sens. Lett., 2020, vol. 17, iss. 2, pp. 277--281. DOI: https://doi.org/10.1109/LGRS.2019.2918719

[12] He M., Li B., Chen H. Multi-scale 3D deep convolutional neural network for hyperspectral image classification. IEEE ICIP, 2017, pp. 3904--3908. DOI: https://doi.org/10.1109/ICIP.2017.8297014

[13] Mou L., Ghamisi P., Zhu X.X. Unsupervised spectral-spatial feature learning via deep residual conv--deconv network for hyperspectral image classification. IEEE Trans. Geosci. Remote Sens., 2018, vol. 56, iss. 1, pp. 391--406. DOI: https://doi.org/10.1109/TGRS.2017.2748160

[14] Paoletti M.E., Haut J.M., Fernandez-Beltran R., et al. Capsule networks for hyperspectral image classification. IEEE Trans. Geosci. Remote Sens., 2019, vol. 57, iss. 4, pp. 2145--2160. DOI: https://doi.org/10.1109/TGRS.2018.2871782

[15] Ji S., Xu W., Yang M., et al. 3D convolutional neural networks for human action recognition. IEEE Trans. Pattern Anal. Mach. Intell., 2013, vol. 35, iss. 1, pp. 221--231. DOI: https://doi.org/10.1109/TPAMI.2012.59

[16] Lv W., Wang X. Overview of hyperspectral image classification. J. Sens., 2020, vol. 2020, art. ID 4817234. DOI: https://doi.org/10.1155/2020/4817234

[17] Song W., Li S., Fang L., et al. Hyperspectral image classification with deep feature fusion network. IEEE Trans. Geosci. Remote Sens., 2018, vol. 56, iss. 6, pp. 3173--3184. DOI: https://doi.org/10.1109/TGRS.2018.2794326

[18] Congalton R.G., Mead R.A. A quantitative method to test for consistency and correctness in photointerpretation. Photogramm. Eng. Remote Sensing, 1983, vol. 49, no. 1, pp. 69--74.