Computer Science and Information Systems 2023 Volume 20, Issue 4, Pages: 1289-1310
https://doi.org/10.2298/CSIS230315054Y


M2F2-RCNN: Multi-functional faster RCNN based on multi-scale feature fusion for region search in remote sensing images

Yin Shoulin (College of Information and Communication Engineering, Harbin Engineering University, Harbin, China), yslin@hit.edu.cn
Wang Liguo (College of Information and Communications Engineering, Dalian Minzu University, Dalian, China), wangliguo@hrbeu.edu.cn
Wang Qunming (College of Surveying and Geo-Informatics, Tongji University, Shanghai, China), wqm@163.com
Ivanović Mirjana (Faculty of Sciences, University of Novi Sad, Novi Sad, Serbia), mira@dmi.uns.ac.rs
Yang Jinghui (School of Information Engineering, China University of Geosciences, Beijing, China), yang06081102@163.com

To realize fast and accurate search of sensitive regions in remote sensing images, we propose a multi-functional faster RCNN based on a multi-scale feature fusion model for region search. The feature extraction network is based on ResNet50, and dilated residual blocks are utilized for multi-layer, multi-scale feature fusion. We add a path aggregation network with a convolutional block attention module (CBAM) attention mechanism to the backbone network to improve the efficiency of feature extraction. The extracted feature map is then processed, and RoIAlign replaces the pooling operation over regions of interest, which improves calculation speed. In the classification stage, an improved non-maximum suppression is used to improve the classification accuracy for sensitive regions. Finally, we conduct cross-validation experiments on the Google Earth dataset and the DOTA dataset. Comparison experiments with state-of-the-art methods also demonstrate the high efficiency of the proposed method in region search.
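For context on the detection pipeline's final stage, the sketch below shows standard greedy IoU-based non-maximum suppression in pure Python. This is the conventional baseline that the paper's improved NMS builds on; the abstract does not specify the improved variant's scoring rule, so the function names and threshold here are illustrative assumptions, not the authors' implementation.

```python
# Illustrative baseline: greedy IoU-based NMS (not the paper's improved
# variant, whose exact scoring rule is not given in the abstract).

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter > 0 else 0.0

def nms(boxes, scores, iou_thresh=0.5):
    """Greedily keep the highest-scoring box and drop boxes that
    overlap it by more than iou_thresh; repeat on the remainder."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        order = [i for i in order if iou(boxes[best], boxes[i]) < iou_thresh]
    return keep

# Example: two heavily overlapping boxes and one distant box.
boxes = [(0, 0, 10, 10), (1, 1, 10, 10), (20, 20, 30, 30)]
scores = [0.9, 0.8, 0.7]
print(nms(boxes, scores))  # -> [0, 2]: the lower-scoring overlap is suppressed
```

Improved NMS schemes (e.g. softened score decay instead of hard suppression) modify the drop step so that overlapping candidates are down-weighted rather than discarded outright.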

Keywords: remote sensing images, region search, multi-functional faster RCNN, multi-scale feature fusion, convolution block attention module
