Frontiers of Data and Computing ›› 2025, Vol. 7 ›› Issue (5): 138-152.
CSTR: 32002.14.jfdc.CN10-1649/TP.2025.05.011
doi: 10.11871/jfdc.issn.2096-742X.2025.05.011
• Technology and Application •
Multispectral Remote Sensing Image Pansharpening Method Based on Shallow-Deep Convolutional Recurrent Neural Network
WANG Peng1,2,*, YANG Xiaofeng1, HE Zhongchen1, DU Jun1,3
Received: 2025-04-27
Online: 2025-10-20
Published: 2025-10-23
Contact: WANG Peng, E-mail: Pengwang_B614080003@nuaa.edu.cn
WANG Peng, YANG Xiaofeng, HE Zhongchen, DU Jun. Multispectral Remote Sensing Image Pansharpening Method Based on Shallow-Deep Convolutional Recurrent Neural Network[J]. Frontiers of Data and Computing, 2025, 7(5): 138-152, https://cstr.cn/32002.14.jfdc.CN10-1649/TP.2025.05.011.
Table 2  Training settings of the pansharpening methods based on DL
| Method | PNN | PanNet | MSDCNN | TFNet | GPPNN | MCRNN |
|---|---|---|---|---|---|---|
| Iterations | 1.12×10⁶ | 2.5×10⁵ | 2.5×10⁵ | 1.1×10⁶ | 2.4×10⁵ | 1.6×10⁵ |
| Batch size | 128 | 16 | 64 | 32 | 16 | 64 |
| Kernel size | 9×9, 5×5 | 3×3 | 7×7, 3×3, 1×1 | 3×3, 2×2, 1×1 | 3×3 | 3×3, 1×1 |
| Learning rate | 0.00001 | 0.001 | 0.1 | 0.0001 | 0.00005 | 0.00001 |
| Optimizer | SGD | SGD | SGD | Adam | Adam | Adam |
| Loss function | MSE Loss | MSE Loss | MSE Loss | L1 Loss | L1 Loss | MSE Loss |
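For readers who want to reproduce the training setup, the sketch below shows how the MCRNN column of Table 2 maps onto a PyTorch training loop. It is a minimal illustration under stated assumptions: `PlaceholderNet`, the synthetic tensors, and the data layout are hypothetical stand-ins rather than the authors' released code; only the optimizer, learning rate, batch size, kernel sizes, iteration count, and loss come from the table.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical stand-in for the MCRNN architecture (not the authors' model);
# it only mirrors the 3x3 / 1x1 kernel sizes listed in Table 2.
class PlaceholderNet(nn.Module):
    def __init__(self, ms_bands: int = 4):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(ms_bands + 1, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, ms_bands, kernel_size=1),
        )

    def forward(self, pan: torch.Tensor, ms_up: torch.Tensor) -> torch.Tensor:
        # pan: (N, 1, H, W) panchromatic patch; ms_up: (N, C, H, W) upsampled MS patch
        return self.body(torch.cat([pan, ms_up], dim=1))

# Synthetic patches so the sketch is self-contained; real training would use
# QuickBird / WorldView patches prepared with the degraded-resolution protocol.
pan = torch.rand(256, 1, 64, 64)
ms_up = torch.rand(256, 4, 64, 64)
ref = torch.rand(256, 4, 64, 64)
loader = DataLoader(TensorDataset(pan, ms_up, ref), batch_size=64, shuffle=True)  # batch size 64

model = PlaceholderNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-5)  # Adam, learning rate 0.00001
criterion = nn.MSELoss()                                   # MSE loss
max_iterations = int(1.6e5)                                # 1.6x10^5 iterations (Table 2)

iteration = 0
while iteration < max_iterations:
    for pan_b, ms_b, ref_b in loader:
        optimizer.zero_grad()
        loss = criterion(model(pan_b, ms_b), ref_b)
        loss.backward()
        optimizer.step()
        iteration += 1
        if iteration >= max_iterations:
            break
```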
Table 3  Quantitative evaluation results on the QuickBird dataset
| Method | SAM | ERGAS | Q4 |
|---|---|---|---|
| Reference | 0.00 | 0.00 | 1.00 |
| EXP | 1.18±0.30 | 1.18±0.36 | 0.72±0.04 |
| Brovey | 0.93±0.19 | 1.17±0.31 | 0.75±0.07 |
| CNMF | 0.93±0.20 | 1.13±0.33 | 0.79±0.05 |
| GS | 0.79±0.13 | 0.97±0.18 | 0.80±0.07 |
| GSA | 0.69±0.14 | 0.88±0.18 | 0.82±0.08 |
| SFIM | 0.79±0.15 | 0.92±0.20 | 0.81±0.07 |
| PRACS | 1.08±0.26 | 1.25±0.36 | 0.76±0.05 |
| PNN | 0.73±0.12 | 0.80±0.15 | 0.84±0.07 |
| PanNet | 0.70±0.12 | 0.86±0.18 | 0.81±0.09 |
| MSDCNN | 0.68±0.14 | 0.76±0.19 | 0.84±0.07 |
| TFNet | 0.64±0.12 | 0.83±0.16 | 0.83±0.07 |
| GPPNN | 0.55±0.16 | 0.84±0.25 | 0.84±0.10 |
| MCRNN | 0.49±0.12 | 0.72±0.16 | 0.85±0.07 |
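For reference, the metrics reported in Tables 3-8 follow their standard definitions, summarized below; $\mathbf{x}_j$ and $\hat{\mathbf{x}}_j$ denote the reference and fused spectral vectors at pixel $j$, $P$ the number of pixels, $N$ the number of bands, and $h/l$ the ratio between the PAN and MS pixel sizes (1/4 for these sensors). Q4 and Q8 are the four- and eight-band extensions of the universal image quality index $Q$; lower SAM and ERGAS and a Q value closer to 1 indicate better fusion.

```latex
% Standard definitions of the reference-based metrics used in Tables 3-8.
\begin{align}
  \mathrm{SAM}
    &= \frac{1}{P}\sum_{j=1}^{P}
       \arccos\!\left(
         \frac{\langle \mathbf{x}_j,\hat{\mathbf{x}}_j\rangle}
              {\lVert\mathbf{x}_j\rVert_2\,\lVert\hat{\mathbf{x}}_j\rVert_2}
       \right), \\
  \mathrm{ERGAS}
    &= 100\,\frac{h}{l}
       \sqrt{\frac{1}{N}\sum_{i=1}^{N}
         \left(\frac{\mathrm{RMSE}(B_i)}{\mu(B_i)}\right)^{2}},
\end{align}
```

where $\mathrm{RMSE}(B_i)$ is the root-mean-square error of band $B_i$ and $\mu(B_i)$ the mean of the corresponding reference band.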
Table 4  Quantitative evaluation results on the WorldView-4 dataset
| Method | SAM | ERGAS | Q4 |
|---|---|---|---|
| Reference | 0.00 | 0.00 | 1.00 |
| EXP | 3.61±0.74 | 2.26±0.82 | 0.70±0.09 |
| Brovey | 3.39±0.55 | 2.30±0.72 | 0.75±0.14 |
| CNMF | 3.07±0.51 | 2.15±0.67 | 0.80±0.14 |
| GS | 3.56±0.88 | 2.88±1.12 | 0.75±0.16 |
| GSA | 3.11±0.63 | 2.26±0.64 | 0.80±0.13 |
| SFIM | 3.64±0.90 | 2.41±0.79 | 0.81±0.12 |
| PRACS | 2.83±0.70 | 2.80±1.04 | 0.78±0.10 |
| PNN | 2.33±0.62 | 2.45±0.81 | 0.74±0.15 |
| PanNet | 2.29±0.55 | 2.42±0.49 | 0.71±0.18 |
| MSDCNN | 2.06±0.58 | 2.03±0.64 | 0.77±0.13 |
| TFNet | 2.24±0.61 | 2.38±0.89 | 0.74±0.10 |
| GPPNN | 2.07±0.54 | 2.08±0.57 | 0.78±0.13 |
| MCRNN | 1.68±0.42 | 1.82±0.58 | 0.83±0.11 |
Table 5  Quantitative evaluation results on the WorldView-2 dataset
| Method | SAM | ERGAS | Q8 |
|---|---|---|---|
| Reference | 0.00 | 0.00 | 1.00 |
| EXP | 5.34±0.58 | 6.62±1.17 | 0.58±0.05 |
| Brovey | 5.31±0.63 | 6.56±1.05 | 0.72±0.07 |
| CNMF | 4.03±0.52 | 5.71±0.93 | 0.73±0.08 |
| GS | 5.38±0.64 | 7.01±1.02 | 0.71±0.06 |
| GSA | 4.61±0.55 | 6.90±1.37 | 0.78±0.07 |
| SFIM | 6.78±3.88 | 6.50±1.21 | 0.81±0.08 |
| PRACS | 5.21±0.61 | 7.52±1.34 | 0.78±0.06 |
| PNN | 3.52±0.37 | 5.83±0.86 | 0.73±0.07 |
| PanNet | 3.09±0.34 | 4.93±0.75 | 0.81±0.09 |
| MSDCNN | 2.96±0.31 | 4.76±0.59 | 0.81±0.09 |
| TFNet | 2.99±0.23 | 4.88±0.62 | 0.81±0.05 |
| GPPNN | 2.96±0.31 | 4.76±0.59 | 0.81±0.09 |
| MCRNN | 2.92±0.32 | 4.71±0.66 | 0.86±0.09 |
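As a companion to the definitions above, the following is a minimal NumPy sketch of how SAM (in degrees) and ERGAS can be computed from a reference image and a fused result; the function names, array layout, and toy data are illustrative assumptions rather than the evaluation code used for these tables.

```python
import numpy as np

def sam_degrees(reference: np.ndarray, fused: np.ndarray, eps: float = 1e-12) -> float:
    """Mean spectral angle in degrees; both arrays have shape (H, W, bands)."""
    dot = np.sum(reference * fused, axis=-1)
    norms = np.linalg.norm(reference, axis=-1) * np.linalg.norm(fused, axis=-1)
    angles = np.arccos(np.clip(dot / (norms + eps), -1.0, 1.0))
    return float(np.degrees(angles.mean()))

def ergas(reference: np.ndarray, fused: np.ndarray, ratio: float = 1 / 4) -> float:
    """ERGAS; `ratio` is the PAN-to-MS pixel-size ratio (1/4 for QuickBird/WorldView)."""
    bands = reference.shape[-1]
    acc = 0.0
    for b in range(bands):
        rmse = np.sqrt(np.mean((reference[..., b] - fused[..., b]) ** 2))
        acc += (rmse / reference[..., b].mean()) ** 2
    return float(100.0 * ratio * np.sqrt(acc / bands))

# Toy usage: a fused image that is a slightly perturbed copy of the reference
# should give SAM and ERGAS values close to 0.
ref = np.random.rand(64, 64, 4) + 0.5
fus = ref + 0.01 * np.random.randn(64, 64, 4)
print(sam_degrees(ref, fus), ergas(ref, fus))
```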
Table 6  Quantitative evaluation results on the QuickBird dataset
| Method | SAM | ERGAS | Q4 |
|---|---|---|---|
| Reference | 0.000 | 0.000 | 1.000 |
| EXP | 0.05±0.02 | 0.07±0.03 | 0.87±0.04 |
| Brovey | 0.05±0.01 | 0.12±0.04 | 0.83±0.04 |
| CNMF | 0.10±0.02 | 0.17±0.05 | 0.74±0.06 |
| GS | 0.04±0.01 | 0.06±0.02 | 0.89±0.02 |
| GSA | 0.04±0.01 | 0.06±0.02 | 0.89±0.02 |
| SFIM | 0.03±0.01 | 0.06±0.02 | 0.89±0.02 |
| PRACS | 0.08±0.02 | 0.13±0.04 | 0.79±0.04 |
| PNN | 0.04±0.01 | 0.06±0.02 | 0.89±0.03 |
| PanNet | 0.04±0.01 | 0.06±0.02 | 0.89±0.02 |
| MSDCNN | 0.04±0.00 | 0.06±0.02 | 0.895±0.02 |
| TFNet | 0.04±0.01 | 0.06±0.02 | 0.90±0.02 |
| GPPNN | 0.04±0.02 | 0.07±0.04 | 0.89±0.05 |
| MCRNN | 0.03±0.01 | 0.05±0.03 | 0.90±0.04 |
Table 7  Quantitative evaluation results on the WorldView-4 dataset
| Method | SAM | ERGAS | Q4 |
|---|---|---|---|
| Reference | 0.000 | 0.000 | 1.000 |
| EXP | 0.04±0.01 | 0.09±0.03 | 0.86±0.03 |
| Brovey | 0.07±0.05 | 0.10±0.04 | 0.82±0.07 |
| CNMF | 0.11±0.08 | 0.14±0.06 | 0.76±0.11 |
| GS | 0.05±0.02 | 0.09±0.04 | 0.88±0.05 |
| GSA | 0.05±0.02 | 0.09±0.03 | 0.86±0.05 |
| SFIM | 0.05±0.02 | 0.07±0.02 | 0.87±0.04 |
| PRACS | 0.09±0.08 | 0.10±0.03 | 0.81±0.09 |
| PNN | 0.08±0.05 | 0.16±0.06 | 0.76±0.09 |
| PanNet | 0.06±0.02 | 0.08±0.03 | 0.85±0.04 |
| MSDCNN | 0.08±0.04 | 0.16±0.06 | 0.76±0.08 |
| TFNet | 0.05±0.02 | 0.07±0.02 | 0.87±0.04 |
| GPPNN | 0.05±0.02 | 0.08±0.02 | 0.86±0.03 |
| MCRNN | 0.04±0.01 | 0.06±0.02 | 0.89±0.03 |
Table 8  Quantitative evaluation results on the WorldView-2 dataset
| Method | SAM | ERGAS | Q8 |
|---|---|---|---|
| Reference | 0.000 | 0.000 | 1.000 |
| EXP | 0.03±0.00 | 0.02±0.01 | 0.79±0.01 |
| Brovey | 0.05±0.03 | 0.10±0.04 | 0.84±0.06 |
| CNMF | 0.07±0.06 | 0.08±0.09 | 0.85±0.12 |
| GS | 0.06±0.03 | 0.01±0.05 | 0.84±0.07 |
| GSA | 0.04±0.03 | 0.05±0.07 | 0.90±0.09 |
| SFIM | 0.04±0.04 | 0.05±0.06 | 0.90±0.10 |
| PRACS | 0.05±0.06 | 0.05±0.08 | 0.08±0.12 |
| PNN | 0.06±0.04 | 0.06±0.02 | 0.88±0.05 |
| PanNet | 0.04±0.03 | 0.05±0.06 | 0.90±0.08 |
| MSDCNN | 0.05±0.05 | 0.07±0.07 | 0.88±0.11 |
| TFNet | 0.05±0.06 | 0.08±0.08 | 0.87±0.12 |
| GPPNN | 0.08±0.08 | 0.08±0.10 | 0.85±0.14 |
| MCRNN | 0.03±0.01 | 0.04±0.01 | 0.92±0.02 |
| [1] | SCHMITT M, ZHU X X. Data fusion and remote sensing: an ever-growing relationship[J]. IEEE Geoscience and Remote Sensing Magazine, 2016, 4(4): 6-23. |
| [2] | ZHANG Liangpei, SHEN Huanfeng. Progress and future of remote sensing data fusion[J]. Journal of Remote Sensing, 2016, 20(5): 1050-1061. |
| [3] | ESLAMI M, MOHAMMADZADEH A. Developing a spectral-based strategy for urban object detection from airborne hyperspectral TIR and visible data[J]. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2016, 9(5): 1808-1816. |
| [4] | JING Zhongliang, XIAO Gang, LI Zhenhua. Image fusion: theory and applications[M]. Beijing: Higher Education Press, 2007. |
| [5] | CARPER W, LILLESAND T, KIEFER R. The use of intensity-hue-saturation transformations for merging SPOT panchromatic and multispectral image data[J]. Photogrammetric Engineering and Remote Sensing, 1990, 56(4): 459-467. |
| [6] | CHAVEZ P J R, KWARTENG A. Extracting spectral contrast in Landsat Thematic Mapper image data using selective principal component analysis[J]. Photogrammetric Engineering and Remote Sensing, 1989, 55(3): 339-348. |
| [7] | SHETTIGARA V. A generalized component substitution technique for spatial enhancement of multispectral images using a higher resolution data set[J]. Photogrammetric Engineering and Remote Sensing, 1992, 58(5): 561-567. |
| [8] | GILLESPIE A R, KAHLE A B, WALKER R E. Color enhancement of highly correlated images. II. Channel ratio and “chromaticity” transformation techniques[J]. Remote Sensing of Environment, 1987, 22(3): 343-365. |
| [9] | VIVONE G, RESTAINO R, MURA M D, et al. Contrast and error-based fusion schemes for multispectral image pansharpening[J]. IEEE Geoscience and Remote Sensing Letters, 2014, 11(5): 930-934. |
| [10] | WALD L, RANCHIN T, MANGOLINI M. Fusion of satellite images of different spatial resolutions: assessing the quality of resulting images[J]. Photogrammetric Engineering and Remote Sensing, 1997, 63(6):691-699. |
| [11] | MALLAT S G. A theory for multiresolution signal decomposition: the wavelet representation[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1989, 11(7): 674-693. |
| [12] | DO M N, VETTERLI M. The contourlet transform: an efficient directional multiresolution image representation[J]. IEEE Transactions on Image Processing, 2005, 14(12): 2091-2106. |
| [13] | BALLESTER C, CASELLES V, IGUAL L, et al. A variational model for P+XS image fusion[J]. International Journal of Computer Vision, 2006, 69(1): 43-58. |
| [14] | MASI G, COZZOLINO D, VERDOLIVA L, et al. Pansharpening by convolutional neural networks[J]. Remote Sensing, 2016, 8(7): 594. |
| [15] | LIU Q, ZHOU H, XU Q, et al. PSGAN: a generative adversarial network for remote sensing image pansharpening[J]. IEEE Transactions on Geoscience and Remote Sensing, 2020, 59(12): 10227-10242. |
| [16] | HU J, HU P, KANG X, et al. Pan-sharpening via multiscale dynamic convolutional neural network[J]. IEEE Transactions on Geoscience and Remote Sensing, 2021, 59(3): 2231-2244. |
| [17] | ZHOU H, LIU Q, WANG Y. PGMAN: an unsupervised generative multiadversarial network for pansharpening[J]. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2021, 14: 6316-6327. |
| [18] | ZHAO Z, ZHANG R. A small sample bearing fault diagnosis method based on ConvGRU relation network[J]. Measurement Science and Technology, 2024, 35(6). |
| [19] | MENG X, XIONG Y M, SHAO F, et al. A Large-Scale Benchmark Data Set for Evaluating Pansharpening Performance: Overview and Implementation[J]. IEEE Geoscience and Remote Sensing Magazine, 2021, 9(1): 18-52. |