Special Issue: SPECIAL TOPIC — 80th Anniversary of Northwestern Polytechnical University (NPU)
An infrared and visible image fusion method based upon multi-scale and top-hat transforms
Gui-Qing He(何贵青)1, Qi-Qi Zhang(张琪琦)1, Jia-Qi Ji(纪佳琪)1, Dan-Dan Dong(董丹丹)1, Hai-Xi Zhang(张海曦)1, Jun Wang(王珺)2
1 School of Electronics and Information, Northwestern Polytechnical University, Xi'an 710072, China;
2 School of Information Technology, Northwest University, Xi'an 710072, China
Abstract The high-frequency components produced by a traditional multi-scale transform are approximately sparse and can therefore represent the detail information of an image. In the low-frequency component, however, few coefficients lie near zero, so the low-frequency image information cannot be represented sparsely. Because the low-frequency component carries the main energy of the image and depicts its overall profile, fusing it directly is not conducive to an accurate fusion result. This paper therefore presents an infrared and visible image fusion method that combines the multi-scale and top-hat transforms. On one hand, the new top-hat transform effectively extracts the salient features of the low-frequency component; on the other hand, the multi-scale transform extracts high-frequency detail information at multiple scales and in diverse directions. Combining the two is conducive to capturing more image characteristics and obtaining a more accurate fusion result. Specifically, for the low-frequency component, a new type of top-hat transform is used to extract the low-frequency features, and different fusion rules are then applied to the low-frequency features and the low-frequency background; for the high-frequency components, a product-of-characteristics rule is used to integrate the detail information. Experimental results show that the proposed algorithm obtains more detailed information and clearer infrared targets than traditional multi-scale transform methods. Compared with state-of-the-art fusion methods based on sparse representation, the proposed algorithm is simple and effective, and its time consumption is significantly reduced.
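To make the two-branch pipeline concrete, below is a minimal Python sketch of the scheme the abstract describes. It assumes a one-level 2-D wavelet transform (PyWavelets) in place of the paper's multi-scale transform and the classical grey-scale white top-hat (SciPy) in place of the authors' new top-hat transform; the choose-max, averaging, and absolute-maximum rules are illustrative placeholders for the paper's feature, background, and product-of-characteristics fusion rules, and all function names are ours, not the authors'.

```python
# Hedged sketch of the fusion scheme from the abstract: multi-scale
# decomposition, top-hat-based low-frequency fusion, and a coefficient-wise
# high-frequency rule. Standard substitutes are used where the paper's exact
# operators are not specified here.
import numpy as np
import pywt
from scipy import ndimage

def fuse_images(ir, vis, wavelet="db2", se_size=9):
    """Fuse a registered infrared/visible pair (2-D float arrays, same shape)."""
    se = np.ones((se_size, se_size))  # structuring element (assumed square)

    # Multi-scale decomposition: low-frequency approximation + high-frequency
    # detail sub-bands (horizontal, vertical, diagonal).
    ir_low, ir_high = pywt.dwt2(ir, wavelet)
    vis_low, vis_high = pywt.dwt2(vis, wavelet)

    fused_low = _fuse_low(ir_low, vis_low, se)
    # High-frequency rule: keep the coefficient with the larger magnitude
    # (a stand-in for the paper's product-of-characteristics measure).
    fused_high = tuple(
        np.where(np.abs(ih) >= np.abs(vh), ih, vh)
        for ih, vh in zip(ir_high, vis_high)
    )
    return pywt.idwt2((fused_low, fused_high), wavelet)

def _fuse_low(a, b, se):
    """Split each low-frequency band into salient features (top-hat) and
    background, then fuse the two parts with different rules."""
    # Classical white top-hat: bright structures smaller than the structuring
    # element; used here in place of the paper's new top-hat transform.
    feat_a = ndimage.white_tophat(a, footprint=se)
    feat_b = ndimage.white_tophat(b, footprint=se)
    back_a, back_b = a - feat_a, b - feat_b

    fused_feat = np.maximum(feat_a, feat_b)  # choose-max for salient features
    fused_back = 0.5 * (back_a + back_b)     # averaging for the background
    return fused_feat + fused_back
```

As a usage example, `fuse_images(ir.astype(float), vis.astype(float))` on a pre-registered image pair returns the fused image; handling the low-frequency features and background with separate rules is what lets the infrared targets survive averaging of the image profile.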
Received: 03 July 2018
Revised: 02 October 2018
PACS: 87.50.W-; 85.60.Gz (Photodetectors (including infrared and CCD detectors)); 82.50.Bc (Processes caused by infrared radiation); 61.80.Ba (Ultraviolet, visible, and infrared radiation effects (including laser radiation))
Fund: Project supported by the National Natural Science Foundation of China (Grant No. 61402368), the Aerospace Support Fund, China (Grant No. 2017-HT-XGD), and the Aerospace Science and Technology Innovation Foundation, China (Grant No. 2017 ZD 53047).
Corresponding Authors:
Hai-Xi Zhang, Jun Wang
E-mail: zh.haixi@gmail.com; jwang@nwu.edu.cn
Cite this article:
Gui-Qing He(何贵青), Qi-Qi Zhang(张琪琦), Jia-Qi Ji(纪佳琪), Dan-Dan Dong(董丹丹), Hai-Xi Zhang(张海曦), Jun Wang(王珺). An infrared and visible image fusion method based upon multi-scale and top-hat transforms. 2018 Chin. Phys. B 27 118706