|
|
Fringe removal algorithms for atomic absorption images: A survey
Gaoyi Lei (雷高益)1, Chencheng Tang (唐陈成)2,†, and Yueyang Zhai (翟跃阳)3,‡
1 School of Instrumentation and Optoelectronic Engineering, Beihang University, Beijing 100191, China; 2 Quantum Sensing Center, Zhejiang Laboratory, Hangzhou 310000, China; 3 Research Institute of Frontier Science, Beihang University, Beijing 100191, China
|
|
Abstract Fringe noise disrupts the precise measurement of the atom distribution in absorption imaging. Fringe removal algorithms have been proposed to suppress this noise by reconstructing an ideal reference image for each absorption image. However, previous work has focused on relating fringe removal performance to specific physical systems, leaving a gap in the analysis of how the different algorithms actually work. This survey reviews the fringe removal algorithms and classifies them into two categories: image-decomposition based methods and deep-learning based methods. It then details the workflows of two classical fringe removal algorithms and evaluates them on the absDL ultracold image dataset. The experiments show that the singular value decomposition (SVD) method achieves outstanding performance, while the U-net method successfully applies the idea of image inpainting. The main contribution of this survey is its interpretation of the fringe removal algorithms, which may help readers gain a better understanding of the current state of research. The code used in this survey is available at https://github.com/leigaoyi/Atomic_Fringe_Denoise.
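For readers unfamiliar with the image-decomposition category, the sketch below illustrates the general idea behind SVD-type fringe removal: an ideal reference image is synthesized for each absorption image by projecting it, outside a mask covering the atom cloud, onto a basis built from a set of atom-free reference images. This is a minimal NumPy sketch of the standard least-squares/SVD projection, not the implementation evaluated in the survey; the function and variable names (reconstruct_reference, refs, atom_mask) are illustrative only.

import numpy as np

def reconstruct_reference(refs, absorption, atom_mask):
    # refs       : (K, H, W) stack of atom-free reference images
    # absorption : (H, W) absorption image containing the atom cloud
    # atom_mask  : (H, W) boolean array, True where atoms may sit;
    #              these pixels are excluded from the fit
    K, H, W = refs.shape
    B = refs.reshape(K, -1).T          # (H*W, K) basis of reference images
    a = absorption.reshape(-1)         # flattened absorption image
    keep = ~atom_mask.reshape(-1)      # pixels used for the projection

    # Least-squares projection of the masked absorption image onto the
    # reference basis; np.linalg.lstsq solves this through the SVD.
    coeff, *_ = np.linalg.lstsq(B[keep], a[keep], rcond=None)

    # Linear combination of reference images that best reproduces the
    # fringes of this particular shot -> synthesized ideal reference.
    return (B @ coeff).reshape(H, W)

The optical density is then computed from the absorption image and the synthesized reference, e.g. OD = -ln(absorption / reference), so that fringes common to both frames cancel.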
|
Received: 01 September 2021
Revised: 18 October 2021
Accepted manuscript online:
|
PACS:
03.75.-b
03.75.Be (Atom and neutron optics)
67.85.-d (Ultracold gases, trapped gases)
|
Fund: This research was funded by the National Natural Science Foundation of China (Grant No. 62003020).
Corresponding Authors:
Chencheng Tang, E-mail: tangchencheng@zhejianglab.com; Yueyang Zhai, E-mail: yueyangzhai@buaa.edu.cn
|
Cite this article:
Gaoyi Lei (雷高益), Chencheng Tang (唐陈成), and Yueyang Zhai (翟跃阳) 2022 Fringe removal algorithms for atomic absorption images: A survey Chin. Phys. B 31 050313
|