Abstract The vector neural network (VNN) is one of the most important methods for processing interval data. However, the VNN involves a large number of multiply-accumulate (MAC) operations and is usually implemented by purely numerical computation, which makes it difficult to miniaturize for embedded applications. In this paper, we propose a memristor-based vector-type backpropagation (MVTBP) architecture that utilizes memristive arrays to accelerate the MAC operations on interval data. Owing to the unique brain-like synaptic characteristics of memristive devices, e.g., small size, low power consumption, and high integration density, the proposed architecture can be implemented at low area and power cost and easily applied to embedded systems. Simulation results indicate that the proposed architecture has better identification performance and noise tolerance. When the device precision is 6 bits and the error deviation level (EDL) is 20%, the proposed architecture achieves identification rates of about 92% for interval-valued testing samples and 81% for scalar-valued testing samples.
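The abstract does not detail the architecture, but the core operation it accelerates — a MAC over interval-valued inputs with weights stored at limited device precision — can be illustrated with a minimal Python sketch. This is an assumption-laden illustration, not the paper's implementation: the uniform 6-bit quantization scheme and the function names `quantize` and `interval_mac` are hypothetical.

```python
def quantize(w, bits=6, w_max=1.0):
    # Snap a weight to one of 2**bits - 1 uniformly spaced conductance
    # levels in [-w_max, w_max] (assumption: mimics limited device precision).
    step = 2.0 * w_max / (2 ** bits - 1)
    return round(w / step) * step

def interval_mac(weights, intervals, bits=6):
    # Interval-arithmetic multiply-accumulate: each input is a [lo, hi]
    # interval. A non-negative weight keeps the endpoint order; a negative
    # weight swaps the endpoints so the result is still a valid interval.
    lo_acc = hi_acc = 0.0
    for w, (x_lo, x_hi) in zip(weights, intervals):
        g = quantize(w, bits)
        if g >= 0:
            lo_acc += g * x_lo
            hi_acc += g * x_hi
        else:
            lo_acc += g * x_hi
            hi_acc += g * x_lo
    return lo_acc, hi_acc

# Example: two quantized weights applied to two interval inputs.
lo, hi = interval_mac([0.5, -0.25], [(1.0, 2.0), (0.0, 4.0)])
```

In a memristive crossbar, many such MACs would run in parallel (one per column, via Ohm's and Kirchhoff's laws); the quantization step models the finite number of distinguishable conductance states per device.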