Advent of Memristor-based Synapses in Neuromorphic Engineering

Vidya S1, Mohammed Riyaz Ahmed2

1 Department of ECE, REVA Institute of Technology and Management, Bengaluru
2 School of Electronics and Communication Engineering, REVA University, Yelahanka, Bengaluru, India 560064
[email protected], [email protected]

Abstract—In our pursuit of human intelligence using complex hardware systems, many interdisciplinary domains have emerged. Neuromorphic engineering is one such domain, which aims to exploit the physical characteristics of semiconductors to implement neural systems (and, to some extent, architectures). Synapses act as connection sites, and by virtue of these long-lasting connections, learning is exhibited in neural systems, where it is governed by the Hebbian rule. Memristors can store charge during the training process and respond in a desired manner, electronically mimicking synaptic behaviour. Plasticity is achieved by varying the memristance based on the temporal coincidence of pre- and post-synaptic spikes. We review recent progress on the development of the 2-terminal resistive device called the memristor to mimic the neuronal synapse, based on the spike-timing-dependent plasticity (STDP) protocol as an asymmetric form of Hebbian learning. The properties of the memristor, such as high density, non-volatility, and recording the history of its current profile with an energy efficiency comparable to biological systems, make it unique and pave the way for memristor-based neuromorphic computing architectures. The use of memristors to realise synaptic activity provides further surprising properties in analog CMOS neuromorphic design.

I. INTRODUCTION

Power dissipation has been increasing steadily with technology scaling; the power wall is clearly a roadblock for the semiconductor industry. With the current limits of scaling, aggressive demand for future technology with reduced power dissipation, and no new (energy-aware or energy-harvesting) device on the horizon, a paradigm shift is needed to enable scaling without compromising computing performance. Current technology is the product of a series of evolutions, from mechanical devices to CMOS via relays, vacuum tubes, transistors, and ICs; we are now in a state of transition from CMOS to a technology that is not only energy efficient and highly parallel but also capable of learning and adaptation. The challenge in moving off the CMOS route is that a very different hardware platform may redefine the entire abstraction hierarchy. An alternative approach that is not only power efficient but also contributes to computing performance is neuromorphic engineering. The shift at the architectural level from the von Neumann model to neural systems has significant advantages. An adult human brain consumes around 12 W, against 100 W/cm² for our modern-day computers, despite being flexible, adaptive and fault tolerant. Nature has many lessons to teach; neuromorphic

Fig. 1. Memristor as missing fourth element [1]

engineers take inspiration from the human brain and try to mimic cognitive behaviour in machines to exhibit intelligence. By virtue of plasticity, learning and memory are realised in the animal brain. Reconfigurability is a demonstration of adaptability: an animal that learns will adapt and thereby become intelligent. The greater the ability to reconfigure, adapt, learn and memorise, the greater the intelligence. Continuous updates of synaptic weights have an impact on neurons, synapses and the tissue structure as a whole. This intrinsic plasticity (a term coined by Cudmore and Desai) affects the electrical properties of neurons and over time also influences the structure, giving rise to structural plasticity. This may involve the creation of new connections, pruning of old ones, production of new neurons, and growth of axons and dendrites of existing neurons. Neuromorphic engineers intend to develop synaptic plasticity, as it is widely accepted to be the basis for learning and memory in the nervous system. Synaptic plasticity, especially biologically realistic spike-based plasticity learning, is the thrust area in mimicking human intelligence. Neurons communicate through junctions called synapses. Synaptic weights are used to store knowledge. According to Donald Hebb (1949), only a few neurons are excited for a given input; as we apply the same input, the same neural networks get excited, thereby forming 'engrams' (neurons become connected into engrams) [2]. When one neuron repeatedly excites (or assists in firing) another neuron, the axon of the first neuron develops synaptic nodes with the soma of the other. If

the connection is already established, then every excitation will enlarge the connection. This can be summarised as Hebbian learning. Temporal-difference learning methods need to predict future rewards. An alternative is differential Hebbian learning, where synaptic plasticity depends on the correlation between the input and the derivative of the output signals (spikes, to be precise). In 1986 Kosko and Klopf postulated such a learning rule [3][4]. Hebbian learning came into the limelight in 1997, when Markram and his team related it to STDP. Memristors are 2-terminal solid-state devices with an adjustable resistance (conductance) that depends not only on external inputs but also on past history. The memristor is a very promising candidate for breaking the bounds of CMOS and Moore's law [5] and for opening new horizons in computing and technology. Memristor arrays provide a high-density analog memory technology intended for CMOS-based neuromorphic architectures. The properties of memristors make it possible to build ANNs with the desired connectivity, network density and power consumption, and with adjustable weights that have memory. Memristors are ideal candidates for artificial synapses. The resistive switching behaviour of memristive devices under voltage pulses allows study of the essential requirements on individual memristive devices for the emulation of Hebbian plasticity in neuromorphic circuits. In the remainder of this paper, Section II reviews electrical and mathematical models of memristors. Section III describes the existing literature and research on the use of memristors as synapses in physical neural network systems. Section IV briefly discusses memristor characteristics, models and spiking neural networks, and Section V finishes with concluding statements.

II. MODELING THE MISSING ELEMENT

Memristors are resistors (2-terminal passive devices) which exhibit memory.
They exhibit non-linear resistance (the resistance is dynamic, changing based on the previous history of current flow) and non-reciprocal resistance (if the direction of current flow changes, the resistance alters), along with non-volatile memory (memory is retained without power). Its property of varying levels of resistance challenges current digital logic, which has only two levels, '0' and '1' (representing low and high). In 1971, Leon Chua postulated that electric charge and flux can be non-linearly related, giving rise to a hypothetical missing 4th fundamental electrical component after R, L and C [1]. For DC, memristance is constant and the device behaves as a regular resistor. The peculiar characteristic of the memristor of improved performance with increased scaling, unlike transistors (where every level of scaling poses huge limitations on performance), makes it a promising replacement for transistors. Quicker boot-ups, higher density (more than flash), high speed (approaching SDRAM) and lower voltage requirements (hence less dissipation) add to its advantages. There are two important ways to realise this hypothetical device: the molecular and ionic thin-film method, where thin-film atomic lattices exhibit hysteresis when charge

TABLE II
TYPES OF MEMRISTORS

Period | Pioneers                        | Type
1975   | M. Julliere                     | Magnetic tunnel junction
1990   | S. Thakoor et al.               | Solid-state thin-film memristor
2004   | Krieger and Spitzer             | Polymeric memristor
2006   | Bowen et al.                    | Extrinsic mechanism
2007   | Stanley Williams                | Titanium-dioxide memristor
2008   | Hynix Semiconductor and Grandis | Intrinsic mechanism (SSTRAM)
2011   | Chen and Wang                   | Spintronic memristor
2013   | Ageev, Blinov et al.            | CNT memristor
2014   | Bessonov et al.                 | Layered memristor
2017   | Campbell                        | Self-directed channel
TABLE III
MODELS OF MEMRISTORS

Period | Proposed by                                                        | Model
2008   | D. B. Strukov, G. S. Snider, D. R. Stewart and R. S. Williams (HP) | Linear ion drift model
2008   | Saigusa T., Tero A., Nakagaki T. and Kuramoto Y.                   | Amoeba learning model
2009   | Benderli S. and Wey T.                                             | Benderli SPICE model
2009   | Joglekar and Wolf                                                  | Joglekar window function
2009   | Biolek et al.                                                      | Biolek window function / SPICE model
2009   | Pickett M. D., Strukov D. B., Borghetti J. L., Yang J. J., Snider G. S., Stewart D. R. and Williams R. S. | Simmons tunnel barrier (Pickett) model
2010   | Lehtonen E. and Laiho M.                                           | Nonlinear ion drift model
2011   | Prodromakis et al.                                                 | Prodromakis window function
2013   | Kvatinsky et al.                                                   | TEAM model
-      | Strukov et al.                                                     | Strukov et al. model
is applied, and the magnetic and spin-based method, which is the opposite of the first; here resistance is a function of electron spin, which can be altered by magnetization. TiO2 memristors (explored mostly for design and modeling), ionic or polymeric memristors, resonant tunnelling diode memristors, and manganite memristors belong to the first category of ionic and molecular-based memristors. Memristors used along with crossbar latches [6] can potentially replace transistors in computing devices. The dynamically changing memristance in response to the applied stimulus is likely to imitate synaptic behaviour; hence memristors find application in neuromorphic systems as biological synapses.

III. PREVIOUS WORK

The human brain is fault tolerant and is vulnerable only to accidents and ageing. It is said that the von Neumann architecture mimics the left brain, and we are in pursuit of the right part of the brain, which is responsible for being fault tolerant, reconfigurable and event driven. Brain-inspired systems are the building blocks for low-power, high-density electronic synapses. Learning is exhibited by this part of the brain. To understand

TABLE I
TIMELINE OF MEMRISTORS

Period | Pioneers                                           | Developments
1960   | Bernard Widrow                                     | An adaptive neuron using chemical memristors, called ADALINE
1962   | Hickmott                                           | Low-frequency negative resistance in thin anodic oxide films
1967   | Argall                                             | Switching phenomenon in titanium oxide thin films
1971   | Leon Chua                                          | A 4th fundamental, new 2-terminal circuit element characterised by the relationship between q and ψ, called the memristor
1976   | Leon Chua and Sung Mo Kang                         | Generalisation of the memristor to memristive systems, along with the zero-crossing property in the Lissajous curve (pinched hysteresis) characterising I vs. V
2006   | Scientists at HP Labs                              | Announcement of the 1st nanoscale memristor based on TiO2, its features and applications
2008   | Scientists at HP Labs                              | 1st physical implementation by HP
2008   | Strukov, Snider, Stewart and Williams              | Nature paper identifying the link between the 2-terminal resistance-switching behaviour found in nanoscale systems and memristors
2009   | Scientists at HP Labs                              | Proof that memristors can be stacked, allowing 4-8X more memory on chip
2009   | Di Ventra, Pershin and Chua                        | The notion of capacitive and inductive memristive systems, whose properties depend on the state and history of the system
2010   | Scientists at HP Labs                              | Memristors can perform digital logic in addition to storing data
2010   | Scientists at HP Labs and Hynix Semiconductor Inc. | Joint development agreement to bring the memristor to market
2015   | Scientists at Knowm Inc.                           | Announcement of the self-directed channel (SDC) commercial memristor

the complex functionality of the brain, we need to build one. This goal has been driving many neuroscientists, engineers and psychologists, who have come together under the single umbrella of neuromorphic engineering. Neuromorphic engineering is an interdisciplinary domain which takes inspiration from neuroscience and tries to emulate its functionality, especially that of the brain. Miniaturised electronic devices built on the brain's architecture could answer many neural disorders. We can understand brain malfunction and implant artificial vision systems, auditory systems as in [5], etc., to restore vision, hearing and many other cognitive behaviours. Study of the brain will give us insights about its architecture and functioning, which will help us understand and imitate human behaviour in intelligent systems. Hui Wang and Hai Li demonstrated the high density, non-volatility, and unique characteristics of the memristor in a neuromorphic architecture [7]. Ting Chang, Yuchao Yang and Wei Lu discussed nanoscale memristive devices as the recent growth area in neuromorphic hardware [8]. Biological communication happens on a completely different set of principles compared to existing engineering approaches. The fundamental difference is the input, which is in an unspecified and non-linear format [9]. The processing is not based on the absolute values of inputs (as in our computers) but on their relative values. The complex functionality exhibited by the brain can be credited to the billions of neurons which are densely interconnected. The plasticity exhibited by these individual neurons is the reason the brain is capable of learning to do so many things. At the basic level a neuron can be abstracted as a signal (spike) detector. Millions of signals are received by a neuron through its dendrites (which act as input terminals), and the neuron tries to make a specific meaningful pattern from the received chunk of data. A neuron has a certain threshold and sends a trigger

to the next neuron only on finding something significant enough to surpass the threshold. This trigger can also be termed a spike or action potential. The axon is the transmitting channel which communicates the action potential. Spikes are transmitted through axons; an axon is in turn connected to other neurons' dendrites (or somata) via synapses [10]. Computation and adaptation in wide-ranging artificial neural networks are obtained by Hebbian learning and synaptic plasticity. Synapses, just like neurons, are very complex in their structure and behaviour [11]. Synapses work as connection sites, where the connection occurs between the pre-synaptic neuron's axon and the post-synaptic neuron's dendrite. The axon carries the action potential (the neuron's output voltage). The synaptic connection is responsible for the integration of information, which depends on the type of neurotransmitter and the time at which the neurotransmitter is received at the post-synaptic neuron. This integration process of accumulating electrical charge (analogous to a capacitor), also called conductance, has time scales that can tell us whether the neuron is more likely or less likely to fire a spike (excitatory or inhibitory) [35]. W. Alan Doolittle et al. proposed that, in a complementary oxide memristor, n-type material exhibits an inhibitory synaptic response and p-type an excitatory one [13]. Ahmad Muqeem Sherin et al. have proposed asymmetric properties of memristors in the literature on neural architectures using memristive devices. Le Zheng et al. analysed how the frequency and timing information of a memristor interconnection implements the synapse model [14]. Due to the adaptation exhibited by neurons in a changing environment, learning and memory emerge. The mechanism responsible for learning and memory is Hebbian synaptic plasticity. This spike-dependent plasticity has two classic approaches for the induction of Hebbian learning in the mammalian hippocampus and neocortex.
One of them is spike-rate-dependent plasticity

TABLE IV
NEUROMORPHIC CORRESPONDENCE FOR IMPLEMENTATION HIERARCHY

Level | Hierarchy       | Neuroscience            | Electronics
1     | Behaviour       | Mind                    | Architecture
2     | System          | Brain system            | Macro model
3     | Circuit         | Local neural population | Block/Cell
4     | Component       | Neuron                  | Perceptron
5     | Device          | Synapse                 | Memristors
6     | Membrane        | Channels/Ions           | Transistors
7     | Protein/Genetic | Genes                   | -

A comparison table showing the brain's counterparts in computers [20].

Fig. 2. Electrophysiology of Neuron and Neurophysiology of Neuron

(SRDP), and the other is spike-timing-dependent plasticity (STDP) [15]. SRDP protocols are more consistent with the BCM rule postulated by Bienenstock, Cooper and Munro. According to the BCM rule, the magnitude and direction of the change in synaptic plasticity are directly proportional to post-synaptic activity as determined by the pre-synaptic rate. A train of high-frequency pre-synaptic pulses brings about long-term potentiation (LTP), and a low-frequency pulse train results in long-term depression (LTD). The STDP protocol argues that there is information hidden in the timing of spikes. According to the STDP learning rule, the precise times of occurrence of pre- and post-synaptic spikes determine the direction and magnitude (strength) of the change in synaptic plasticity: if the pre-synaptic spike occurs before the post-synaptic spike (within a small time frame of about 40 ms), it leads to LTP; if the pre-synaptic spike occurs after the post-synaptic activity, it leads to LTD. The body adapts through exercise and the brain adapts by learning [16]. The most astonishing function of the brain is its ability to learn and adapt to the existing environment. Learning is the output of both passive experience and the active search for knowledge. Learning is an emergent phenomenon, and it includes unlearning; as stated in [17], "The principal activities of brains are making changes in themselves." Learning happens by virtue of the plasticity of the brain. Gopnik in her book [18] explains that a neural network is like a telephone network: a neuron transmits signals to target cells over long distances. For instance, the information sensed by the eye is routed to the primary visual cortex. Learning happens through synapses, which grow from 2,500 synapses/neuron at birth to 15,000 synapses/neuron in early childhood, summing to about 100 trillion synapses in a brain. Learning takes place at synapses, the junctions between neurons.
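The BCM-style rate dependence described above can be sketched in a few lines. The threshold, learning rate and firing-rate values below are illustrative assumptions for the sketch, not parameters from the paper:

```python
def bcm_update(w, pre_rate, post_rate, theta=10.0, lr=1e-4):
    """BCM-style rule: the sign and size of the weight change depend on
    post-synaptic activity relative to a threshold theta (Hz).
    post_rate > theta -> potentiation (LTP); post_rate < theta -> depression (LTD)."""
    dw = lr * pre_rate * post_rate * (post_rate - theta)
    return w + dw

# High-frequency pre-synaptic drive pushes the post-synaptic rate above
# threshold and strengthens the synapse; low-frequency drive weakens it.
w_ltp = bcm_update(0.5, pre_rate=50.0, post_rate=20.0)  # weight increases
w_ltd = bcm_update(0.5, pre_rate=5.0, post_rate=4.0)    # weight decreases
```

The product form makes the direction of change flip at the threshold, which is exactly the LTP/LTD split described for high- and low-frequency pulse trains.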
When new information is perceived, it is stored in short-term memory, which depends on chemical [19] (ion exchange) and electrical (spike) events in the brain [17]. As time proceeds, the information is moved to long-term memory, which is accomplished by structural changes such as the formation of new synapses. Maruan Al-Shedivat and his group worked on probabilistic computation in spiking neural networks, which has the advantage of enabling analysis and simulation of the neuron [21]. Spikes are membrane voltages; a large voltage of about a few hundred mV results in the opening and closing of particular membrane channels to facilitate the exchange of ionic and molecular substances. Each synapse is characterised by a 'synaptic weight' w; this strength of the synapse defines the efficacy of a pre-synaptic spike in triggering an action potential in the post-synaptic neuron. This phenomenon was postulated by Hebb in 1949; his rule was the first machine-learning algorithm to come close to mimicking synaptic plasticity. Here the change in synaptic weight ∆w was proportional to the mean firing rates of the pre- and post-synaptic neurons. STDP is an improvement (or evolution) of the Hebbian rule, in which more emphasis is given to the precise relative timing of individual pre- and post-synaptic spikes instead of the mere average rate over time. In STDP the increment/decrement of the synaptic weight ∆w is proportional to the time difference between the post-synaptic spike t_post and the pre-synaptic spike t_pre:

∆w = ξ(∆T), where ∆T = t_post − t_pre    (1)

ξ is a learning function that depends on ∆T but not on the synaptic weight w. This type of weight-independent learning rule in STDP is called additive. Some information may be lost, as learning cannot be completely independent of the weight. The multiplicative learning rule in STDP takes the weight into consideration and includes a weight-dependent function that multiplies the additive learning function ξ_a:

ξ_m(w, ∆T) = F(w, sign(∆T)) · ξ_a(∆T)    (2)
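The additive and multiplicative rules of Eqs. (1) and (2) can be sketched as follows. The exponential window shape with time constant τ and the soft-bound form of F are common modelling choices assumed here for illustration, not prescriptions from the paper:

```python
import math

def xi_additive(dT, A_plus=0.1, A_minus=0.12, tau=20.0):
    """Additive STDP learning function xi(dT) of Eq. (1), dT = t_post - t_pre in ms.
    dT > 0 (pre fires before post) gives LTP; dT < 0 gives LTD."""
    if dT > 0:
        return A_plus * math.exp(-dT / tau)
    if dT < 0:
        return -A_minus * math.exp(dT / tau)
    return 0.0

def xi_multiplicative(w, dT, w_max=1.0):
    """Multiplicative STDP of Eq. (2): a weight-dependent factor F(w, sign(dT))
    scales the additive function, softly bounding w to [0, w_max]."""
    F = (w_max - w) if dT > 0 else w  # F(w, sign(dT))
    return F * xi_additive(dT)

dw_ltp = xi_multiplicative(w=0.5, dT=10.0)   # pre 10 ms before post: positive change
dw_ltd = xi_multiplicative(w=0.5, dT=-10.0)  # pre 10 ms after post: negative change
```

The soft bound means potentiation weakens as w approaches w_max and depression weakens as w approaches 0, which is why the multiplicative rule retains the weight information the additive rule discards.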

STDP is a set of learning rules (and mechanisms) originally formulated for artificial intelligence, with proof obtained through computational neuroscience. William Chan et al. showed that the plasticity of a memristor can be adjusted via the current spikes of pre-synaptic and post-synaptic neurons [22]. It is a computational-biology approach in which much of the emphasis is given to the relative timing of spikes. Gerstner reported the first STDP algorithm in 1993. Hebbian learning has been overtaken by STDP, as STDP assists in identifying hidden spike patterns and performs computational learning over them. Martin Ziegler et al. have established that locality, cooperativity and

associativity are the primary requirements for Hebbian learning at the cellular level [23]. Neuromorphic engineering aims to reproduce the spike-based computation of the brain (taking neurophysiology and neuroanatomy into account) using custom (analog-digital) Very Large Scale Integration (VLSI) circuits. Implementing neural networks on VLSI was the inception of the debate between analog and digital circuits [24]. Real-world signals being analog, the ears and eyes process the incoming signal directly, without the need for digital conversion. Thousands of analog cells can work in parallel, leading to a high number of operations per second. Analog computation treats transistors as complicated, non-linear devices with many physical characteristics, which comply with neural dynamics, whereas digital computing considers the transistor only as a switch. Wei Lu and his team investigated the use of memristors for logic and memory and concluded that all current logic and memory applications realised in CMOS can potentially be replaced by 2-terminal memristive devices [25]. Melika Payvand found that the non-volatile memory of the memristor can be directly integrated with CMOS technology in a hybrid fashion [26]. Hai Li et al. reported that both conventional CMOS technology and emerging devices have been used in hardware implementations of spiking neuromorphic computing [27]; spike counts are used to represent the computation results based on rate coding. Analog systems are more efficient than digital ones if precision is traded off. Idongesit Ebong et al. carried out studies on memristive synapses with the purpose of building an analog neuromorphic chip; they found a low-level computing element in which nano-devices can be employed with on-site tuning of variables [28]. A keen observation, however, suggests that neurons can be perceived as A/D converters [29]. The incremental conductance changes in analog memristors are controlled by the charge flowing through the device.
A neuron collects all incoming spikes and, based on its threshold, decides whether a spike has to be transmitted (logic 1) or not (logic 0). Analog signals require more circuitry and hence more power, whereas our goal is low power consumption. Rajkumar Kubendran and his group proposed an electromagnetic theory of the memristor and obtained its equation in the Laplace domain; ultra-low power consumption in the sub-threshold region is obtained with the help of voltage scaling [30]. Harika Manem suggested that TTGA (Trainable Threshold Gate Array) systems with reconfigurable memristive synapses are obtained via a variation-tolerant training methodology. Robinson E. Pino determined the fastest way of analytically modeling the 3D memristor [7]. Harika Manem also reports that a memristive model in 3D integration uses less power and permits higher speed than a purely CMOS 2D implementation [31]. Digital systems are precise, flexible and, most importantly, immune to noise; they are robust systems that do not change with temperature, power-supply fluctuations or variations in transistor behaviour. The question is whether we should go with analog systems or digital ones. One may take inspiration from biology, where initial low-power

analog processing in the sensory organs is followed by digital transmission of information towards the brain. Signal processing on VLSI systems is an emerging field, where much attention is given to mixed-mode signaling [29][32]. Beiye Liu et al. proposed a noise-eliminating training method and a digitally assisted initialization step to improve the robustness of the training process and the performance of a memristor crossbar-based neuromorphic computing engine [33]; emerging neuromorphic computing systems meet the challenge by providing functional reconfigurability as well as low power consumption. Chris Yakopcic showed how an increase in the switching noise of the crossbar in a memristor model impacts learning in neuromorphic circuits [34]. Bonan Yan et al. evaluated the impact of nonlinear resistive selectors on the computational robustness of a Hopfield spike-based pattern-recognition system built on memristor crossbar technology, presenting a neuromorphic design implementation leveraging the one-selector-one-memristor (1S1M) crossbar [6].

IV. DISCUSSION

The domain of neuromorphic engineering has gained huge momentum due to the emergence of 2-terminal resistive (switching) devices: memristors. The quasi-continuous tuning of device resistance with memory effects makes them usable in neuromorphic circuits. To emulate the brain, which has around 10^10 synapses per cm², memristors have to be densely connected. This can be achieved by arranging memristors as crossbars (high-density grids). With this layout one can achieve a density comparable to the brain's. The conductance modulation in memristors is analogous to the biological weight change in synapses. By mimicking synaptic behaviour, the memristor has naturally gained a lot of attention. Memristors perform well as density increases, and are suitable for building electronic synapses, where learning and memory effects are believed to take place.
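The crossbar layout mentioned above is attractive precisely because it computes synaptic weighted sums in place: each device conductance acts as a weight, and by Ohm's and Kirchhoff's laws the current collected on a column is the dot product of the row voltages with that column's conductances. A minimal sketch of an ideal (selector-free, wire-resistance-free) crossbar, with illustrative values:

```python
def crossbar_currents(voltages, conductances):
    """Ideal memristor crossbar as an analog vector-matrix multiplier.
    voltages: input voltage per row line (V); conductances: G[i][j] in siemens.
    Column current I_j = sum_i V_i * G[i][j] (Ohm's law per device,
    Kirchhoff's current law at each column wire)."""
    n_cols = len(conductances[0])
    return [sum(v * row[j] for v, row in zip(voltages, conductances))
            for j in range(n_cols)]

V = [0.2, 0.0, 0.1]            # input spikes encoded as read voltages
G = [[1e-3, 2e-3],             # one memristor at every row-column crossing
     [5e-4, 1e-3],
     [2e-3, 5e-4]]
I = crossbar_currents(V, G)    # two weighted sums, computed "for free" by physics
```

Real arrays deviate from this ideal through sneak paths and selector nonlinearity, which is exactly the robustness issue studied in the 1S1M work cited above [6].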
Memristors (at the nanoscale) are capable of providing on-chip storage of up to 10 Gb/cm² [35], which is possible due to their excellent scalability, increased performance with increased density, and non-volatile nature. Though threshold-effect-based memristors are not true memristors, they are very much suitable for neuromorphic applications. However, the technology is still in its developmental stages and needs considerable improvement before a fully neuromorphic system is realised. Besides the possibility of emulating synaptic plasticity, there is still a gap between the practical performance of memristive devices (as compared to theoretical values) and the requirements for effective implementation in neuromorphic circuits. Implementation of the STDP rule using memristors needs careful design; the challenge is to translate the relative spike timing into a control signal that regulates the memristor conductance. Similarly, to implement rate-dependent plasticity the challenge is to identify a decaying element [36].
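One way to see the design problem of regulating conductance with spike-derived pulses is through the linear ion drift model of Table III: a positive voltage pulse moves the doped/undoped boundary, lowering the resistance, i.e. potentiating the synapse. The parameter values below are illustrative assumptions in the range commonly quoted for TiO2 devices, not measurements from this paper:

```python
def step_memristor(x, v, dt, R_on=100.0, R_off=16e3, mu_v=1e-14, D=1e-8):
    """One Euler step of the linear ion drift model.
    x: normalized doped-region width in [0, 1]; v: applied voltage (V); dt: step (s).
    M(x) = R_on*x + R_off*(1 - x);  dx/dt = (mu_v * R_on / D**2) * i(t)."""
    M = R_on * x + R_off * (1.0 - x)       # memristance between R_on and R_off
    i = v / M                              # Ohm's law
    x = x + (mu_v * R_on / D ** 2) * i * dt
    x = min(max(x, 0.0), 1.0)              # state bounded by device thickness
    return x, M

# A train of positive pulses drives x up, so memristance falls (potentiation);
# reversing the pulse polarity would drive it back (depression).
x = 0.1
for _ in range(1000):
    x, M = step_memristor(x, v=1.0, dt=1e-6)
```

An STDP circuit must shape the pre/post spike overlap into exactly such signed pulse trains, which is the control-signal translation problem noted above.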

V. CONCLUSION

The memristor has the potential to break the barriers of CMOS and Moore's law and open new avenues of computing to realise intelligence in machines. There is no second thought about its use as an electronic synapse in neuromorphic hardware. Here we have provided a brief historical perspective on memristors. The various types and models of memristors are tabulated. The major contribution of this work is a review of the use of the memristor as a synapse in neuromorphic devices. In the discussion we have highlighted the potential and the open research issues and challenges in using a memristor to mimic the biological synapse.

ACKNOWLEDGMENT

We would like to thank Syed Aslam Ali, Bhoomika C. M. and Abhinandan A. J. of REVA University for their enduring support during this work. We would also like to thank Rukmini Education Trust for providing the necessary infrastructure to carry out the research work.

REFERENCES
[1] Chua, Leon. "Memristor-the missing circuit element." IEEE Transactions on Circuit Theory 18.5 (1971): 507-519.
[2] Hebb, D. O. The Organization of Behaviour. 1961.
[3] Klopf, A. Harry. "A drive-reinforcement model of single neuron function: An alternative to the Hebbian neuronal model." AIP Conference Proceedings. Ed. John S. Denker. Vol. 151. No. 1. AIP, 1986.
[4] Kosko, Bart. "Differential Hebbian learning." AIP Conference Proceedings. Ed. John S. Denker. Vol. 151. No. 1. AIP, 1986.
[5] Mead, Carver, and Lynn Conway. Introduction to VLSI Systems. Vol. 1080. Reading, MA: Addison-Wesley, 1980.
[6] Yan, Bonan, et al. "A neuromorphic ASIC design using one-selector-one-memristor crossbar." Circuits and Systems (ISCAS), 2016 IEEE International Symposium on. IEEE, 2016.
[7] Wang, Hui, Hai Li, and Robinson E. Pino. "Memristor-based synapse design and training scheme for neuromorphic computing architecture." Neural Networks (IJCNN), The 2012 International Joint Conference on. IEEE, 2012.
[8] Yang, Yuchao, et al.
"Observation of conducting filament growth in nanoscale resistive memories." Nature Communications 3 (2012): 732.
[9] Teuscher, Christof, et al. "Bio-inspired computing tissues: towards machines that evolve, grow, and learn." Biosystems 68.2 (2003): 235-244.
[10] Wu, Shu Hui, Chun Lei Ma, and Jack B. Kelly. "Contribution of AMPA, NMDA, and GABAA receptors to temporal pattern of postsynaptic responses in the inferior colliculus of the rat." Journal of Neuroscience 24.19 (2004): 4625-4634.
[11] Cantley, Kurtis D., et al. "Neural learning circuits utilizing nanocrystalline silicon transistors and memristors." IEEE Transactions on Neural Networks and Learning Systems 23.4 (2012): 565-573.
[12] Qian, Ning, and Terrence J. Sejnowski. "When is an inhibitory synapse effective?" Proceedings of the National Academy of Sciences 87.20 (1990): 8145-8149.
[13] Doolittle, W. Alan, W. Laws Calley, and Walter Henderson. "Complementary oxide memristor technology facilitating both inhibitory and excitatory synapses for potential neuromorphic computing applications." Semiconductor Device Research Symposium, 2009. ISDRS'09. International. IEEE, 2009.
[14] Zheng, Le, Sangho Shin, and Sung-Mo Steve Kang. "Memristor-based synapses and neurons for neuromorphic computing." Circuits and Systems (ISCAS), 2015 IEEE International Symposium on. IEEE, 2015.
[15] Tateno, T., and H. P. C. Robinson. "Rate coding and spike-time variability in cortical neurons with two types of threshold dynamics." Journal of Neurophysiology 95.4 (2006): 2650-2663.
[16] Blakemore, Sarah-Jayne, and Uta Frith. The Learning Brain: Lessons for Education. Blackwell Publishing, 2005.
[17] Hopfield, John J. "Brain, neural networks, and computation." Reviews of Modern Physics 71.2 (1999): S431.

[18] Gopnik, Alison, Andrew N. Meltzoff, and Patricia K. Kuhl. The Scientist in the Crib: Minds, Brains, and How Children Learn. William Morrow & Co, 1999.
[19] Zucker, Robert S., and Wade G. Regehr. "Short-term synaptic plasticity." Annual Review of Physiology 64.1 (2002): 355-405.
[20] Ahmed, Mohammed Riyaz, and B. K. Sujatha. "A review of reinforcement learning in neuromorphic VLSI chips using computational cognitive neuroscience." International Journal of Advanced Research in Computer and Communication Engineering 2.8 (2013): 3315-3320.
[21] Al-Shedivat, Maruan, et al. "Memristors empower spiking neurons with stochasticity." IEEE Journal on Emerging and Selected Topics in Circuits and Systems 5.2 (2015): 242-253.
[22] Chan, William, and Jason Lohn. "Spike timing dependent plasticity with memristive synapse in neuromorphic systems." Neural Networks (IJCNN), The 2012 International Joint Conference on. IEEE, 2012.
[23] Ziegler, Martin, et al. "Memristive Hebbian plasticity model: device requirements for the emulation of Hebbian plasticity based on memristive devices." IEEE Transactions on Biomedical Circuits and Systems 9.2 (2015): 197-206.
[24] Mead, Carver A., and Misha A. Mahowald. "A silicon model of early visual processing." Neural Networks 1.1 (1988): 91-97.
[25] Lu, Wei, et al. "Two-terminal resistive switches (memristors) for memory and logic applications." Proceedings of the 16th Asia and South Pacific Design Automation Conference. IEEE Press, 2011.
[26] Payvand, Melika, et al. "A configurable CMOS memory platform for 3D-integrated memristors." Circuits and Systems (ISCAS), 2015 IEEE International Symposium on. IEEE, 2015.
[27] Li, Hai, et al. "The applications of memristor devices in next-generation cortical processor designs." Circuits and Systems (ISCAS), 2015 IEEE International Symposium on. IEEE, 2015.
[28] Ebong, Idongesit, et al. "Multi-purpose neuro-architecture with memristors." Nanotechnology (IEEE-NANO), 2011 11th IEEE Conference on.
IEEE, 2011.
[29] Salam, Fathi, Yiwen Wang, and Hwa-Joon Oh. "A 50-neuron CMOS analog chip with on-chip digital learning: design, development, and experiments." Computers & Electrical Engineering 25.5 (1999): 357-378.
[30] Kubendran, Rajkumar. "Electromagnetic and Laplace domain analysis of memristance and associative learning using memristive synapses modeled in SPICE." Devices, Circuits and Systems (ICDCS), 2012 International Conference on. IEEE, 2012.
[31] Manem, Harika, et al. "An extendable multi-purpose 3D neuromorphic fabric using nanoscale memristors." Computational Intelligence for Security and Defense Applications (CISDA), 2015 IEEE Symposium on. IEEE, 2015.
[32] Mostafa, Hesham, et al. "A hybrid analog/digital spike-timing dependent plasticity learning circuit for neuromorphic VLSI multi-neuron architectures." Circuits and Systems (ISCAS), 2014 IEEE International Symposium on. IEEE, 2014.
[33] Liu, Beiye, et al. "Security of neuromorphic systems: Challenges and solutions." Circuits and Systems (ISCAS), 2016 IEEE International Symposium on. IEEE, 2016.
[34] Yakopcic, Chris, and Tarek M. Taha. "Ex-situ programming in a neuromorphic memristor based crossbar circuit." Aerospace and Electronics Conference (NAECON), 2015 National. IEEE, 2015.
[35] Wang, Qian, Yongtae Kim, and Peng Li. "Architectural design exploration for neuromorphic processors with memristive synapses." Nanotechnology (IEEE-NANO), 2014 IEEE 14th International Conference on. IEEE, 2014.
[36] Chang, Ting, Yuchao Yang, and Wei Lu. "Building neuromorphic circuits with memristive devices." IEEE Circuits and Systems Magazine 13.2 (2013): 56-73.
