2011 Fourth International Conference on Emerging Trends in Engineering & Technology

Review of Spiking Neural Network Architecture for Feature Extraction and Dimensionality Reduction

Mrs. Soni Chaturvedi
Deptt. of E&C Engg., PIET Nagpur
email id: [email protected]

Dr. (Mrs.) A.A. Khurshid
Deptt. of EN/E&C Engg., JDCOE Nagpur
email id: [email protected]

Abstract: Spiking neurons are explored as key components of future computing machines for feature extraction and dimensionality reduction applications. The contribution of this paper is a review of the approaches to spiking neural network architecture used for feature extraction and dimensionality reduction. Giving importance to more realistic neuron models, the main objective is to present a general and comprehensive overview of spiking neurons, ranging from biological neuron features to examples of practical applications in the mentioned field.

Keywords: SNN, activity function, membrane potential, integrate and fire, Spike Response Model, temporal coding, spike sorting

I. Introduction

The human brain is made of hundreds of billions of cells. The basic nervous cell, the neuron, has many different forms and is spread over several anatomically different structures, such as the brain stem, cerebellum and cortex. The generic spiking neuron is the elementary processing unit of the brain; neurons are connected to each other in an intricate pattern and occur in many shapes and sizes. The generic neuron [1] has four functionally distinct parts, namely the dendritic tree, soma, axon and synapse. Signals from other neurons are collected by the dendrites (input device) and transmitted to the soma (central processing unit); if the input is sufficient, that is, it exceeds a threshold, an output signal (action potential or spike) is emitted and propagated along the axon (output device) and its branches to other neurons. The transition zone between the soma and the axon, the axon hillock, is where the essential non-linear processing step occurs. The sending neuron is referred to as the presynaptic neuron and the receiving neuron as the postsynaptic neuron.

Two different integrate-and-fire neuron models, for intrinsically bursting and for fast-spiking behaviour, have been described for the simulation of a specific function in a specific region of the brain [3]. The model may be applied to a large number of neurons collectively.

Although much progress has been achieved in the last two decades, a few fundamentals, such as how real neurons transmit information or how spike timing can be used efficiently to process information, still require further study. Realistic and complex neural models have long been used to simulate real networks, but always with the emphasis on spike rate. Thus, the shift from artificial neural networks to practical spiking neural networks as an emerging field has led many researchers to consider pulse-coupled neural networks with spike timing as an essential component of information processing by neural networks.

The computational power of neural networks based on temporal coding [4] by spikes has been studied, and it has been shown that simple operations on phase differences between spike trains provide a powerful computational tool. The practical application of such neuron models to simulating the dynamic behaviour of real neural systems needs further study; information processing is fast, as information flows from one neuron to another through synapses. By analyzing the dynamics of a homogeneous population of spiking neurons, rapid signal transmission [5] can be achieved. The cut-off frequency of this signal transmission depends on the noise level and on synaptic time constants rather than on the membrane time constant.

II. System Overview

The components of a generic spiking neuron, with its presynaptic and postsynaptic action potentials, lead to mathematical models that can be used for simulations ranging from the simple to the complex. The neurodynamics allow relative information processing for fast computation. The proposed work focuses on designing an efficient algorithm that requires a minimal number of operations; it may be extended to models that include gap junctions or dendrite-dendrite interactions. Also, all synaptic variables sharing the same linear dynamics can be reduced to a single variable, thereby reducing the cost of an update, and the effort can then be directed towards designing suitable algorithms for more complex or more realistic models that are suitable for feature extraction, dimensionality reduction and clustering applications.

However, this work will focus on how information can be coded by precisely timed spikes emitted by different neurons, and on how this coded information can then be processed to produce useful results for feature extraction and dimensionality reduction applications. Different approaches and algorithms will also be studied and compared in terms of computational efficiency, as the range of computational problems related to spiking neurons is very large. Therefore, the efforts will also be directed towards the reduction of computational cost.

A. Spiking Neural Network (SNN) Basics

When a positive input current is applied to the membrane of a neuron, it generates a spiking signal of the membrane potential. The spike frequency depends upon the amplitude of the injected current. Therefore, an analog signal can be encoded into modulated spike trains, which is known as rate coding [1]. An activity function can be defined with a_on and a_off during the positive and negative half of the coding, respectively. This suggests a method for encoding an analog signal using biophysically realistic neural networks. Such a network differs from a standard artificial neural network in that a biological cell generates spikes, and information can be encoded as the activity of this spike generator and transmitted through a synapse between two cells. Thus, a biological neural network can be used as a dynamic ensemble of cells that interact, perhaps to approximate a function, to perform a recursive computation such as solving a differential equation, or to retain a variable in memory. The interaction between the cells is controlled by choosing a set of synaptic weights that have to be optimized so that a portion of the network encodes a suitable function. An integrated model that reproduces the spiking and bursting behaviour of known types of cortical neurons has been suggested, which combines the biological plausibility of Hodgkin-Huxley-type dynamics with the computational efficiency of integrate-and-fire neurons [2]; this allows fast simulation, of the order of tens of thousands of spiking cortical neurons in real time.
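As a minimal illustration of rate coding with an integrate-and-fire unit, the sketch below simulates a leaky integrate-and-fire (LIF) neuron driven by a constant current and counts the resulting spikes; the firing rate grows with the amplitude of the injected current. The membrane parameters and current values are illustrative assumptions, not values taken from the cited models.

```python
# Illustrative sketch (not from the cited papers): rate coding with a leaky
# integrate-and-fire (LIF) neuron. A constant input current is integrated by
# the membrane; each time the potential crosses the threshold a spike is
# emitted and the potential is reset, so stronger currents yield higher rates.
import numpy as np

def lif_spike_count(i_inj, t_sim=1.0, dt=1e-4,
                    tau_m=20e-3, r_m=10e6, v_rest=-70e-3,
                    v_thresh=-54e-3, v_reset=-70e-3):
    """Simulate a LIF neuron driven by a constant current and count spikes."""
    v = v_rest
    spikes = 0
    for _ in range(int(t_sim / dt)):
        dv = (-(v - v_rest) + r_m * i_inj) / tau_m   # leaky integration
        v += dv * dt
        if v >= v_thresh:                            # threshold crossing
            spikes += 1
            v = v_reset                              # reset after the spike
    return spikes

# Rate coding: the spike count over one second increases with the amplitude
# of the injected current.
for i_inj in (1.8e-9, 2.5e-9, 3.5e-9):
    print(f"I = {i_inj:.1e} A  ->  {lif_spike_count(i_inj)} spikes in 1 s")
```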

B. Segmentation and Clustering

SNNs use the precise firing times of neurons for information coding. Various models, such as threshold-and-fire and the spike response model, can be used for segmentation, which involves three basic issues: network architecture, information coding and the learning method [6]. For this, a fully connected feedforward network with temporal coding and a winner-takes-all algorithm may be considered [7] in order to generate a learning window that defines various parameters, such as the membrane potential, with respect to a Gaussian function. Moreover, SNNs add a new dimension, the temporal axis, to the representation capacity and processing abilities of neural networks, which can be applied effectively in image segmentation. The detection of transient responses, i.e. non-stationarities, that arise in a varying and small fraction of the total number of neural spike trains recorded from a chronically implanted multielectrode grid becomes complex as the number of electrodes rises. A novel application of an unsupervised neural network for clustering neural spike trains with transient responses has been presented [8]. This network can be constructed by incorporating projective clustering into an adaptive-resonance-type neural network architecture, resulting in a PART neural network. Since comparisons are made between inputs and learned patterns using only a subset of the total number of available dimensions, PART neural networks are ideally suited to the detection of transients. It has been shown that PART neural networks are an effective tool for clustering neural spike trains that is easily implemented, computationally inexpensive, and well suited for detecting neural responses to dynamic environmental stimuli.
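To make the winner-takes-all step concrete, the following sketch is an assumed, much-simplified version of temporally coded unsupervised clustering in the spirit of [7]: each output neuron fires earlier the closer its weight vector is to the input, the earliest-firing neuron wins, and only the winner's weights are moved towards the input. The distance-to-latency mapping, learning rate and toy data are assumptions for the example, not the actual learning-window update of [7].

```python
# Illustrative sketch (assumed simplification, not the algorithm of [7]):
# winner-takes-all over spike times. Each output neuron fires at a time that
# depends on how well the input matches its weight vector; the neuron that
# fires first wins and its weights are nudged towards the input, which is the
# essence of unsupervised clustering with temporally coded SNNs.
import numpy as np

rng = np.random.default_rng(0)

def firing_times(x, weights, t_max=10.0):
    """Earlier firing for output neurons whose weights are closer to x."""
    distances = np.linalg.norm(weights - x, axis=1)
    return t_max * distances / (distances.max() + 1e-12)

def wta_step(x, weights, lr=0.1):
    """One winner-takes-all update: the earliest-firing neuron moves towards x."""
    t = firing_times(x, weights)
    winner = int(np.argmin(t))            # neuron with the earliest spike
    weights[winner] += lr * (x - weights[winner])
    return winner

# Toy data: two clusters in 2-D, two output neurons.
data = np.vstack([rng.normal([0, 0], 0.1, (50, 2)),
                  rng.normal([1, 1], 0.1, (50, 2))])
weights = rng.uniform(0, 1, (2, 2))
for x in rng.permutation(data):
    wta_step(x, weights)
print("learned cluster centres:\n", weights)
```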

C. Coding Techniques

The speciality of SNNs lies in the use of temporal coding [10,13] to pass information between network units. Using such codes allows the transmission of a large amount of data with only a few spikes, simply a one or a zero for each neuron involved in the specific processing task. The real challenge is how to encode analog information into a spike train. Moreover, this is not the only problem encountered in using spiking neural networks (SNN); different parameters and functions also have to be chosen. In one investigation, among several spiking neuron models, the spike response model (SRM) was chosen and applied to phonetic classification [9] using phonemes from the TIMIT database.
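One commonly used way of turning an analog value into spike times, in the spirit of the population temporal coding used in [7], is to pass the value through a bank of Gaussian receptive fields and let strongly activated neurons fire early while weakly activated ones fire late. The number of fields, their widths and the 10 ms coding window in the sketch below are assumptions chosen for illustration.

```python
# Illustrative sketch: encoding an analog value into spike times with a
# population of Gaussian receptive fields (in the spirit of the temporal
# population coding used in [7]). Field centres, widths and the 10 ms coding
# window are assumptions chosen for the example, not values from the paper.
import numpy as np

def gaussian_receptive_field_encode(value, n_fields=8, t_window=10.0,
                                    v_min=0.0, v_max=1.0):
    """Return one spike latency (ms) per encoding neuron; strong response -> early spike."""
    centres = np.linspace(v_min, v_max, n_fields)
    width = (v_max - v_min) / (n_fields - 1)
    response = np.exp(-0.5 * ((value - centres) / width) ** 2)  # activation in 0..1
    return t_window * (1.0 - response)   # high activation fires early, low fires late

print(np.round(gaussian_receptive_field_encode(0.3), 2))
```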

D. Algorithm for Spike Sorting

Spike-detection algorithms involve two main steps: 1) pre-emphasis of the spike and 2) application of a threshold. Three different methods of pre-emphasis, the absolute value, the nonlinear energy operator, and the stationary-wavelet-transform product [11,15], may be employed depending upon the application type, followed by the process of threshold determination.
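The two-step structure described above can be made concrete with the nonlinear energy operator (NEO), psi[n] = x[n]^2 - x[n-1]*x[n+1], as the pre-emphasis stage followed by a threshold [11,15]. In the sketch below the threshold is taken as a multiple of the mean NEO output; the scaling constant and the toy trace are assumptions for the example.

```python
# Illustrative sketch of the two-step detection described above: pre-emphasis
# with the nonlinear energy operator (NEO), psi[n] = x[n]^2 - x[n-1]*x[n+1],
# followed by a threshold. The threshold rule (a multiple of the mean NEO
# output) and the constant c are assumptions for the example.
import numpy as np

def neo(x):
    """Nonlinear energy operator of a 1-D signal."""
    psi = np.zeros_like(x, dtype=float)
    psi[1:-1] = x[1:-1] ** 2 - x[:-2] * x[2:]
    return psi

def detect_spikes(x, c=8.0):
    """Return sample indices where the NEO output crosses the threshold."""
    psi = neo(np.asarray(x, dtype=float))
    threshold = c * psi.mean()
    return np.flatnonzero(psi > threshold)

# Toy trace: noise with two artificial "spikes".
rng = np.random.default_rng(1)
trace = rng.normal(0.0, 0.1, 1000)
trace[300] += 2.0
trace[700] -= 2.0
print(detect_spikes(trace))   # detections cluster around samples 300 and 700
```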

E. SNN for Feature Extraction

These investigations reveal that a spiking neural network can be used to extract features; the associated issues relate to the extraction of onset, offset, and coincidental-firing features from spectro-temporal data [10,11]. Speech samples containing spoken isolated digits from the TI46 database were employed to demonstrate the way in which these features can be extracted using leaky integrate-and-fire spiking neurons with dynamic synapses. The flexibility provided by the additional synaptic parameters in the neuron model is demonstrated to be essential for onset, offset and coincidental-firing extraction. Recurrency and the interaction between excitation and inhibition, together with latency, are demonstrated to be a viable means of extracting offset features. For a neural recording, a single electrode usually receives electrical signals from multiple neurons simultaneously. For applications such as neural prosthetics and neuroscience research, spike sorting is a critical step in neuronal signal processing for two reasons. The first reason is scientific: adjacent cells may encode completely different information. Neuroscientists often need to know which spikes come from which neurons in order to understand the neuronal circuitry, and brain-machine interfaces often depend on single-unit activity as input. The second reason is practical: data reduction [11]. Recent advances in BMI technology allow for the recording of hundreds of channels simultaneously. At the same time, there is a growing demand for the wireless transmission of data. Communication bandwidth and power limitations necessitate on-chip data reduction before transmission, and spike sorting is one way of achieving it.
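As a rough illustration of how a dynamic synapse can make a leaky integrate-and-fire neuron selective for onsets, the sketch below drives a LIF unit through a depressing synapse: the synaptic resource is plentiful when the input switches on, so the neuron fires a short burst, and it falls silent once the resource is depleted. This is an assumed simplification, not the recurrent network with lateral inhibition used in [10], and all constants are illustrative.

```python
# Illustrative sketch (an assumed simplification of the dynamic-synapse idea
# described above, not the network of [10]): a leaky integrate-and-fire neuron
# driven through a depressing synapse fires a short burst at the ONSET of a
# sustained input and then falls silent, because the synaptic resource r is
# depleted shortly after the input switches on. All constants are assumptions.
import numpy as np

def onset_detector(rate, dt=1e-3, tau_rec=0.2, u=0.3,
                   tau_m=20e-3, gain=0.15, v_thresh=1.0):
    """Spike times (s) of a LIF neuron fed through a depressing synapse.

    `rate` is the presynaptic firing rate (Hz) at each time step.
    """
    r, v = 1.0, 0.0                                   # synaptic resource, membrane potential
    spike_times = []
    for n, s in enumerate(rate):
        i_syn = gain * u * r * s                      # depressing synaptic drive
        r += dt * ((1.0 - r) / tau_rec - u * r * s)   # depletion and recovery
        v += dt * (i_syn - v) / tau_m                 # leaky integration
        if v >= v_thresh:
            spike_times.append(round(n * dt, 3))
            v = 0.0                                   # reset after the spike
    return spike_times

# Presynaptic rate: 100 Hz during [0.2, 0.5) s and [0.7, 1.0) s, else silent.
t = np.arange(0.0, 1.0, 1e-3)
rate = 100.0 * (((t >= 0.2) & (t < 0.5)) | (t >= 0.7))
print(onset_detector(rate))   # spikes cluster just after the 0.2 s and 0.7 s onsets
```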

F. SNN for Dimensionality Reduction

In the context of spike sorting, dimensionality reduction is the process of reducing the number of features that will be used to cluster the data. Dimensionality reduction significantly reduces the required memory and the computational complexity of clustering; these reductions translate into significant reductions in the area and power of a spike-sorting chip. Dimensionality reduction [11] also reduces the output data rate of the chip when it is configured to output features only, and it improves the accuracy of clustering. To obtain the best performance at the lowest hardware cost, the optimal dimensionality (i.e., the optimal number of feature coefficients to keep for clustering) must be determined in terms of accuracy and hardware cost.
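A simple way to choose which feature coefficients to keep, in the spirit of the spread-based maximum-difference selection discussed in the conclusion (the exact rule of [11] may differ), is to rank coefficients by how much they vary across the detected spikes and keep the top k, as sketched below with toy data.

```python
# Illustrative sketch: keep only the k feature coefficients that vary most
# across detected spikes, in the spirit of spread-based ("maximum difference")
# selection; the exact rule used in [11] may differ. Here the per-feature
# spread is measured as max minus min across spikes.
import numpy as np

def select_top_k_features(features, k=3):
    """features: (n_spikes, n_features). Returns (kept indices, reduced matrix)."""
    spread = features.max(axis=0) - features.min(axis=0)   # per-coefficient spread
    keep = np.argsort(spread)[::-1][:k]                    # k most informative coefficients
    return keep, features[:, keep]

# Toy example: 200 "spikes" with 16 feature coefficients, only a few informative.
rng = np.random.default_rng(2)
features = rng.normal(0.0, 0.05, (200, 16))
features[:100, [2, 7, 11]] += 1.0        # one cluster separates on coefficients 2, 7, 11
keep, reduced = select_top_k_features(features, k=3)
print("kept coefficients:", keep, "reduced shape:", reduced.shape)
```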

III. Conclusion & Future Directions

Over the last few decades, the emphasis in artificial neural networks has shifted towards spiking neural networks (SNN). Driven by biological discoveries, much of the research has progressed towards pulse-coded neural networks in which spike timing carries information. Each SNN model [2] uses a different neurodynamic approach to describe information processing by the brain. Many simplified models have been proposed as variants of the integrate-and-fire model, for example for an intrinsically bursting neuron and for a fast-spiking neuron [3], which consider refractory periods. Such models may be compared (Table I) and used for specific applications. Apart from this, a distinctive behaviour of these SNNs is their use of temporal coding for the transfer of information between network units; this coding [4] method allows the transmission of a large amount of data with only a few spikes. Biological neurons communicate via sequences of calibrated pulses or spikes. An SNN weights and sums the input spikes from presynaptic neurons, yielding a time-dependent value called the membrane potential, which decays when no spikes are received. If, however, incoming spikes excite the membrane potential beyond a threshold, a spike is emitted and transmitted through the axon, via synapses [10], to other neurons. After the emission of a spike the neuron is unable to spike again for a certain period called the refractory period. Recent work also proposes that the computational power of a neural network based on temporal coding by spike trains [8] provides a powerful computational tool. The dynamics of a homogeneous population of spiking neurons leads to rapid signal transmission [5]; the cut-off frequency of this transmission depends on the noise level and on synaptic time constants rather than on the membrane time constant.


A further development in this area is a model that reproduces the spiking and bursting behaviour of known types of cortical neurons [2,3]. The model combines the biological plausibility of Hodgkin-Huxley-type dynamics with the computational efficiency of integrate-and-fire neurons [2,9], with which one can simulate tens of thousands of spiking cortical neurons in real time; it is also applicable to feature extraction, dimensionality reduction and clustering applications [7,12]. A few more applications, such as brain-machine interfaces [11,14], require hardware spike sorting in order to obtain single-unit activity and perform data reduction for wireless data transmission. Such systems must be low-power, low-area, high-accuracy, automatic, and able to operate in real time. Several detection, feature-extraction, and dimensionality-reduction algorithms for spike sorting have been described and evaluated in terms of accuracy versus complexity. The nonlinear energy operator may be used as the spike-detection algorithm, being the most robust to noise and relatively simple. Discrete derivatives can be applied as the feature-extraction method, maintaining high accuracy across signal-to-noise ratios with a complexity an order of magnitude less than that of traditional methods such as principal-component analysis. The maximum-difference algorithm gives the best results as the dimensionality-reduction method in hardware spike sorting. Also, feature optimization can be considered an important preprocessing tool in the classification task.
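The simple model of [2] referred to above is governed by v' = 0.04v^2 + 5v + 140 - u + I and u' = a(bv - u), with the after-spike reset v <- c, u <- u + d once v reaches +30 mV. The sketch below simulates a regular-spiking neuron with the commonly quoted parameter set; the injected current and simulation length are assumptions for the example.

```python
# Minimal simulation of the simple spiking-neuron model of [2]:
#   v' = 0.04 v^2 + 5 v + 140 - u + I,   u' = a (b v - u),
#   with reset v <- c, u <- u + d when v reaches +30 mV.
# The regular-spiking parameters (a, b, c, d) are the commonly quoted values;
# the input current and simulation length are assumptions for the example.
def izhikevich(i_inj=10.0, t_sim=500.0, dt=0.5,
               a=0.02, b=0.2, c=-65.0, d=8.0):
    """Return spike times (ms) of a regular-spiking Izhikevich neuron."""
    v, u = c, b * c
    spike_times = []
    for n in range(int(t_sim / dt)):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + i_inj)
        u += dt * a * (b * v - u)
        if v >= 30.0:                 # spike cutoff and after-spike reset
            spike_times.append(n * dt)
            v, u = c, u + d
    return spike_times

print(izhikevich())                   # regular spiking with spike-frequency adaptation
```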


IV. Limitations

In the field of real neural systems, the advancement in computer power has made it possible to simulate large populations in order to study their behaviour. The elaboration and application of more detailed neuron models to engineering tasks is still in its infancy, and much work remains to make the present work more tractable. Although this review contributes towards a single-synapse approach for reducing the computational cost, it is only a beginning in the field of real neural systems and cognitive science. Despite the broad range of possibilities, this work is limited to the use of multiobjective optimization techniques to reconfigure spiking neuron models.


V. Implications

In view of the above literature survey, it is observed that spiking neural networks (SNN) are a more accurate model of biological neural networks. Fewer neurons are needed to accomplish the same task, and any function can be approximated with this model. However, precise timing of individual spikes is essential for efficient computation and hence needs attention. It is therefore still an open problem to determine efficient neural learning mechanisms that enable implementation of these particular time-coding schemes. Moreover, none of the proposed modifications enables learning of patterns composed of more than one spike per neuron. Another disadvantage, common to all algorithms, is that the computation is very time consuming. The main challenge is to discover efficient learning rules that might take advantage of the specific features of SNNs while keeping the properties of traditional connectionist models. One important fact that the image-processing case studies highlight is that traditional preprocessing techniques do not provide optimal front-ends and back-ends for subsequent SNN processing, and this is therefore a promising area of research.

VI. References

[1]. Z. Nenadic and B. K. Ghosh, "Computation with biological neurons," Proceedings of the American Control Conference, vol. 1, pp. 257-262, 2001.
[2]. E. M. Izhikevich, "Simple model of spiking neurons," IEEE Transactions on Neural Networks, vol. 14, no. 6, pp. 1569-1572, Nov. 2003.
[3]. S. Inawashiro, S. Miyake and M. Ito, "Spiking neuron models for regular-spiking, intrinsically bursting and fast-spiking neurons," Proceedings of the 6th International Conference on Neural Information Processing (ICONIP '99), vol. 1, pp. 32-36, 1999.
[4]. N. Langlois, P. Miché and A. Bensrhair, "Analogue circuits of a learning spiking neuron model," Proceedings of the International Joint Conference on Neural Networks (IJCNN 2000), vol. 4, pp. 485-489, Jul. 2000.
[5]. W. Gerstner, "Rapid signal transmission by populations of spiking neurons," Proceedings of the Ninth International Conference on Artificial Neural Networks (ICANN 99), vol. 1, pp. 7-12, Sep. 1999.
[6]. B. Meftah, A. Benyettou, O. Lezoray and W. QingXiang, "Image clustering with spiking neuron network," IEEE, 2008.
[7]. S. M. Bohte, H. La Poutre and J. N. Kok, "Unsupervised clustering with spiking neurons by sparse temporal coding and multi-layer RBF networks," IEEE Transactions on Neural Networks, vol. 13, no. 2, 2002.
[8]. J. D. Hunter, J. Wu and J. G. Milton, "Clustering neural spike trains with transient responses," Proceedings of the 47th IEEE Conference on Decision and Control, Cancun, Mexico, Dec. 9-11, 2008.
[9]. S. E. Lacheheb and A. Benyettou, "Phonetic classification with spiking neural network using a gradient descent rule," Second International Conference on Computer and Electrical Engineering, IEEE Computer Society, 2009.
[10]. C. Glackin, L. Maguire and L. McDaid, "Feature extraction from spectro-temporal signals using dynamic synapses, recurrency and lateral inhibition," IJCNN 2010, pp. 1-6, 2010.
[11]. S. Gibson, J. W. Judy and D. Markovic, "Technology-aware algorithm design for neural spike detection, feature extraction, and dimensionality reduction," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 18, no. 5, Oct. 2010.
[12]. O. Booij, "Temporal pattern classification using spiking neural networks," Master's thesis, Artificial Intelligence (specialization Intelligent Autonomous Systems), University of Amsterdam, 2004.
[13]. B. Schrauwen and J. Van Campenhout, "BSA, a fast and accurate spike train encoding scheme," Proceedings of the International Joint Conference on Neural Networks, vol. 4, pp. 2825-2830, 2003.
[14]. I. Obeid and P. D. Wolf, "Evaluation of spike-detection algorithms for a brain-machine interface application," IEEE Transactions on Biomedical Engineering, vol. 51, no. 6, pp. 905-911, Jun. 2004.
[15]. S. Mukhopadhyay and G. Ray, "A new interpretation of nonlinear energy operator and its efficacy in spike detection," IEEE Transactions on Biomedical Engineering, vol. 45, no. 2, pp. 180-187, Feb. 1998.

Table I: Comparison of the neuro-computational properties of spiking and bursting models
(Each entry lists: type of SNN model; number of computational units, where reported; preprocessing technique used / algorithm applied; application; results.)

[1] Hodgkin & Huxley type model. Computational units: 400. Algorithm: optimization algorithm to optimize the synaptic weights. Application: building networks capable of exhibiting collective dynamics and rhythms, with excitatory and inhibitory neurons. Results: a biological neural network that a) performs recursive computation and b) solves differential equations; synthesis of a memory with 40 computational units; encoding of analog signals.

[2] Model combining the biological plausibility of Hodgkin-Huxley-type dynamics with the computational efficiency of integrate-and-fire neurons. Computational units: 1000. Application: with this model one can simulate tens of thousands of spiking cortical neurons in a pulse-coupled network in real time. Results: simulation using 1000 spiking neurons, showing alpha-band activity at 8 to 10 Hz and gamma-band activity up to 40 Hz.

[3] Integrate & fire model for the intrinsically bursting (IB) and fast-spiking (FS) neuron. Algorithm: a) AHP (after-hyperpolarisation) and b) ADP (after-depolarisation) of the membrane potential, followed by its temporal development. Application: the IB and FS neurons are used in the simulation of a function in a particular region of the brain. Results: regular spiking, a single spike; intrinsically bursting, 250 Hz, 20 mV for 15 ms; fast spiking, 400 Hz, 30 mV for 25 ms.

[4] SNN model: integrate & fire model. Algorithm: weights are modified to increase the input potential and threshold, followed by temporal encoding. Application: as a powerful computational tool (integrator circuit). Results: information coding using the timing of spikes at frequencies of 1 to 10 kHz, with increased computational speed and threshold voltage.

[5] Spike response model. Algorithm: probability-density theory for the exact firing time of a neuron on application of an input potential, considering escape noise; winner-takes-all algorithm. Application: anti-correlations lower the noise spectrum in a given frequency range, resulting in less noise. Results: signal transmission is faster and noise-free with only 100 neurons.

[6] Basic spike response model / threshold-and-fire. Algorithm: a) Hebbian learning rule and b) winner-takes-all algorithm. Application: image segmentation. Results: a) mean square error = 115.596, b) peak SNR = 91.357%, c) mean absolute error = 7.712, d) normalized colour difference = 0.038.

[7] Basic spike response model: leaky integrate & fire neuron. Algorithm: a) unsupervised clustering (spiking RBF, k-means, self-organizing maps) and b) hierarchical clustering, tuning the number of neurons to the number of components and clusters to achieve correct classification. Application: Fisher's Iris data set. Results: the model performs unsupervised clustering effectively using spike-time coding and neural information processing.

[8] Adaptive-resonance-type neural network with Hodgkin-Huxley neurons. Algorithm: projective clustering in an ART network (PART clustering algorithm). Application: clustering of neural spike trains, used for spike sorting and for classification of parameters related to the frequency content of EMG and EEG signals. Results: the developed PART network provides a powerful method for clustering neural spike trains that is easily implemented, computationally inexpensive and well suited to identifying transient responses in neural spike trains.

[9] Basic spike response model: leaky integrate & fire neuron. Computational units: 13 input neurons and 26 hidden-layer neurons. Algorithm: gradient-descent algorithm. Application: phonetic classification. Results: recognition rates for different phonemes in the range 49.01% to 67.30%; the encoding and learning algorithm also affect the convergence for different phonemes, and the classification results are preliminary, a trial phase that takes into account the effect of noise through the TIMIT database.

[10] Leaky integrate & fire model. Algorithm: a) spiker algorithms for A-to-D conversion (1. Hough spiker algorithm, 2. Ben's spiker algorithm) and b) spike-timing-dependent plasticity (STDP) algorithm. Application: silence-feature removal and boosting the threshold of the speech sample. Results: an SNN capable of learning and recognizing spectro-temporal patterns.

[11] Basic spike response model. Algorithm: a) nonlinear-energy-operator spike-detection algorithm (spike sorting), b) discrete-derivative method (feature extraction) and c) maximum-difference algorithm (dimensionality reduction). Application: brain-machine interface neural-signal processing, delivering output with less area, low power and high accuracy. Results: area scales linearly with the number of channels, less than 12 sq. mm for 100 channels.