
A New Radial Basis Function Networks Structure: Application to Time Series Prediction

I. Rojas, H. Pomares, J. Gonzalez, E. Ros, M. Salmeron, J. Ortega, A. Prieto
Department of Architecture and Computer Technology, University of Granada, Spain

Abstract. This article describes a new structure for creating an RBF neural network; the new structure has four main characteristics. First, the RBF network architecture uses regression weights in place of the constant weights normally used; these regression weights are assumed to be functions of the input variables. Second, the activations of the hidden neurons are normalized (a weighted average) before being aggregated, which, as observed by various authors, produces better results than the classical weighted-sum architecture. Third, a new type of nonlinear function is proposed: the pseudo-gaussian basis function (PGBF). With this, the neural system gains flexibility, as each neuron possesses an activation field that need not be symmetric with respect to the centre or to the location of the neuron in the input space. Finally, as the fourth feature, we propose a sequential learning algorithm that can adapt the structure of the network, making it possible to create new hidden units and to detect and remove inactive ones.

I. INTRODUCTION

The output of the classical RBF neural network is defined as a linear combination of the radial basis function layer, as follows:

$$ F(x) = \sum_{i=1}^{K} w_i \, a_i(x) $$

where the radial basis functions $a_i$ are nonlinear functions, usually Gaussians [4]. The structure of the proposed neural system is modified by using a pseudo-gaussian function (PG) in which two scaling parameters, one for each side of the centre, are introduced; this eliminates the symmetry restriction and provides the neurons in the hidden layer with greater flexibility for function approximation. Other important characteristics of the proposed neural system are that the activation of the hidden neurons is normalized and that, instead of a single constant parameter, the output weights are functions of the input variables, which leads to a significant reduction in the number of hidden units compared with the classical RBF network. The output $F_{RBF}$ of the neural network is:

$$ F_{RBF}(x) = \frac{\displaystyle\sum_{i=1}^{K} w_i(x) \, a_i(x)}{\displaystyle\sum_{i=1}^{K} a_i(x)}, \qquad \text{with } w_i(x) = \mathbf{b}_i^{\top} \tilde{x} $$

where $\tilde{x} = (1, x^1, \dots, x^n)$, so that each output weight is a linear (regression) function of the input variables.
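A minimal NumPy sketch of this normalized output with regression weights may clarify the structure. Symmetric Gaussian activations are used here for brevity (the pseudo-gaussian variant is introduced below), and the function name and parameter layout are illustrative, not taken from the paper:

```python
import numpy as np

def normalized_rbf_output(x, centers, widths, B):
    """Normalized RBF output with regression weights.

    x       : (d,) input vector
    centers : (K, d) RBF centres
    widths  : (K,) widths (symmetric Gaussians for simplicity)
    B       : (K, d+1) regression coefficients;
              w_i(x) = B[i, 0] + B[i, 1:] @ x
    """
    # Gaussian activations of the K hidden units
    a = np.exp(-np.sum((x - centers) ** 2, axis=1) / widths ** 2)
    # Regression weights: linear functions of the input, not constants
    w = B[:, 0] + B[:, 1:] @ x
    # Weighted average: normalize the hidden activations before aggregating
    return np.dot(w, a) / np.sum(a)
```

Because the activations are normalized, if every regression weight returns the same value c, the network output is exactly c regardless of the input, which is one way to see why the weighted-average form behaves more smoothly than the plain weighted sum.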

The output of a hidden neuron is computed, in each input dimension $v$, using the pseudo-gaussian function:

$$ a_i^v(x^v) = \exp\!\left(-\frac{(x^v - c_i^v)^2}{(\sigma_{i,-}^v)^2}\right) U(x^v; -\infty, c_i^v) \; + \; \exp\!\left(-\frac{(x^v - c_i^v)^2}{(\sigma_{i,+}^v)^2}\right) U(x^v; c_i^v, \infty) $$

449 0-7695-0619-4/00 $10.00 © 2000 IEEE

where:

$$ U(x^v; a, b) = \begin{cases} 1 & \text{if } a \le x^v < b \\ 0 & \text{otherwise} \end{cases} $$
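A one-dimensional sketch of the pseudo-gaussian, following the definition above (parameter names are illustrative): the indicator $U$ selects which one-sided Gaussian applies, so the two branches share the value 1 at the centre and the function stays continuous while its width differs on each side.

```python
import numpy as np

def pseudo_gaussian(x, c, sigma_left, sigma_right):
    """Pseudo-gaussian basis function (PGBF): a Gaussian whose width
    differs on each side of the centre c, removing the symmetry
    restriction of the classical Gaussian RBF."""
    x = np.asarray(x, dtype=float)
    # U(x; -inf, c): left-hand one-sided Gaussian
    left = np.exp(-((x - c) ** 2) / sigma_left ** 2) * (x < c)
    # U(x; c, +inf): right-hand one-sided Gaussian
    right = np.exp(-((x - c) ** 2) / sigma_right ** 2) * (x >= c)
    return left + right
```

With sigma_left != sigma_right the activation field is asymmetric about c, which is the flexibility the PGBF adds over the standard Gaussian unit.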
