University of Otago Te Whare Wananga O Otago
Dunedin, New Zealand
FuNN/2 – A Fuzzy Neural Network Architecture for Adaptive Learning and Knowledge Acquisition Nikola K. Kasabov Jaesoo Kim Michael J. Watts Andrew R. Gray
The Information Science Discussion Paper Series Number 96/23 December 1996 ISSN 1172-455X
University of Otago Department of Information Science

The Department of Information Science is one of six departments that make up the Division of Commerce at the University of Otago. The department offers courses of study leading to a major in Information Science within the BCom, BA and BSc degrees. In addition to undergraduate teaching, the department is also strongly involved in postgraduate programmes leading to the MBA, MCom and PhD degrees. Research projects in software engineering and software development, information engineering and database, artificial intelligence/expert systems, geographic information systems, advanced information systems management and data communications are particularly well supported at present.

Discussion Paper Series Editors

Every paper appearing in this Series has undergone editorial review within the Department of Information Science. Current members of the Editorial Board are:

Mr Martin Anderson, Dr George Benwell, Dr Nikola Kasabov, Dr Geoff Kennedy, Dr Martin Purvis, Professor Philip Sallis, Dr Hank Wolfe
The views expressed in this paper are not necessarily the same as those held by members of the editorial board. The accuracy of the information presented in this paper is the sole responsibility of the authors.

Copyright

Copyright remains with the authors. Permission to copy for research or teaching purposes is granted on the condition that the authors and the Series are given due acknowledgment. Reproduction in any form for purposes other than research or teaching is forbidden unless prior written permission has been obtained from the authors.

Correspondence

This paper represents work to date and may not necessarily form the basis for the authors' final conclusions relating to this topic. It is likely, however, that the paper will appear in some form in a journal or in conference proceedings in the near future. The authors would be pleased to receive correspondence in connection with any of the issues raised in this paper. Please write to the authors at the address provided at the foot of the first page. Any other correspondence concerning the Series should be sent to:

DPS Co-ordinator
Department of Information Science
University of Otago
P O Box 56
Dunedin
NEW ZEALAND
Fax: +64 3 479 8311
email:
[email protected]
FuNN/2 - A Fuzzy Neural Network Architecture for Adaptive Learning and Knowledge Acquisition

Nikola K. Kasabov, Jaesoo Kim, Michael J. Watts, Andrew R. Gray

Department of Information Science
University of Otago, P.O. Box 56, Dunedin, New Zealand
Phone: +64 3 479 8319, Fax: +64 3 479 8311, email: nkasabov@otago.ac.nz
Abstract

Fuzzy neural networks have several features that make them well suited to a wide range of knowledge engineering applications. These strengths include fast and accurate learning, good generalisation capabilities, excellent explanation facilities in the form of semantically meaningful fuzzy rules, and the ability to accommodate both data and existing expert knowledge about the problem under consideration. This paper investigates adaptive learning, rule insertion, rule extraction, and neural/fuzzy reasoning for a particular model of a fuzzy neural network called FuNN. As well as providing for representing a fuzzy system with an adaptable architecture, FuNN also incorporates a genetic algorithm in one of its adaptation modes. A modified version of FuNN, FuNN/2, which employs triangular membership functions and correspondingly modified learning and adaptation algorithms, is also presented in the paper.
1. Introduction

Different architectures of fuzzy neural networks (FNN) have been proposed and used for solving classification, prediction and control problems [1-7]. The FNN has become a knowledge engineering technique successfully used for various applications. Some recent publications suggest new methods for learning and tuning fuzzy rules, as well as methods for training FNNs in order to adjust to dynamically changing data or situations [4, 5, 13].

In several publications [3,4,5] a FNN called FuNN, which stands for Fuzzy Neural Network, was introduced, along with algorithms for learning, adaptation and rules extraction, the first principles of the model, and one version of its implementation. Here the general architecture of FuNN and a particular implementation called FuNN/2 are presented. The paper investigates several learning and adaptation strategies associated with the architecture, as well as some rule extraction and rule insertion algorithms. One of the unique aspects of FuNN is that it combines several paradigms in one system, i.e. neural networks, fuzzy logic and genetic algorithms; this multi-paradigm approach has produced good results on a wide range of problems.
2. The FuNN Architecture

FuNN is a model of a fuzzy neural network. The architecture facilitates learning from data, fuzzy rule insertion and rule extraction, and approximate reasoning, as well as several methods of adaptation. It allows for the combination of both data and rules in one system, thus producing the synergistic benefits of the two sources, and it is designed to be used in a dynamically changing (eventually agent-based, distributed) environment, in which experimentation with different learning and adaptation strategies can be carried out and new data can be used to further evaluate and adapt the system. Considerable development of the FuNN model is presented here in the form of a particular architecture, FuNN/2, and of the learning and adaptation methods associated with it.

FuNN uses a multi-layer perceptron (MLP) feedforward network and a modified backpropagation training algorithm [3,4,5]. The general architecture of FuNN consists of 5 layers of neurons, as shown in figure 1. It is an adaptable FNN in which the fuzzy predicates (the membership functions), as well as the rules inserted before training or adaptation, may adapt and change according to the training data, with full or partial adaptation of the memberships as required. Below, a brief description of the components of this architecture is given.
2.1 The Input and the Condition Element Layers
The input layer of neurons represents the input variables, which enter the network as crisp values, and the condition element layer, which performs fuzzification, is attached to it. In FuNN/2 the fuzzification is implemented using three-point triangular membership functions. The triangles are represented by their centers, which are attached as weights to the connections from the input layer to the condition element layer; each triangle is completed by the centers of the two adjacent membership functions, with the first and the last membership functions shouldered at the minimum and maximum values of the variable.

Using triangular membership functions makes the fuzzification fast without compromising the accuracy of the membership representation. Any given input value will belong to a maximum of two membership functions, with differing membership degrees that always sum up to one. Fuzzification will always involve two membership functions firing, unless the input value falls exactly on a center, in which case a single membership function will be activated; this is unlikely given floating point variables, as exact equality with a center will rarely exist for any given input value. This center-based approach also gives the opportunity for more flexible, non-symmetrical membership functions, and it avoids the problems of uncovered regions in the input space, where no rules would fire and all membership degrees would be zero. The same representation is used for the defuzzification solution at the output. A minimal sketch of this fuzzification scheme is given below.
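The following Python fragment is a minimal sketch of this center-based fuzzification; the function and variable names are ours, not those of the FuNN/2 implementation:

    def fuzzify(x, centers):
        """Center-based triangular fuzzification (sketch).

        `centers` is a sorted list of membership-function centers.
        At most two neighbouring functions fire, their degrees sum
        to one, and the first and last functions are shouldered.
        """
        degrees = [0.0] * len(centers)
        if x <= centers[0]:            # left shoulder
            degrees[0] = 1.0
        elif x >= centers[-1]:         # right shoulder
            degrees[-1] = 1.0
        else:
            for j in range(len(centers) - 1):
                c0, c1 = centers[j], centers[j + 1]
                if c0 <= x <= c1:
                    degrees[j] = (c1 - x) / (c1 - c0)
                    degrees[j + 1] = (x - c0) / (c1 - c0)
                    break
        return degrees

    # Example: three labels (small, medium, large) over [0, 1]
    print(fuzzify(0.25, [0.0, 0.5, 1.0]))   # -> [0.5, 0.5, 0.0]

Note that the two non-zero degrees always sum to one, as described above.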
[Figure 1 appears here. It shows the five-layer FuNN structure: real values enter the input layer; DI connections lead to the condition elements layer, with function triplet (SC, aC, OC); the rule layer nodes R1, ... carry the triplet (SR, aR, OR) and connect through CF connections to the action elements layer A1, B1, ..., with triplet (SA, aA, OA); real values leave through the output layer, with triplet (SO, aO, OO). Pre-set connections have an initial value of zero, and optional connections are marked.]
Figure 1. A FuNN structure for two initial fuzzy rules: R1: IF x1 is A1 (DI11) and x2 is B1 (DI12) THEN y is C1 (CF1); R2: IF x1 is A2 (DI21) and x2 is B2 (DI22) THEN y is C2 (CF2), where the DIs are degrees of importance attached to the condition elements and the CFs are confidence factors attached to the consequent parts of the rules (adopted from [3,4,5]). The triplets attached to the layers represent their specific summation, activation and output functions.
These centers are initially spaced equally over the input space when the network is constructed, so that the memberships give full coverage of the space. Where expert knowledge about an input variable is available, the centers and widths of its membership functions could instead be initialised from that knowledge; the equally spaced, normalised memberships are simply a natural default, used when nothing more can be assumed, and they require fewer preprocessing operations for this part of the FuNN.

While the weight adaptation algorithms are efficient, they do not by themselves respect the semantic meaning of the centers. The weights representing the centers can therefore be constrained so that the membership functions always maintain their spatial ordering within the input space while adaptation is taking place: the center representing, say, low must always remain less than the center for medium, which in turn must remain less than the center for high. Under such constraining rules the labels attached to the membership functions keep their meaning transparently, and the rules formulated over them remain interpretable. One possibility is to force each center to remain within a certain fixed region of the input space; an alternative is to limit each center by its current neighbours, so that centers may move but not cross (see the sketch below). In many cases the constrained approach seems more natural, although training without any such restrictions is also allowed.

It should be noted that the neurons in the condition element layer can take activation values in the range [0,1] only, and that a normalisation of the weights can be performed in the algorithm at weight initialisation. The layer is also potentially expandable during adaptation: more membership functions, and so more neurons, can be added when a finer partitioning of the input space is required. Simple procedures are used to perform the fuzzification of the input variables (see appendix A).
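A minimal sketch of the constraining option just described follows; this is our own illustration, not the exact FuNN/2 update rule. A proposed movement of each center is clipped so that centers never cross their neighbours:

    def constrain_centers(centers, deltas, margin=1e-6):
        """Apply center movements while preserving semantic ordering.

        Each center may move but may not cross its neighbours, so
        labels such as low/medium/high keep their meaning (sketch).
        """
        new = list(centers)
        for j, d in enumerate(deltas):
            moved = new[j] + d
            if j > 0:
                moved = max(moved, new[j - 1] + margin)       # not past left neighbour
            if j < len(new) - 1:
                moved = min(moved, centers[j + 1] - margin)   # not past right neighbour
            new[j] = moved
        return new

    # The middle center tries to move by 0.6 but is stopped just
    # short of the right neighbour's position.
    print(constrain_centers([0.0, 0.5, 1.0], [0.0, 0.6, 0.0]))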
An important aspect of this layer is that it allows for very different types of inputs within the one network, each with its own membership functions. For a simple example, temperature may be divided into seven different membership functions ranging from cold to hot, while a variable such as whether or not it is a public holiday (a binary, yes/no quantity) can be represented using just two singleton membership functions, for yes and no. The same principle applies to the output variables, where differing numbers of membership functions, each with its corresponding semantic meaning, can be attached to different variables. The condition element layer, like the action element layer, is also potentially expandable and shrinkable, so that membership functions can be added or removed when this is desirable.

A gain coefficient g is associated with the sigmoidal logistic activation function used in the network's inner layers. A large gain makes the function close to a hard thresholding function, while smaller values give softer boundaries and thus more fuzziness; the gain value 2.19722 provides activation values from 0.1 to 0.9 when the net input of a node lies in the interval between -1 and +1. It must be remembered, however, that the connection weights in this architecture represent semantically meaningful components of rules, and this meaning must be preserved as the network adapts: the values of the connections from the condition elements to a rule node represent the degrees to which the corresponding antecedent membership functions matter for that rule.
2.2 Rule Layer

Each node in the rule layer represents a single fuzzy rule. The connection weights from the condition element layer to a rule node indicate the degrees of importance of the corresponding condition elements for the activation of this node, and the activation of the node itself represents the degree to which the input data matches the antecedent part of the rule. Interpreting the connections and the nodes in this way is one component of the synergistic, fuzzy-neural nature of the architecture.

The values of the synaptic weights in this layer can be limited during training by introducing a non-linearity, a saturation, into the weight updating. This option mimics a biologically plausible phenomenon [12]. For the rule layer a weight limitation to the interval [-1,1] would be suitable, since the input membership degrees only allow values in a bounded range and weights with very high values are not necessary. Such a limitation further enhances the interpretation of the rules: the rules remain semantically meaningful, whereas it is difficult to interpret a rule in which some weights have very high values.

A bias for the rule nodes is optional; with the weight limiting option the bias weight is removed, and the gain factor of the activation function is set to ensure that saturation problems do not occur. For example, the gain factor 2.19722 maps a net input in [-1,1] to activation values in [0.1,0.9], as illustrated in the sketch below; for a node with many inputs, say 16, with weights all between -1 and 1, the net input signal would span a wider interval, and the gain factor would be adjusted so that this form of normalisation is achieved without the necessity for a bias.
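The role of the gain factor and of weight limiting can be illustrated numerically. This is a sketch; the only value taken from the text is the gain 2.19722, which equals ln 9:

    import math

    def logistic(net, gain):
        """Sigmoidal logistic activation with a gain coefficient."""
        return 1.0 / (1.0 + math.exp(-gain * net))

    g = math.log(9.0)   # = 2.19722..., the gain quoted in the text
    print(round(logistic(-1.0, g), 4), round(logistic(1.0, g), 4))   # 0.1 0.9

    # Weight saturation for interpretability: clip the rule-layer
    # weights (degrees of importance) to [-1, 1] after each update.
    def clip_weights(weights, bound=1.0):
        return [max(-bound, min(bound, w)) for w in weights]

    print(clip_weights([0.5, 1.7, -2.3]))   # -> [0.5, 1.0, -1.0]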
2.3 Action and Output Layers

In the action element layer the connections from the rule nodes of the previous layer represent the confidence factors with which the rules support the corresponding output fuzzy facts. Each node in this layer represents one fuzzy linguistic label of an output variable, for example small, medium or large for a variable such as velocity, so that conceptually the layer performs a fuzzy quantisation of the output space. The membership function of a label is cut according to the activation of the node, which represents the degree to which this label is supported by the current data; so this is the level at which fuzzy output values are inferred from the current facts and the rules, and which is used during recall. The activation function is the same sigmoidal logistic function with a (variable) gain factor as in the rule layer, subject to the same constraints on the connection weights, and the gain factor and weight boundary can again be adjusted appropriately to the size of the layer.

The output layer performs a modified center of gravity (COG) defuzzification. Singletons, representing the centers of the output membership functions, are attached as the weights on the connections from the action element layer to the output layer. For example, the labels small, medium and large can be represented as connection weights having the values 0, 0.5 and 1.0 respectively, for the output range [0,1] if the output variable is normalised.

Adapting these singletons (the partitioning centers of the output space) is performed in the same way as moving the membership centers of the input variables, with constraining used here too, so that the fuzzy labels always maintain their specified ordering and the requirement that the membership degrees of a value sum up to one is satisfied. As in the case of the input, triangular membership functions are used, and different output variables can have different numbers of membership functions. For each centre, further restrictions, such as a boundary around its mean position, could be imposed on its movement, and the various requirements of a particular application should be considered here. A sketch of the COG defuzzification over singletons is given below.
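A minimal sketch of COG defuzzification over singletons, assuming the action-element activations and the singleton weights are available as plain lists:

    def cog_defuzzify(activations, singletons):
        """Modified centre-of-gravity defuzzification over singletons.

        `activations` are the action-element outputs; `singletons`
        are the label centers stored as output connection weights
        (e.g. 0.0, 0.5, 1.0 for small, medium, large on [0, 1]).
        """
        total = sum(activations)
        if total == 0.0:
            raise ValueError("no output label is activated")
        return sum(a * c for a, c in zip(activations, singletons)) / total

    # Example: 'medium' dominates, pulled slightly towards 'large'
    print(cog_defuzzify([0.1, 0.8, 0.3], [0.0, 0.5, 1.0]))   # ~0.583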
One of the main advantages of the FuNN/2 system is that the architecture is still very much along the lines of a traditional MLP, with the main departure being the modifications made for constraining the rules and the membership functions, and the linear transfer functions used where sigmoidal ones are not applicable. A slightly modified version of the standard backpropagation algorithm is therefore still applicable, and the large body of theory and results for such networks applies immediately to FuNN/2, which makes the situation much simpler than for architectures that depart further from the MLP. FuNN/2 manages to provide a natural structure for fuzzy logic transfer without unnecessarily extending the networks.

3. Adaptation of FuNN
There are three versions of FuNN adaptation [3,4,5]. These versions are:

(a) a fully adaptive version, in which the membership functions (MF) of the input and the output variables change during training along with the rules;

(b) a partially adaptive version, in which the membership functions do not change and weight updating is performed in the inner layers only;

(c) a partially adaptive version, in which changes to the membership functions are allowed but are constrained within some imposed intermediate limits, thus retaining the semantic meaning of the membership functions and of the rules.

These versions are provided within the same implementation as modes that can be switched between according to the purpose of the adaptation; they are not mutually exclusive, as whatever combination of the modes best suits a given problem can be used, with each version used at a certain time, in an iterative manner, for the training and adaptation task at hand. Some versions are suited to systems where the membership functions are known in advance and should not be altered, but adaptation of the rules is needed; others are suitable where no such knowledge is available. It may be useful, for example, to adapt the membership functions only partially, with the limitations explained above, as this constrains the adaptation without freezing it altogether. Adaptation is carried out by an extended backpropagation algorithm, a modified version of the standard algorithm that keeps the weight updates subject to the constraints on the membership functions; this algorithm is described in appendix A. A genetic algorithm can also be used for adapting the membership functions, as explained in section 5, and the most appropriate of these alternatives can be chosen for the given problem.
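One plausible way to organise the three modes in code is to gate which parameter groups receive updates. The following is our own sketch, with hypothetical parameter names and an illustrative constraint limit, not the algorithm of appendix A:

    # Sketch: gating weight updates according to the adaptation mode.
    FULLY_ADAPTIVE = "fully"              # (a) membership functions adapt freely
    PARTIAL_FROZEN = "frozen"             # (b) membership functions do not change
    PARTIAL_CONSTRAINED = "constrained"   # (c) changes kept within limits

    def apply_updates(params, grads, lr, mode):
        for name, w in params.items():
            is_mf = name in ("input_mf_centers", "output_singletons")
            if is_mf and mode == PARTIAL_FROZEN:
                continue                                   # fuzzification layers frozen
            for i in range(len(w)):
                step = -lr * grads[name][i]
                if is_mf and mode == PARTIAL_CONSTRAINED:
                    step = max(-0.01, min(0.01, step))     # illustrative limit
                w[i] += step

    params = {"input_mf_centers": [0.0, 0.5, 1.0], "rule_weights": [0.2, -0.4]}
    grads  = {"input_mf_centers": [0.0, 5.0, 0.0], "rule_weights": [0.1, 0.1]}
    apply_updates(params, grads, lr=0.1, mode=PARTIAL_CONSTRAINED)
    print(params["input_mf_centers"])   # the middle center moved only by the limit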
4. A Backpropagation Algorithm for Adaptive Training of FuNN/2
Here a simple and intuitive modification of the standard training algorithm is used: the changes made during training to the weights of the fuzzification and defuzzification layers must satisfy the constraints of section 2, so that a change of the weights for a variable x (either input or output) results only in a valid alteration of its membership functions.
5. Using Genetic Algorithms for Adaptation of Membership Functions in FuNN

Genetic algorithms (GAs) are search and optimisation procedures based on the principles of natural genetics and natural selection [9]. GAs can quickly investigate large solution spaces in order to find approximate solutions to combinatorial optimisation problems, and, by iteratively building on a population of candidate solutions, a GA can be seen as a logical and more effective extension of the manual approach of trying many different changes one at a time. The application of GAs to the problem of membership function optimisation has already been investigated with success: previous work, such as that carried out in [10], focused on generating fuzzy membership functions by learning from examples.

In FuNN the use of the GA for adaptation is different. In the normal course of adaptation the majority of the weight change occurs in the two inner connection layers, which represent the rules, while the changes to the fuzzification and the defuzzification layers are limited to small, gradual movements of the membership functions, subject to the constraints imposed on these weights (see appendix A). The effect of such movements on the concepts represented by the layers may be difficult to calculate with formulas, but the small changes of centre (and width) can be easily encoded as delta values and evaluated on the data, and this is the approach taken here; the same principles apply for both membership layers. Figure 2 shows initial membership functions and membership functions after adaptation, with the changes exaggerated in order to demonstrate the concept involved.
[Figure 2 appears here: triangular membership functions (Small, Medium, Large) plotted as membership degree against the variable, with arrows indicating the movements of the centres.]

Figure 2. Initial membership functions in FuNN (solid lines) and membership functions after adaptation; a rigid partitioning is used in the constraining case, so that the centres can move but not cross. The constraining boundaries are also indicated.
[Figure 3 appears here: a screenshot of the GA adaptation module, with controls for evaluation, selection (e.g. tournament), elitism, population size and the number of generations.]

Figure 3. The interface of the GA adaptation module.

[Figure 4 appears here: a plot of RMS error (roughly 0.0101 to 0.0106) against the generation number.]
Figure 4. The decrease of the RMS error of FuNN/2 with the increase of the number of GA generations, in full-adaptation mode, for the phoneme /e/ after 100 epochs of training.
An individual's chromosome encodes, as delta values, changes to the membership functions of the current FuNN. After the random initialisation of the population, each individual is evaluated. To evaluate an individual, its delta values are applied to a copy of the current FuNN, in which the membership functions are modified accordingly; the data file is then submitted to the modified FuNN and the Root Mean Square (RMS) error on this test data is calculated. The fitness of the individual is defined as the inverse of the RMS error, since individuals with lower errors are more fit than those with higher errors; the fitness values can optionally be normalised. A new population is then created by reproduction of the more fit individuals within the current population, using either tournament or roulette wheel selection, one point crossover, and an optional mutation phase with a user defined rate. Optional elitism, which ensures that the most fit individual is retained in the new population, can also be used in the breeding. Finally, after the GA adaptation, the network with the lower RMS error, either the original FuNN or the modified FuNN, is retained. An example of using the GA for adaptation of a FuNN/2 system trained on speech data (the English phoneme /e/) is given in fig.4; a sketch of the whole procedure follows.
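The following Python sketch mirrors the procedure described above: chromosomes of delta values, fitness as the inverse of the RMS error, tournament selection, one point crossover, optional mutation and elitism. The rms_error function here is only a stand-in for evaluating a modified copy of the network on the data file:

    import random

    def rms_error(deltas):
        """Stand-in evaluation: in FuNN/2 the deltas would be applied
        to a copy of the network and the RMS error on the data file
        computed; a dummy surface keeps the sketch runnable."""
        return sum(d * d for d in deltas) ** 0.5 + 0.01

    def tournament(scored):
        """Tournament selection: the fitter of two random individuals."""
        return max(random.sample(scored, 2), key=lambda s: s[0])[1]

    def ga_adapt(n_genes=6, pop_size=20, generations=30, mut_rate=0.1,
                 elitism=True):
        pop = [[random.uniform(-0.1, 0.1) for _ in range(n_genes)]
               for _ in range(pop_size)]
        for _ in range(generations):
            scored = [(1.0 / rms_error(ind), ind) for ind in pop]  # fitness = 1/RMS
            scored.sort(key=lambda s: s[0], reverse=True)
            new_pop = [list(scored[0][1])] if elitism else []      # optional elitism
            while len(new_pop) < pop_size:
                a, b = tournament(scored), tournament(scored)
                cut = random.randrange(1, n_genes)                 # one point crossover
                child = a[:cut] + b[cut:]
                for i in range(n_genes):                           # optional mutation
                    if random.random() < mut_rate:
                        child[i] += random.gauss(0.0, 0.02)
                new_pop.append(child)
            pop = new_pop
        return min(pop, key=rms_error)

    best = ga_adapt()
    print(round(rms_error(best), 4))   # approaches the 0.01 floor of the dummy surface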
6. Experimenting with Different Training and Adaptation Strategies for FuNN/2

Fig.5 shows the interface of the MS Windows version of FuNN/2. This implementation of FuNN/2 is part of an integrated hybrid development environment, a tool called FuzzyCOPE/2, which has been made available free from the WWW site: http://divcom.otago.ac.nz:800/COM/INFOSCI/KEL/fuzzycop.htm. FuNN/2 allows different training and adaptation strategies to be tested before the most suitable one is selected for a certain application; some of the issues involved in this testing and adaptation are discussed below.
[Figure 5 appears here: a screenshot of the MS Windows interface of FuNN/2, showing a raw data file, displays for the input, rule and output layers, and options such as fuzzifying inputs by the MFs and training with fuzzified data.]

Figure 5. The interface of the MS Windows implementation of FuNN/2.
Initialisation: The membership functions of the input variables, the membership functions of the output variables, and the two inner weight layers representing the rules all need to be initialised before training, and this can be accomplished either from existing knowledge or with the defaults. If initial expert knowledge exists, it can be used to initialise the FuNN: the fuzzification and defuzzification weights of the variables can be set from the known membership functions, and initial sets of rules can be inserted through rule insertion, in which the relative importance of the condition elements in each rule, as well as its confidence factors, are represented as weights for the inner two weight layers. In the absence of such information the defaults are used: uniformly distributed triangular MFs for the input and output variables, and small values for the inner connections.

Adaptation through training can then be accomplished with either the fully adaptive or a partially adaptive version of the training algorithm. The difference between the two options is that in the former case the system adapts both the membership functions and the rules, so the connections in all of the weight layers are subject to change, while in the latter case the membership functions remain frozen and the system adapts its fuzzy rules only. Since the default membership functions are used in the absence of other information, some small adjustments to them may well be needed, and the opportunity for this arises either with a short period of training of the new membership functions by the restricted backpropagation algorithm, or with a GA, from which the best network found is used. The sensitivity of the results to these options can be tested for a given problem.
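The default initialisation of the membership functions can be sketched as follows (the names are ours):

    def default_mf_centers(vmin, vmax, n_labels):
        """Uniformly distributed triangular MF centers over the
        variable's range: the default initialisation used when no
        expert knowledge is available (sketch)."""
        step = (vmax - vmin) / (n_labels - 1)
        return [vmin + i * step for i in range(n_labels)]

    print(default_mf_centers(0.0, 40.0, 5))   # e.g. a temperature variable
    # -> [0.0, 10.0, 20.0, 30.0, 40.0]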
Training: The backpropagation training option provides the opportunity for selecting several training and adaptation parameters. These include the gain factors for each layer, and the learning rates and momentum terms for the connection weights; the initial weights of the two inner layers are usually small, as is standard for feed-forward neural networks, or are set by rule insertion. For the genetic algorithm training the main parameters are the fitness normalisation, the population size, the number of generations, the mutation rate, the optional elitism, and the type of selection used, as explained in the previous sections. As with traditional MLPs, the learning and momentum rates must be chosen with care, since values that maximise the speed of learning on one data set may slow or destabilise training on another, and some experimentation is usually necessary.

Two further points are important for reasons specific to FuNN. The first is that the network should not learn the data too accurately, especially when dealing with small, not necessarily fully-representative, data sets: for many applications it is the generalisation capabilities of the system that matter, rather than exact function approximation (as discussed in [4]). This point also becomes important when new data arrive and adaptation to new circumstances is required, and it is discussed further below. The second is that, since FuNN implements a fuzzy system, rule insertion can be regarded as an important mechanism for both interpretation and adaptation: by placing the initial weights closer to a reasonable solution, rule insertion speeds learning through avoiding some local minima, potentially allows shorter training periods, and can give greater accuracy than training from arbitrary initial weights. When new data become available, decisions must be made
regarding how to adapt the network to the relationships expressed in this data. Depending on the size of the new data set and on the operating requirements, two general learning strategies can be distinguished and used [13]:

- aggressive adaptation, where the entire section of new data is used for further training, without using any of the old, previously used data;

- conservative adaptation, where the new data is added to a proportion of the old data and training is performed on the combined set.

An example here is that of the stock market. Most relationships in such a market tend to exhibit long term trends, and departures from market stationarity tend to be occasional, persisting only until the original relationships return; except in unusual circumstances, such as a large financial crash, the long-term pattern will eventually be restored. Conservative adaptation, with some percentage of the old data retained, suits data of this kind. However, if the underlying relationships have genuinely changed, retraining with all of the old data may not be efficient at all (and for on-line operation may not even be feasible), and a more aggressive scheme is preferable. Obviously, these concepts of aggressiveness and conservatism can be combined to produce a compromise adaptation, in which all of the new data and some amount of the old data is used for further training; the amount of old data retained depends on the operating requirements, since using a large proportion of the old data lengthens the time needed for training. A sketch of such a scheme follows.
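The following is a minimal sketch of building the adaptation training set under these strategies; the names and the random sampling choice are ours:

    import random

    def adaptation_set(old_data, new_data, old_fraction):
        """Build the training set for adaptation (sketch).

        old_fraction = 0.0 gives aggressive adaptation (new data only);
        old_fraction = 1.0 gives conservative adaptation (all old data
        retained); values in between give a compromise."""
        k = int(len(old_data) * old_fraction)
        return new_data + random.sample(old_data, k)

    old = list(range(100))          # previously used examples (placeholder)
    new = list(range(100, 120))     # newly arrived examples (placeholder)
    print(len(adaptation_set(old, new, 0.25)))   # 20 new + 25 old = 45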
7. Rule Insertion in FuNN and Rule Extraction from a Trained FuNN
Several different methods for rules extraction are applicable to the FuNN architecture; two of them have been implemented and investigated on test cases.

The first one, called REFuNN, is based on the simple idea of thresholding the connection weights, from which fuzzy rules are extracted along with their degrees of importance and confidence factors [3,4]. Each rule node in the FuNN architecture is represented as a combination of condition elements and consequent elements: the condition elements whose connection weights to the rule node exceed a preset limit form the condition part of the rule, weighted by their degrees of importance, and the consequent elements are interpreted in the same way from the connection weights to the action layer, as confidence factors. A simplified version of the rules, in which the degrees of importance are not kept, is also implemented in FuNN/2. A related procedure, called zeroing, sets to zero all connection weights which fall in a given interval [-T,T] for a threshold T; this is useful as part of the whole training procedure, since it removes the least significant connections while keeping the remaining structure of the rules intact [13]. Rules extracted with their degrees of importance can be interpreted later by a fuzzy inference engine [3,4].

A second method, the extraction of aggregated fuzzy rules, has also been implemented. Each rule node is represented by the strongest connection from a condition element node for each input variable, so that only one fuzzy term per variable appears in the condition of the rule, and the corresponding weights to the neighbouring condition elements are masked out. In the action layer, each rule node keeps only the strongest connection to a consequent (class) node, which gives the certainty degree (confidence factor) attached to the rule. This masking procedure produces rules in a simple, general format which is appropriate for explanation purposes, since many of the weaker connections are not represented. An example of a set of weighted rules extracted from a FuNN/2 structure is given in fig. 6, and fig. 7 shows the same set of rules extracted in the simple format. The extracted rules can be inserted into other FuNN/2 modules through the rule insertion mode, and can be used by fuzzy inference modules in a hybrid environment along with other connectionist modules [11]. This makes FuNN both a fuzzy system and a connectionist implementation of one; a sketch of the thresholding extraction idea follows.
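A minimal sketch of the thresholding idea behind REFuNN-style extraction, using illustrative weight matrices and label names; the data layout here is ours, not the FuNN/2 internal representation, and negative condition weights read as "is not":

    def extract_weighted_rules(w_cond_rule, w_rule_act, threshold,
                               cond_labels, act_labels):
        """Keep only connections whose magnitude exceeds the threshold,
        then read each rule node as a weighted fuzzy rule (sketch)."""
        rules = []
        for r in range(len(w_cond_rule[0])):                     # one rule per node
            conds = [(cond_labels[i], w[r]) for i, w in enumerate(w_cond_rule)
                     if abs(w[r]) > threshold]
            concl = [(act_labels[k], w_rule_act[r][k])
                     for k in range(len(act_labels))
                     if abs(w_rule_act[r][k]) > threshold]
            if conds and concl:
                rules.append((conds, concl))
        return rules

    w1 = [[2.9, -0.2], [-5.7, 1.6]]   # condition -> rule weights (2 conds, 2 rules)
    w2 = [[2.1, -0.3], [0.1, 4.9]]    # rule -> action weights (2 rules, 2 actions)
    for rule in extract_weighted_rules(w1, w2, 1.0,
                                       ["x1 is A", "x1 is B"],
                                       ["y is C", "y is D"]):
        print(rule)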
8. Conclusions

A fuzzy neural network architecture called FuNN/2 is introduced. The FuNN architecture [3,4,5] is further developed here, along with its adaptive learning algorithms, its rule insertion and rule extraction techniques, and directions for further research on the model.
[Figure 6 appears here: two weighted exemplar rules extracted from a masked FuNN/2, in which each condition and conclusion element carries a numeric weight, e.g. <... is A 2.88> and <... is not B 5.69> and ... then <... is not A 12.21> and <... not B 3.98> and <... C 2.08>, else if <... is not A 6.27909> and <... is B 2.75163> ... then <... is A ...>.]

Figure 6. Two weighted exemplar rules extracted from a masked FuNN/2.

[Figure 7 appears here: the same rules in simple form, e.g. if <... is A> and <... is B> then <... is C>, else if <... is B> and <... is C> then <... is A>.]

Figure 7. Two simple exemplar fuzzy rules extracted from a masked FuNN/2.
The rule insertion and rule extraction techniques show that this is a promising approach to building adaptive intelligent information processing systems, one which suits many application areas. These include adaptive signal processing (particularly speech recognition), adaptive control, time-series prediction, data mining and knowledge acquisition, and image processing.

Further development of FuNN is planned in the following directions: alternative structures of FuNN, with the use of radial basis functions and of alternative membership functions with a larger overlap between them; membership function adding, as well as pruning; learning with forgetting; optimising the choice of a FuNN structure through the use of a GA; and the development of FuNN-based systems for adaptive speech recognition, adaptive control, data mining and the prediction of time series, including chaotic processes, as well as modular FuNN systems for image analysis and object recognition, as discussed in [14].
Acknowledgments

This research is partly supported by research grant UOO 606, funded by the Public Good Science Fund of the Foundation for Research, Science and Technology (FRST) in New Zealand. The following colleagues took part in the discussions during the design and testing of FuNN/2: Dr Martin Purvis, Dr Feng Zhang, Dr Robert Kozma, Richard Kilgour, Brendan Hallet and Steve Israel. The implementation of FuNN/2 for MS Windows is available free from the WWW site: http://divcom.otago.ac.nz:800/COM/INFOSCI/KEL/fuzzycop.htm.
References

[1] Yamakawa, T., Kusanagi, H., Uchino, E. and Miki, T., "A New Effective Algorithm for Neo Fuzzy Neuron Model", in: Proceedings of the Fifth IFSA World Congress, (1993) 1017-1020.

[2] Hashiyama, T., Furuhashi, T., Uchikawa, Y., "A Decision Making Model Using a Fuzzy Neural Network", in: Proceedings of the 2nd International Conference on Fuzzy Logic & Neural Networks, Iizuka, Japan, (1992) 1057-1060.

[3] Kasabov, N., "Learning fuzzy rules and approximate reasoning in fuzzy neural networks and hybrid systems", Fuzzy Sets and Systems 82 (2), 1996, 135-149.

[4] Kasabov, N., Foundations of Neural Networks, Fuzzy Systems and Knowledge Engineering, The MIT Press, Cambridge, MA, 1996.

[5] Kasabov, N., "Adaptable connectionist production systems", Neurocomputing 13 (2-4), 1996, 95-117.

[6] Hauptmann, W., Heesche, K., "A Neural Net Topology for Bidirectional Fuzzy-Neuro Transformation", in: Proceedings of the FUZZ-IEEE/IFES, Yokohama, Japan, (1995) 1511-1518.

[7] Jang, R., "ANFIS: adaptive network-based fuzzy inference system", IEEE Trans. on Syst., Man, and Cybernetics, 23(3), May-June 1993, 665-685.

[8] Lin, C.-T., Lin, C.-J., Lee, C.T., "Fuzzy adaptive learning control network with on-line neural learning", Fuzzy Sets and Systems 71 (1), 1995, 25-45.

[9] Goldberg, D., Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, 1989.

[10] Mang, G., Lan, H., Zhang, "A Genetic-based Method of Generating Fuzzy Rules and Membership Functions by Learning from Examples", in: Proceedings of the International Conference on Neural Information Processing (ICONIP'95), Volume One, 1995, 335-338.

[11] Kasabov, N., "Hybrid Connectionist Fuzzy Production Systems - Towards Building Comprehensive AI", Intelligent Automation and Soft Computing, 1:4 (1995) 351-360.

[12] Carpenter, G., "Distributed Learning, Recognition, and Prediction by ART and ARTMAP Neural Networks", in: Proceedings of ICNN'96, IEEE Press, Plenary, Panel and Special Sessions volume, 1996, 244-249.

[13] Kasabov, N., "Investigating the adaptation and forgetting in fuzzy neural networks by using the method of training and zeroing", in: Proceedings of the International Conference on Neural Networks ICNN'96, Plenary, Panel and Special Sessions volume, 1996, 118-123.

[14] Kasabov, N., "Neuro-Fuzzy Engineering for Building Intelligent Adaptive Information Systems", in: L. Reznik, V. Dimitrov, J. Kacprzyk (eds) ...
Appendix A. The Functions of the FuNN/2 Layers

In the following, a superscript indicates the layer to which a node value belongs in the network.

Layer 1 (Input Layer): the nodes transmit the crisp input values directly to the next layer, without any modification: O1_i = x_i.

Layer 2 (Condition Element Layer): the triangular membership functions act as the fuzzifier. Each node represents one particular membership function, and the input weight of the node represents the center of this triangle-shaped function. The activation of a node is the membership degree of the given input value to its function. For a given input, only the two neighbouring membership functions whose centers surround the value are activated simultaneously, and their activation grades always sum to 1; in the case of the first and the last membership functions a shoulder is used instead, so that values beyond the end centers receive the maximum degree of 1. For an input value x (a crisp value) between two neighbouring centers C_j and C_{j+1}, the activations are determined as:

    Act2_j = (C_{j+1} - x) / (C_{j+1} - C_j),    Act2_{j+1} = (x - C_j) / (C_{j+1} - C_j).

Layer 3 (Rule Layer): the net input of a rule node j is the weighted sum of the outputs of the condition element nodes, and the activation is the sigmoidal logistic function with a gain factor g:

    Net3_j = sum_i ( w_ij * O2_i ),    Act3_j = 1 / (1 + e^(-g * Net3_j)).

Layer 4 (Action Element Layer): the nodes in this layer are named after the fuzzy labels of the output variables. The same summation function and the same sigmoidal activation function, with its own gain coefficient, are used as in the rule layer.

Layer 5 (Output Layer): a modified Centre of Gravity (COG) defuzzification is used to produce the crisp output value. With C_k denoting the singleton centers attached as the output connection weights:

    y = ( sum_k Act4_k * C_k ) / ( sum_k Act4_k ).
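Putting the layer functions together, the following Python sketch implements a forward pass through the five layers for a single-input toy network; the weights here are arbitrary illustrations, not values from a trained FuNN/2:

    import math

    def fuzzify(x, centers):
        """Layers 1-2: crisp input to triangular membership degrees."""
        d = [0.0] * len(centers)
        if x <= centers[0]:
            d[0] = 1.0
        elif x >= centers[-1]:
            d[-1] = 1.0
        else:
            for j in range(len(centers) - 1):
                if centers[j] <= x <= centers[j + 1]:
                    span = centers[j + 1] - centers[j]
                    d[j] = (centers[j + 1] - x) / span
                    d[j + 1] = (x - centers[j]) / span
                    break
        return d

    def layer(inputs, weights, gain):
        """Summation followed by the sigmoidal activation with gain."""
        outs = []
        for col in range(len(weights[0])):
            net = sum(inputs[i] * weights[i][col] for i in range(len(inputs)))
            outs.append(1.0 / (1.0 + math.exp(-gain * net)))
        return outs

    def funn_forward(x, mf_centers, w_rule, w_action, singletons,
                     gain=math.log(9.0)):
        cond = fuzzify(x, mf_centers)          # layers 1-2
        rule = layer(cond, w_rule, gain)       # layer 3
        act = layer(rule, w_action, gain)      # layer 4
        return sum(a * c for a, c in zip(act, singletons)) / sum(act)  # layer 5

    # Toy network: one input with 3 labels, 2 rule nodes, 2 output labels.
    y = funn_forward(0.3,
                     mf_centers=[0.0, 0.5, 1.0],
                     w_rule=[[0.8, -0.5], [0.2, 0.9], [-0.7, 0.4]],
                     w_action=[[1.0, -1.0], [-1.0, 1.0]],
                     singletons=[0.0, 1.0])
    print(y)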