Verifying Fuzzy Domain Theories Using ...

Hahn-Ming Lee*, Jyh-Ming Chen, and En-Chieh Chang
Department of Electronic Engineering
National Taiwan Institute of Technology
Taipei, TAIWAN
E-mail: hmlee@et.ntit.edu.tw
Abstract
In this paper, a fuzzy neural network model, named Knowledge-Based Neural Network with Trapezoid Fuzzy Set (KBNN/TFS), that processes trapezoid fuzzy inputs is proposed. In addition to fuzzy rule revision, the model is capable of fuzzy rule verification and generation. To facilitate the processing of fuzzy information, the LR-fuzzy interval is employed. Imperfect domain theories can be directly translated into the KBNN/TFS structure and then revised by neural learning. A consistency checking algorithm is proposed for verifying the initial knowledge and the revised fuzzy rules. The algorithm is aimed at finding the redundant rules, conflicting rules, and subsumed rules in a fuzzy rule base. We show the workings of the proposed model on a Knowledge Base Evaluator (KBE). The results show that the proposed algorithm can detect the inconsistencies in KBNN/TFS. By removing the inconsistencies and applying a rule insertion mechanism, the results are greatly improved. Besides, a consistent fuzzy rule base is obtained.

1. Introduction

In this paper, we extend our previous work [9,10] with the ability of processing trapezoid inputs. The new model is named Knowledge-Based Neural Network with Trapezoid Fuzzy Set, KBNN/TFS in short. In addition to fuzzy rule revision, the new model is capable of fuzzy rule verification and generation. In a symbolic rule base, rule verification can be conducted by pattern matching between the premises and goal clauses [1,11,13]. On a knowledge-based neural network with symbolic inputs, such as [3,4,6], clustering of weight vectors and heuristics are often used to prevent generating inconsistent rules. In [6], for checking redundancy, each antecedent of a rule is examined to see if it can be removed from the rule. In [3], clustering of weight vectors is used to avoid generating redundant rules. Besides, a consistent-shift algorithm is used for detecting the inconsistent connections in the neural network. In [4], three KT heuristics are used for removing the inconsistencies from the neural network. For fuzzy rule verification, we propose a fuzzy rule clustering method to find the inconsistencies, which include redundant rules, conflicting rules, and subsumed rules, in a fuzzy rule base. The rule generation translates the KBNN/TFS structure and fuzzy weights into fuzzy rules with certainty factors.
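As a rough illustration of the three kinds of inconsistencies named above (not the paper's own algorithm), the following sketch checks a small symbolic rule base pairwise. The encoding of a rule as a set of (attribute, value) conditions plus a conclusion is our own assumption for the example:

```python
# Illustrative sketch: detecting redundant, conflicting, and subsumed rules.
# A rule is (premise, conclusion); a premise is a frozenset of
# (attribute, value) conditions. This encoding is assumed for the example.

def check_rules(rules):
    """Return (redundant, conflicting, subsumed) lists of index pairs."""
    redundant, conflicting, subsumed = [], [], []
    for i in range(len(rules)):
        for j in range(i + 1, len(rules)):
            p1, c1 = rules[i]
            p2, c2 = rules[j]
            if p1 == p2:
                # Same premise: redundant if conclusions agree, else conflicting.
                (redundant if c1 == c2 else conflicting).append((i, j))
            elif p1 < p2:
                # Rule i's conditions are a proper subset of rule j's:
                # rule j (more conditions) is subsumed by rule i.
                subsumed.append((j, i))
            elif p2 < p1:
                subsumed.append((i, j))
    return redundant, conflicting, subsumed

rules = [
    (frozenset({("PAYOFF", "Under-1"), ("TYPE", "Useful")}), ("WORTH", "Negative")),
    (frozenset({("PAYOFF", "Under-1"), ("TYPE", "Useful")}), ("WORTH", "Moderate")),
    (frozenset({("PAYOFF", "Under-1")}), ("WORTH", "Negative")),
]

print(check_rules(rules))  # → ([], [(0, 1)], [(0, 2), (1, 2)])
```

Here rules 0 and 1 conflict (same premise, different conclusions), and rules 0 and 1 are each subsumed by the shorter rule 2.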
* Correspondence to this author.

0-7803-3645-3/96 $5.00 © 1996 IEEE

2. A Knowledge-Based Neural Network with Trapezoid Fuzzy Set
In KBNN/TFS, S-neurons, G-neurons, and fuzzy weights are used to form fuzzy rules. S-neurons calculate the firing degree of fuzzy rules, whereas G-neurons derive the conclusions. Each input connection of an S-neuron represents a condition of a fuzzy rule. Hence, a fuzzy rule's premise part is composed of all input connections of an S-neuron. For each rule's conclusion variable, a G-neuron is used to represent it. For example, assume the existent fuzzy rules are listed as follows:

Rule 1: If PAYOFF is Under-1 and TYPE is Useful, then WORTH is Negative.
Rule 2: If PAYOFF is Over-3, then WORTH is Moderate.
Rule 3: If WORTH is Negative and EMPLOYEE ACCEPTANCE is Positive and SOLUTION AVAILABLE is None and EASIER SOLUTION is None and TEACHABILITY is Frequent and RISK is Low, then SUITABILITY is Good.
Rule 4: If WORTH is High, then SUITABILITY is Poor.

Fig. 1 shows the initial structure of KBNN/TFS generated by the fuzzy rules listed above, where PAYOFF, TYPE, EMPLOYEE ACCEPTANCE, SOLUTION AVAILABLE, EASIER SOLUTION, TEACHABILITY, and RISK are input variables, and SUITABILITY is the output. WORTH is the hidden conclusion. In these fuzzy rules, many linguistic fuzzy terms are used for each variable, such as Useful, Difficult, High, ..., etc. They are applied to initialize the connection weights of KBNN/TFS. For the learning algorithm, the readers may refer to [10].

3. Fuzzy rule verification and refinement

Before a knowledge base can produce satisfactory results, the initial knowledge should undergo a number of refining processes [7,12]. A knowledge-based neural network with learning ability is suitable for the task [2]. By combining the initial knowledge and neural learning from empirical data, the performance of the knowledge base can be greatly improved [3,5,14].

3.1 Rule verification

To make sure the extracted rules are consistent, a checking methodology is proposed. In [13], the process of rule verification in symbolic rules is explained as removing the inconsistencies and incompleteness from the rule base. The inconsistencies may include:

- conflict rules: two rules have the same premise but contradict in their conclusions.
- redundant rules: two rules have the same premise and the same conclusion.
- subsumed rules: two rules both fire in the presence of particular inputs, but one contains more conditions than the other. We say the rule with more conditions is subsumed by the other.

Incompleteness in the rule base consists of:

- missing rules: no rule will give the desired result in some cases. It means that there are missing rules for covering these cases.

The definitions illustrated above slightly differ from those in other papers. Some papers, for example, discussed the problem of circular rules. Because KBNN/TFS uses a feed-forward structure, there is no link from output nodes to input nodes. Thus, circular rules do not exist in KBNN/TFS.

3.2 Consistency checking

For checking the inconsistencies in a fuzzy rule base, we check the similarities between the premises. We can
illustrate the clustering procedure as follows:
1. Set rule R1 to be the center of rule cluster GC1.
2. For each rule R, find the group whose center GCj is closest to R. That is, S(R, GCj) has the largest value among S(R, GCi), i = 1, 2, ..., Ng, where Ng is the number of groups.
3. If S(R, GCj) is greater than a predefined threshold, then R belongs to group j. Otherwise, form a new group for R and go to step 2.
4. Add R to group j.
5. If dim(R) = dim(GCj), i.e., the two rules have the same attributes, suppose GCj and R contain n attributes. They can be expressed as: [equation omitted]. As we mentioned above, in computing the similarity between rules, we consider the premises of rules only. The attn and propn are the nth attribute of R and its corresponding property, which is represented as an LR-type fuzzy number. In this case, GCj needs to be adjusted. The new center of the rule cluster is [equation omitted], where Nr is the number of rules in rule cluster j.
6. If dim(R) ...

Removing the inconsistent rules can help the convergence of neural learning. However, due to the lack of nodes and connections to represent the rule base, the performance of the network may get stuck. It is the same situation as missing rules in symbolic rule bases. To overcome this, when the performance gets stuck, an insertion mechanism is applied.

Intuitively, the deletion and insertion of rules can be combined. Upon finding the inconsistent rules, we set the values of their premises to unknown instead of deleting them. We use (0.5, 0.5, 0, 0) as the value of unknown. In this way, the neural network can learn new rules. However, if there is no inconsistency or the performance is still poor after the above-mentioned rule insertion, a more analytic method can be applied. The idea is to insert rules to cover the data for which the network cannot correctly derive the desired outputs. Thus, in the inserted rules, the attributes and the output concepts of the data which render large errors are included. The fuzzy weights (values) of these newly inserted rules are set to unknown, and the revision algorithm learns them empirically.

4. Experimental results

To evaluate the performance of the proposed model, we implement the Knowledge Base Evaluator (KBE) [8] in KBNN/TFS. KBE is an expert system that can evaluate whether an expert system is suitable for a company or not. Eight attributes, PAYOFF, PERCENT SOLUTION, TYPE, ...
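The clustering procedure of Section 3.2 can be sketched as follows. This is an illustrative reading only: the similarity measure S and the center-update formula are not reproduced from the paper, so a generic premise-overlap similarity stands in for S, and the center update of step 5 is simplified:

```python
# Illustrative sketch of the rule-clustering procedure (Section 3.2).
# Assumptions: `similarity` is any premise-similarity measure S in [0, 1];
# a rule premise is a frozenset of (attribute, value) conditions.

def cluster_rules(rules, similarity, threshold):
    """Group rules by premise similarity; the first rule of a group is its center."""
    if not rules:
        return []
    groups = [[rules[0]]]          # step 1: the first rule seeds cluster GC1
    centers = [rules[0]]
    for r in rules[1:]:
        # step 2: find the group whose center is most similar to r
        scores = [similarity(r, c) for c in centers]
        j = max(range(len(centers)), key=scores.__getitem__)
        if scores[j] > threshold:  # steps 3-4: join the closest group
            groups[j].append(r)
            # step 5 (simplified): the paper re-derives the cluster center here
        else:                      # otherwise, start a new group seeded by r
            groups.append([r])
            centers.append(r)
    return groups

# Toy similarity: fraction of shared (attribute, value) conditions.
def sim(r1, r2):
    return len(r1 & r2) / max(len(r1 | r2), 1)

rules = [frozenset({("PAYOFF", "Under-1"), ("TYPE", "Useful")}),
         frozenset({("PAYOFF", "Under-1")}),
         frozenset({("RISK", "Low")})]
print([len(g) for g in cluster_rules(rules, sim, 0.4)])  # → [2, 1]
```

Rules that land in the same cluster have highly similar premises and are thus the candidates for the redundancy, conflict, and subsumption checks of Section 3.1.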