VALENS: A Knowledge Based Tool to Validate and Verify an Aion Knowledge Base
Drs. Silvie Spreeuwenberg, Ing. Rik Gerrits, Drs. Margherita Boekenoogen
LibRT Redeployment Technology, PO Box 90359, 1006 BJ Amsterdam, Netherlands. Email: [email protected]
Abstract. We present VALENS (VALid Engineering Support) for the validation and verification (V&V) of a knowledge base (KB). Validation techniques become increasingly important as knowledge-based systems (KBS) are widely used to automate business-critical processes. The tool we present may be used during and after the development of a KBS and focuses on logical verification of the KBS. The techniques used to verify a knowledge base are meta-rules, an inference engine to verify hypotheses posed by the meta-rules (proof-by-processing), and meta information (provided by the user). VALENS is written in the same language as the knowledge bases it verifies: Aion.
1. PROBLEM DESCRIPTION
In our everyday practice we encountered the following situation: a knowledge-based system (KBS) is used to determine the type of a concept. The KBS was tested and correct until a new rule was added to determine a new type. The results were completely different from what was expected, and it took some time to find out that a form of circularity in the rules caused the problem. This is what could be called the butterfly effect in a KBS. Validation and verification (V&V) techniques offer a way to avoid this kind of problem. These ideas are incorporated into VALENS, a tool that applies validation and verification techniques to Aion knowledge bases.
1.1 What's the need?
An iterative development cycle is preferred for developing knowledge-based (KB) systems [1]. Decreasing the number of cycles between analysis, implementation and test decreases development time. Early detection of faults, using V&V to detect inconsistencies and incompleteness, can contribute to this reduction. Furthermore, assuring the quality of a KBS is becoming an increasingly important challenge as KB components are more and more often embedded within safety-critical or business-critical applications [2]. During maintenance, a change in just one knowledge rule may introduce contradictions, redundancy or incompleteness in a complex chain of knowledge rules. Especially when people without a background in system programming or system analysis define and maintain the knowledge in a KBS, the support of a V&V tool helps them to cope with the complexity.

In all the main phases of the knowledge engineering life cycle, validation and verification is an important aspect when it comes to delivering a high-quality KBS. In the knowledge analysis phase the analyst validates the knowledge rules with an expert. In the implementation phase the logical correctness of the implemented rules is checked, and in the test phase test cases are used to validate the knowledge base. In each phase of this cycle, tools can be used to support the validation and verification process. For example:
– Data mining can be used to support the knowledge elicitation process by automatic rule generation. This data mining technique uses the knowledge in databases to validate expert knowledge by searching for evidence based on available data [3].
– Automatically checking a knowledge base for inconsistent, redundant or incomplete knowledge during development of a KBS.
– Automatic generation of test cases for a KB.

These tools will increase the quality and reduce the 'time to market' of a KBS. This will be an important factor when considering the use of knowledge-based technologies for automating business-critical processes. In this article we focus on automatic support of validation and verification in the implementation phase of the knowledge engineering process.
1.2 Definitions of checks
When verifying the logical correctness of a knowledge base, it is checked whether the rules in the knowledge base are consistent, complete, not redundant and not obsolete. These checks may cover all rules in the knowledge base or a subset of them. The control structure present in most knowledge-based systems facilitates the dynamic posting of rules or rule sets; if this is the case, these subsets have to be verified in isolation. The following paragraphs discuss the checks used to verify a knowledge base; they are a high-level description of the taxonomy of anomalies of A. Preece [6].
1.2.1 Consistency²

To verify consistency, one must demonstrate that for all inputs the knowledge base produces a consistent set of conclusions, i.e., that for each set of possible inputs all the conclusions can be true at the same time.

² This is the terminology used in the application. Our users are more familiar with the terms "consistent", "contradiction" and "incomplete" than with the terms used in the article of A. Preece and many other scientific papers.
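As a toy illustration (a Python sketch with an invented rule representation, not VALENS itself; the two rules echo the examples of figure 1), a consistency anomaly exists when one admissible input makes two rules fire with conflicting conclusions:

    def older_than_15(case):
        # "clients must be older than 15"
        return ("assigned", True) if case["age"] > 15 else None

    def language_courses_for_young_people(case):
        # "language courses for young people"
        return ("assigned", False) if case["age"] > 20 and case["deficit"] == "language" else None

    case = {"age": 25, "deficit": "language"}                    # one admissible input
    fired = {r(case) for r in (older_than_15, language_courses_for_young_people) if r(case)}
    print(fired)   # {('assigned', True), ('assigned', False)}: not all conclusions can hold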
1.2.2 Circularity
To verify circularity, one must demonstrate that, for every attribute that is written by a rule, the chain of rules used to derive that attribute does not contain duplicate rules.
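A minimal Python sketch of the idea, using invented rules and a dependency representation that is not the one used by VALENS: two rules that each derive their conclusion from the other's conclusion make the derivation chain of either attribute contain a duplicate rule.

    # r1 derives a from b, r2 derives b from a: the chain for either attribute repeats a rule.
    rules = {"r1": ("a", ["b"]),          # rule name: (derived attribute, attributes used)
             "r2": ("b", ["a"])}

    def chain_for(attr, seen=()):
        """Follow the (first) rule that derives attr; a repeated rule means circularity."""
        for name, (derived, uses) in rules.items():
            if derived == attr:
                if name in seen:
                    return list(seen) + [name]        # duplicate rule -> circular
                return chain_for(uses[0], seen + (name,))
        return list(seen)

    print(chain_for("a"))    # ['r1', 'r2', 'r1']: rule r1 occurs twice in its own chain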
1.2.3 Completeness
To verify completeness one must demonstrate that for all inputs, the knowledge base produces some conclusion and all inputs are used.
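Continuing the illustrative Python sketches (invented names and values), an incompleteness check can be pictured as enumerating the declared inputs and flagging both input combinations for which no rule fires and declared inputs that no rule references:

    from itertools import product

    declared_inputs = {"age": [10, 25], "deficit": ["language", "none"]}   # sampled input values

    def only_rule(case):                     # the single rule of this toy knowledge base
        return ("assigned", True) if case["age"] > 15 else None

    cases = [dict(zip(declared_inputs, values))
             for values in product(*declared_inputs.values())]
    uncovered = [case for case in cases if only_rule(case) is None]
    print(uncovered)   # the age=10 cases reach no conclusion at all: rules are missing
    # A second form of incompleteness: the declared input "deficit" is never
    # referenced by any rule.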
1.2.4 Redundancy
A knowledge base is redundant when the same knowledge is represented in different places or there exists knowledge that does not contribute to the output of the system (but can in fact be true knowledge).
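A toy Python illustration of the first kind of redundancy (invented rules): one rule's condition is subsumed by another rule with the same conclusion, so removing it never changes the output of the system.

    def r1(case):
        return ("assigned", True) if case["age"] > 15 else None

    def r2(case):   # same conclusion, strictly stronger condition: r2 never adds anything
        return ("assigned", True) if case["age"] > 15 and case["deficit"] == "none" else None

    for age in (10, 16, 30):
        for deficit in ("none", "language"):
            case = {"age": age, "deficit": deficit}
            assert r2(case) is None or r1(case) == r2(case)
    print("r2 is subsumed by r1 on every sampled input, so r2 is redundant")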
2. APPLICATION DESCRIPTION

VALENS can be used by a developer during or after construction of a knowledge base, or it can be integrated in a tool that allows users to write their own business rules. The output of the tool is a document in which all invalid rules (and rule combinations) detected are reported. Each fault is classified and explained. In the next paragraphs, the main algorithm and the benefits of our approach to validation and verification are briefly discussed. Next, the development environment and the tool itself are described.

2.1 Verification algorithm

The verification algorithm that VALENS uses performs three main steps:

a) Construction of the meta model. In this step all rule constructs necessary to reason about the rules in the knowledge base are instantiated. This step is performed on a 'when needed' basis to reduce performance overhead.

b) Selection of potential anomalies. Potential anomalies are selected with the use of heuristics. These heuristics were designed as rules but are implemented as procedures for performance reasons.

c) Proof of anomalies. The potential anomalies selected in the previous step are proved with the use of the inference engine.

The next paragraphs describe the main concepts used in the three steps of the algorithm. For a more detailed description of the algorithm used in VALENS to detect anomalies, the reader is referred to Gerrits and Spreeuwenberg [12].

2.1.1 Meta model

A meta model of the rule base to be verified is constructed. The meta model is object-oriented and represents the rules found in a knowledge base. The rules are parsed and divided into conditions and conclusions; conditions are divided into clauses. Meta information is generated from the rules in the knowledge base and stored in the meta model. This meta information includes the defined input and output of the knowledge base, the maximum and minimum values of variables and the values of an attribute. This information about attributes is used to check completeness and redundancy.

2.1.2 Meta rules

Meta-rules work on the meta model and define the checks performed by the tool. We distinguish two classes of meta-rules in our approach:

– Meta-rules that select a thesis which has to be proved to detect a fault.
– Meta-rules that define a fault directly in terms of the meta model.

The first class of meta-rules is used to detect faults that may be hidden in a chain of logic; these meta-rules are used to find all kinds of contradictions, circularity and redundancy. The second class of meta-rules is used to detect other faults in rules, for example incomplete knowledge. The conditions of the meta-rules that detect faults in a chain of logic generally contain two parts: (1) the first part selects potentially invalid rules (this selection is implicitly defined in the condition of the meta-rule); (2) this leads to a hypothesis that is tested in the second part of the rule with the use of the inference engine (see next paragraph).
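A minimal Python sketch of these two ingredients, much simpler than the Aion implementation and with invented names: the meta model stores each rule as parsed condition clauses plus a conclusion, and a meta-rule of the first class selects as a thesis every pair of rules that write different values to the same attribute, leaving the actual proof to the inference engine.

    from dataclasses import dataclass, field
    from itertools import combinations

    @dataclass
    class MetaRule:                 # one parsed rule in the meta model
        name: str
        clauses: list               # condition clauses, as predicates over the facts
        concludes: tuple            # (attribute, value) written by the rule

    @dataclass
    class MetaModel:
        rules: list = field(default_factory=list)
        inputs: set = field(default_factory=set)    # declared input attributes
        outputs: set = field(default_factory=set)   # declared output attributes

    def select_contradiction_theses(model):
        """First-class meta-rule (as a procedure): any two rules that conclude
        different values for the same attribute form a thesis that still has to
        be proved by running the rules (proof-by-processing)."""
        for r1, r2 in combinations(model.rules, 2):
            if r1.concludes[0] == r2.concludes[0] and r1.concludes[1] != r2.concludes[1]:
                yield (r1, r2)

    model = MetaModel(rules=[
        MetaRule("older_than_15", [lambda f: f["age"] > 15], ("assigned", True)),
        MetaRule("language_courses", [lambda f: f["age"] > 20,
                                      lambda f: f["deficit"] == "language"], ("assigned", False)),
    ])
    theses = list(select_contradiction_theses(model))   # one potential contradiction found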
2.1.3 Proof-by-processing
When the hypothesis succeeds, the conclusion that the rule is invalid is reached. The theses (potentially invalid rules) are proved by running the rules to be tested in forward-chaining mode, while providing them with the right truth values (input). After the forward-chaining process, the results of the inference engine are analysed. If the thesis indeed succeeds, the chain of rules that causes the rule to fire is returned. This process resembles the 'saturation' process used by Nouira and Fouet [7]. Nouira and Fouet generate the largest possible number of properties for an object that would certainly appear during a real execution and then start forward chaining, during which they try to fire constraints. In VALENS the heuristics defined in the meta-rules search for the smallest number of properties for an object for which the potential anomaly can be proved. VALENS does not use explicit constraints: the knowledge contained in the constraints of Nouira and Fouet is incorporated in the meta-rules of VALENS. In contrast to the meta-rules of VALENS, the constraints of Nouira and Fouet contain semantic knowledge. Semantic meta-rules are foreseen but not yet implemented in VALENS (see section 3.3, Future enhancements).
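The following Python fragment sketches this step under the same hypothetical representation (reusing the MetaRule objects of the previous sketch): the rules are run in a small forward-chaining loop on a constructed input, the thesis is accepted only when both conflicting rules actually fire, and the recorded firing order is the rule chain that is returned as explanation.

    def forward_chain(rules, facts):
        """Tiny forward chainer: fire each rule at most once whenever all of its
        condition clauses hold on the current facts; record the firing order."""
        chain, fired, changed = [], set(), True
        while changed:
            changed = False
            for rule in rules:
                if rule.name in fired:
                    continue
                if all(clause(facts) for clause in rule.clauses):
                    attribute, value = rule.concludes
                    facts[attribute] = value
                    fired.add(rule.name)
                    chain.append(rule.name)
                    changed = True
        return facts, chain

    def prove_thesis(thesis, candidate_input, rules):
        """Proof-by-processing: the potential contradiction is proved when both
        rules of the thesis fire during an actual run on one candidate input."""
        _, chain = forward_chain(rules, dict(candidate_input))
        r1, r2 = thesis
        return (r1.name in chain and r2.name in chain), chain

    # Using the model and theses from the previous sketch, one could now call:
    # proved, chain = prove_thesis(theses[0], {"age": 25, "deficit": "language"}, model.rules)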
2.1.4 Benefits of Proof-by-processing
The proof-by-processing algorithm has some benefits in comparison to algorithms based on formal logic. The most important benefit is that proof-by-processing allows us to deal with predicate logic (i.e., functions are allowed in rules). The functions used in the rules may contain procedural algorithms. An example of the use of functions in rules is given in the result window in figure 3. The rule "clients must be older than 15" uses the function, or method (as it is called in an OO environment), "Age". This method returns the number of years between the current date and the attribute "Birthdate" of the class "Relation". This is not a predefined method: the user may design any method and use it in rules. VALENS finds which attributes are used in the method (by recursively assessing the method and the methods it calls) and executes the method during the proof-by-processing algorithm. The effect of executing the method on the evaluation of the rules is therefore taken into account. A second important benefit of the proof-by-processing algorithm is that we can work in an object-oriented (OO) environment. As seen in figure 3, the rules work on attributes of the current instance, and these attributes can be inherited. Figure 1 shows the UML class diagram of the knowledge and domain concepts that appear in figure 3. The rules are modelled in methods and can use a very rich OO language, shown in the notation elements on the right-hand side of figure 1. A third and very important benefit of proof-by-processing is that there can be no discrepancy between the run-time logic and the logic used in the validation process, because the inference engine used in the verification process is the same as the inference engine used to evaluate and fire the knowledge rules in the application.
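A hypothetical Python analogue of the 'Age' example (simplified names, not the Aion code): because proof-by-processing executes the rules rather than analysing them symbolically, a user-defined method called in a rule condition is simply evaluated on the instance under test.

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class Relation:                          # simplified version of the class in figure 1
        birthdate: date
        assigned: bool = False

        def age(self, today: date) -> int:
            """User-defined method: the verifier never inspects its internals,
            it just executes it while the rules are being processed."""
            years = today.year - self.birthdate.year
            if (today.month, today.day) < (self.birthdate.month, self.birthdate.day):
                years -= 1
            return years

    def rule_clients_must_be_older_than_15(current: Relation, today: date) -> None:
        # "clients must be older than 15": the condition calls the method age()
        if current.age(today) > 15:
            current.assigned = True

    # During proof-by-processing the rule is simply run on a constructed instance,
    # so the effect of the method call is taken into account automatically.
    client = Relation(birthdate=date(1980, 1, 1))
    rule_clients_must_be_older_than_15(client, today=date(2000, 6, 1))
    print(client.assigned)    # True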
2.2 V&V in Aion
The V&V tool VALENS is built in, and made for, Aion 8 (short: Aion) applications. Aion is a product and trademark of Computer Associates and is a new version of AionDS. Aion is a widely used commercial development environment for rule-based systems and intelligent components. Some characteristics are:
– The inference engine supports rule and decision-table processing in backward chaining, forward chaining or recursive forward chaining mode.
– The programming language is object-oriented.
– Meta-programming features enable a programmer to obtain information about the state of the inference engine.
– The Callable Object Building System (COBS) feature allows one to automate all the functions a developer can use in Aion.

Because Aion is cross-platform, cross-database, and cross-middleware, organizations have tremendous flexibility in deploying these applications. Typical domains in which Aion applications are used are fraud detection, claim assessment and product configuration tasks. At the moment Aion does not include any rule verification or validation facilities, but customers who maintain Aion rule bases with more than 100 rules have asked for them. Integration in the Aion development environment is one of the future directions of VALENS.
2.3 The tool VALENS
The validation and verification application consists of three components: a user interface, the verification engine and a reporting component. The relationship between these components is given in the context diagram of figure 2.
[Figure 1. UML class diagram of the domain classes Relation (attributes Birthdate, Name; method Age), Client and Teacher, whose methods (e.g. Intake, AssignClient, DefineFocus, DeriveExpertise, DeriveExperience) post rule sets. Example rules from the method body AssignClient: rule "clients must be older than 15": ifrule current.age > 15 then current.assigned = true end; rule "language courses for young people": ifrule current.age > 20 and current.deficit = "language" then current.assigned = false end. Example method body Intake: infer AssignClient DefineFocus backwardchain(->current.Assigned) end.]

[Figure 2. Context diagram of the V&V tool: the user of VALENS works with the VALENS user interface; the verification mechanism reads the source code of the Aion rule-based application and passes its results to the report generator.]
The following steps are performed to verify a knowledge base:

1. Open and read an Aion application to create all the necessary objects in the meta model.
   1.1 Select the knowledge base to be verified.
   1.2 Select the rule sets in the knowledge base to be verified.
2. Start the analysis process.
   2.1 Start the forward-chaining process and execute the meta-rules.
      2.1.1 Select potential 'invalid' rules through meta-rules.
      2.1.2 Prepare the KB to verify the selected rule(s).
      2.1.3 Run the KB and 'catch' the results.
   2.2 Analyse the results.
3. Report the results.

The user selects the knowledge base and the rule sets within that knowledge base that need to be verified. The COBS functionality of Aion is used to open and read the Aion application. When potential 'invalid rules' are detected during the verification process, COBS is used again to start the Aion application in forward-chaining mode to test the thesis. The meta-programming features of Aion are used to capture the results of the inference engine, both for analysing whether a thesis is satisfied and for catching the chain of logic that caused a thesis to be satisfied. Invalid rules are reported in an HTML document and in a result window (containing the same information). Each fault is classified and explained as shown in figure 3, which shows the result tab for contradictory rules.

[Figure 3. Result window of VALENS.]

The result window shows a general explanation of the anomaly and the conflict that is detected. A conflict is defined in [13] as a minimal set of rules, possibly associated with an input fact set, that is a sufficient condition to prove an anomaly. The rules that have been detected as 'contradictory' are shown in the list on the upper left-hand side of the window. When a rule is selected, the rule chain (the set of rules that caused the rule that is contradictory to the selected rule to fire) is shown in the list at the lower left-hand side of the window. The rule that is contradictory to the selected rule has the fault icon (thunderbolt sign) in this list. When a rule in the rule chain is selected, the rule text, rule premise and rule action are shown in the text boxes. The list in the middle shows the truth values (input fact set) under which the particular contradiction is proved. Groups of knowledge rules and the status of faults found during verification and validation can be stored in a database to allow regression testing.
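The last point could be organised roughly as in the following Python sketch; the schema and function names are assumptions for illustration, not the actual VALENS storage. Each verification run records the faults found for a group of rules, so that a later run can be compared against the stored status.

    import sqlite3

    def open_fault_store(path="valens_faults.db"):
        # Hypothetical fault store: one row per fault, per rule group, per run.
        con = sqlite3.connect(path)
        con.execute("""CREATE TABLE IF NOT EXISTS fault
                       (rule_group TEXT, fault_type TEXT, rules TEXT,
                        status TEXT, run_id TEXT)""")
        return con

    def record_faults(con, run_id, rule_group, faults):
        # faults: iterable of (fault_type, rule_names, status) tuples
        con.executemany("INSERT INTO fault VALUES (?,?,?,?,?)",
                        [(rule_group, fault_type, ",".join(rule_names), status, run_id)
                         for fault_type, rule_names, status in faults])
        con.commit()

    def regression_check(con, rule_group, previous_run, current_run):
        """Faults present in the previous run but absent now, and vice versa."""
        def keys(run):
            rows = con.execute("SELECT fault_type, rules FROM fault "
                               "WHERE rule_group=? AND run_id=?", (rule_group, run))
            return set(rows.fetchall())
        old, new = keys(previous_run), keys(current_run)
        return {"resolved": old - new, "introduced": new - old}

    con = open_fault_store(":memory:")
    record_faults(con, "run-1", "intake rules", [("contradiction", ["r1", "r2"], "open")])
    record_faults(con, "run-2", "intake rules", [])
    print(regression_check(con, "intake rules", "run-1", "run-2"))
    # {'resolved': {('contradiction', 'r1,r2')}, 'introduced': set()}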
3. APPLICATION BUILDING
VALENS is developed by LibRT BV. LibRT is a young Dutch company which aims at helping customers integrate knowledge-based solutions into their own information-based solutions. We strongly feel that the art of knowledge engineering should be added to the default skills of any business analyst, application consultant or software engineer. To achieve this goal, LibRT develops products containing the knowledge of knowledge engineers, of which VALENS is an example.
3.1 Development and project team
It required about one man-year to develop VALENS. The project team consisted of the three authors of this article. They complement each other's skills and each had their own focus in the project. All three project members have a background in knowledge-based system development, and their respective focuses in the project were:
– Functional requirements and the main concepts involved in our approach to validation and verification.
– Development and coding in Aion.
– Testing and exploitation of VALENS.

The project members worked two days a week on the development of VALENS.

3.2 Exploitation phase

The exploitation phase of VALENS is divided into three stages. In the first stage we searched for collaboration with a commercial party that had already developed knowledge-based systems in Aion. We investigated the quality of their applications by using VALENS and delivered an evaluation report. The use of VALENS was free in this stage; only a couple of days of consultancy time was charged for the discussion of the evaluation report. This stage gave us the opportunity to create references, to test VALENS on real-world applications and to investigate the market for validation and verification tools.

In the second stage we will collaborate with a commercial party where we can integrate VALENS in the knowledge engineering life cycle. The use of VALENS will be free for a fixed period; development and consultancy activities are charged for the integration and implementation of VALENS in the organisation. We expect that in this phase the customer will have specific wishes regarding the user interface and validation capabilities of VALENS. This second stage gives us the opportunity to communicate with the users of VALENS about the requirements for a user interface, system integration and domain-specific validation.

In the third stage, we will have developed a generic user interface for VALENS and VALENS will be sold under a license agreement (presumably as an add-on in the Aion development environment). At this moment, we are in the second stage of the exploitation phase of VALENS.

3.3 Future enhancements

At present, LibRT is working on a further refinement of the user interface component and the reporting component. However, it is our experience that these components will always need to be adapted to the specific customer situation and needs. Further development of VALENS will involve actively supporting users in solving the faults detected and in suggesting solutions to the problem at hand. Automatic repair of faults will be supported, but only with the explicit agreement of the user: we emphasize that the end user has to be in charge of this problem-solving process. We would also like to extend VALENS with a form of semantic validation by using domain-specific meta-rules. These domain-specific meta-rules are constraints or business rules, specified and defined by a business analyst or domain expert. The defined business rules can be stored and edited from within the V&V tool and will be handled in the same way as the meta-rules described in section 2.

4. APPLICATION BENEFITS

The quality and maintainability of software is becoming an increasingly important issue for managers of large-scale IT projects. When rule-based techniques are used, it is possible to detect anomalies automatically during the development process, which can reduce development and test time and, even more importantly, increase the quality of the rule base. The risks of maintenance are reduced when the application is of high quality. Furthermore, the use of VALENS will speed up the learning curve of novice programmers with respect to the basics of good rule-base programming.

4.1 Experience with real life applications

VALENS has been tested using a knowledge base that contains a large variety of anomalies, based on the sample rule bases of A. Preece and extended with specific tests for the rich Aion rule language constructs and object-orientation concepts. (Due to confidentiality contracts with our customers we are not able to show the real results obtained on the real knowledge bases.)

While VALENS was still in development (1.0 alpha version 2), Postbank Nederland BV became interested in the promise of a verification and validation tool for their AionDS assessment knowledge base. LibRT took the challenge because the VALENS kernel was finished and ready to be tested in a real business situation. The Postbank agreed to compensate for the time spent on presentation of the results of a two-month pilot project and to evaluate the results afterwards. The target knowledge base had been written in AionDS 7 and contained approximately 250 rules. Because the current version of VALENS is developed for Aion 8 knowledge bases, the target knowledge base had to be converted to an Aion 8 knowledge base. The conversion was accomplished in a day using Aion 8's conversion support tool ORCA. Important in this respect is that VALENS only needs a 'valid rule base' to verify the rules: no GUI or other interfaces need to be converted.

LibRT got the first version of the customer's KB to verify when the development team of the Postbank had finished the rule base and the testing phase was at hand. Though VALENS can be applied earlier in the application development life cycle, it was perfect timing for our alpha version: there would be a parallel verification and testing phase, so the results of both processes could be compared. VALENS did not detect any real errors in the knowledge base. Though this might look disappointing, the testing phase did not reveal any error that could have been detected by verification either. Remember that technical or logical verification cannot take into account any domain knowledge that is not present in the knowledge base; therefore, VALENS cannot be used for user acceptance tests. VALENS did find many redundant and obsolete constructs in the knowledge base. Some of these constructs were intentional, others were not, but everyone was impressed by the fact that VALENS was able to highlight these 'points of interest'. VALENS proved to be of good use in maintaining the integrity of the functional specifications of the knowledge base and the realized (and revised!) knowledge base. Of course, VALENS detected problems that were not problems at all, but thanks to this pilot project VALENS has become more mature and robust.

After three months, the pilot project was not continued because it became clear that the target knowledge base will probably never be converted to an Aion 8 knowledge base. At present, we are starting a collaboration with a party that wants to use VALENS for the verification and validation of legal models. VALENS is used, in the first place, by the analysis and development team (approximately 5 people). The results of VALENS are discussed by domain experts who have a background in legislation. In the first results VALENS detected incomplete knowledge. The expert knowledge that was added to resolve the incompleteness resulted in the detection of a circular reasoning by VALENS, which was then again corrected by the domain experts.
4.2 Usability
In practice, VALENS proves not only to ensure a valid knowledge base, but also the validity of its documented functional specifications. When VALENS is applied to a KB, its functional specifications are, of course, the benchmark. However, if the KB is updated as a result of maintenance without the functional specifications being updated, VALENS will reveal possible inconsistencies.

Performance has been an important issue during the development of VALENS. In the last version of VALENS many performance improvements have been made by implementing the heuristics (meta-rules) in a procedural instead of a declarative way, by selecting potential anomalies more intelligently, and by loading objects on a 'when needed' basis. During the use of VALENS performance is less important, because users need not be present while the system runs; we advise the user to start VALENS after work or during lunch. Our test application contains 30 rules, and the search for redundancy, the most time-consuming check, takes less than two minutes on a Pentium II. However, this time indication says nothing about another rule base containing 30 rules: it is not the number of rules that indicates the complexity of verification but the structure of the knowledge contained in the rule base. We are not aware of a measure indicating the complexity of a rule base; such a measure is needed when a prediction of the validation time for a given rule base has to be made.

The result window shown in figure 3 is a sufficient explanation for a knowledge engineer who has an understanding of rule-based systems and an inference engine. However, the result window is not sufficient to explain the results to experts or inexperienced programmers. Therefore LibRT would like to improve the presentation of results with graphics, for example a visual dependency graph of the logical chain, and with more explanatory text.

At this moment VALENS is a stand-alone application installed on a local PC which has a valid license of Aion. The user selects the Aion source code that he wants to verify. VALENS shows the user a view on the selected rule base in which the user can select the rules to be verified and the different analyses to be performed on these rules. LibRT intends to integrate VALENS in the Aion development environment for use during the development and maintenance of an application. This will relieve the programmer from opening Aion source code in the VALENS application. VALENS can also be integrated in a domain-specific knowledge editor tool and used early in the analysis stage. One can also think of integrating VALENS in a test tool; however, we recommend starting with validation and verification techniques early in the development life cycle.
4.3 Comparison to similar systems
In the beginning of the '90s, universities devoted much attention to verification and validation of rule-based systems. A number of tools were developed to verify rule bases, of which Preece [14] has given an overview and comparison. An even more extensive overview comes from Plant [15], who lists 35 verification tools built in the period 1985-1995. Most of these systems were developed at a university and it is hard to find out what their current status is. During the last six years that we have worked as knowledge engineers building commercial rule-based systems, we never encountered a commercially available tool that is used in a modern programming environment. What happened to those systems? Some of them will still have a research status and are used to explore new research domains. For example, the COVER tool of Preece has evolved into the COVERAGE tool for verifying rule bases in a multi-agent architecture [16], and the PROLOGA tool [17] has been extended with intertabular verification [18]. But perhaps the 'boost' for verification tools failed to occur because the promise of expert systems failed in commercial environments. Another factor might be that not only business environments but also university research is driven by 'hypes' like 'knowledge mining', 'knowledge management' and 'intelligent agents', which follow each other at such a pace that there is no time to pick the fruit of the planted trees. However, the current need for higher quality, less unpredictable software development projects and better maintainable systems might change the prospects of verification tools.

The verification tools (tools that focus on validation or testing, like KVAT [22], are excluded here) can be compared on a number of criteria. We will compare some of the widely known systems on the following criteria:
1. the anomalies that are detected by the tool;
2. the language that is supported by the tool;
3. the focus and behavior of the tool in the analysis or development phase of a system.

The first criterion is formed by the anomalies that are detected by the tool. Some tools do not detect anomalies in a chain of logic, for example the Rule Checker Program (RCP) [19] and CHECK [20]. Others, like RCP, CHECK and EVA [21], do not detect missing rules and unused literals. VALENS is complete with respect to the anomalies defined by Preece [14].

The second criterion is the language that is supported by the tool. Most verification and validation systems cope with a restricted language, for example first-order predicate logic [10][11] or a formal specification language [9], as opposed to the rich language of a programming environment like Aion. There are also tools which have their own internal language defined and which, manually or automatically, translate diverse languages to that internal language. EVA is an example of a system with its own internal language; it provides a set of translation programs that translate the rule languages of some expert system tools (for example ART, OPS5 and LES) to an internal canonical form based on predicate calculus. PROLOGA works the other way around: it allows a user to create and verify decision tables and then generate code in diverse programming languages (for example Aion, Delphi and C++). COVER and VALENS work in the programming language they were developed with, which is Prolog and Aion respectively. It may be straightforward to automatically convert knowledge bases written using other rule-based expert system shells into the rule language used by the verification tool; however, if one wants to propose revisions of the rule base based on the verification results, the translation step adds extra complexity to this process. If tools are bound to a (programming) language, they are likely to be integrated in an expert system shell or, as we call it today, a development environment. VALENS will be integrated in Aion in the future. This makes the tool less generic; however, the core algorithms of VALENS are independent of the programming language (a Component Based Development (CBD) approach has been used during the project). The algorithms do pose assumptions on the 'traceability' of the inference engine: for the algorithms of VALENS to work correctly, the inference engine needs to give information about the rule-firing network after a chaining process is finished. LibRT has plans to verify Java code, in combination with a Java rule engine, in the future.

The last criterion for comparison of verification tools is their respective behavior in the analysis and development phases of a system. The work of Nouira and Fouet [7] concentrates on the analysis phase of a system but results in a valid and executable knowledge base. The work of van Harmelen [9] also concentrates on the analysis phase and validates a formal specification language; the idea is that the formal specification has to be translated to a programming language to get an executable program. VALENS is a validation and verification component. With the integration of the tool in Aion, we focus on the development and maintenance stage of an application, but its use is not restricted to this stage because the tool is developed as a component and can be integrated in an analysis support environment.
5. DISCUSSION AND CONCLUSION
We have discussed a tool, developed in and for Aion applications, that supports the verification and validation of rules. The tool can be used during development and maintenance of a knowledge-based system. Another application is the integration of our technique with a domain-specific editor to support the definition of knowledge rules by a domain expert or business analyst. The tool uses meta-rules, meta information and the inference engine of Aion to accomplish this task. By using the same inference engine in the verification process as in the execution process of the rules, there can be no discrepancy between the run-time logic and the logic used in the validation process. We think this tool will reduce the 'time to market' of knowledge-based systems, improve their quality and add value during maintenance of applications by developers, especially those who are not very familiar with the functionality of the application. A domain expert cannot be expected to have an extensive background in logic, as a developer should have. Therefore, the tool is a 'must have' when domain experts define their own knowledge rules.
REFERENCES

[1] B.W. Boehm, 1986, A Spiral Model of Software Development and Enhancement, Computer, 21, 61-72.
[2] Ed P. Andert Jr., 1992, Automated Knowledge Base Validation, AAAI Workshop on Verification and Validation of Expert Systems (July 1992).
[3] Vijay S. Mookerjee and Michael V. Mannino, 1997, Sequential Decision Models for Expert System Optimization, IEEE, 9, 675-687.
[4] James A. Wentworth, Rodger Knaus and Hamid Aougab, Verification, Validation, and Evaluation of Expert Systems, FHWA Handbook, Volume 1.
[5] Cragen and Steudel, 1987, A Decision Table Based Processor for Checking Completeness and Consistency in Rule Based Expert Systems, International Journal of Man Machine Studies, Vol. 26, 633-638.
[6] Preece and Shingal, 1994, Foundation and Application of Knowledge Base Verification, International Journal of Intelligent Systems, 9, 683-701.
[7] Rym Nouira and Jean-Marc Fouet, 1996, A Knowledge Based Tool for the Incremental Construction, Validation and Refinement of Large Knowledge Bases, Workshop Proceedings ECAI96.
[8] R. Nouira and J.M. Fouet, 1996, A Meta-Knowledge Based Tool for Detecting Invalid, Conflictual and Redundant Rules, AAAI-96, Proc. Ninth National Conference on Artificial Intelligence.
[9] F. van Harmelen, 1995, Structure Preserving Specification Languages for Knowledge Based Systems, International Journal of Human Computer Studies, Vol. 44, 187-212.
[10] Alon Y. Levy and Marie-Christine Rousset, 1996, Verification of Knowledge Bases Based on Containment Checking, Workshop Proceedings ECAI96.
[11] Z. Bendou and M. Ayel, 1996, Validation of Rule Bases Containing Constraints, Workshop Proceedings ECAI96.
[12] R. Gerrits and S. Spreeuwenberg, 1999, A Knowledge Based Tool to Validate and Verify an Aion Knowledge Base, Validation and Verification of Knowledge Based Systems: Theory, Tools and Practice, 67-78.
[13] F. Bouali, S. Loiseau and M.C. Rousset, 1997, Revision of Rule Bases, Proceedings Eurovav 97, 193-203.
[14] A. Preece, 1991, Methods for Verifying Expert System Knowledge Bases.
[15] Robert T. Plant, 1995, Tools for Validation & Verification of Knowledge-Based Systems 1985-1995, Internet Source.
[16] N. Lamb and A. Preece, Verification of Multi-Agent Knowledge-Based Systems, Internet Source (downloaded 01-05-2000).
[17] J. Vanthienen, 1991, Knowledge Acquisition and Validation Using a Decision Table Engineering Workbench, World Congress of Expert Systems, 1861-1868.
[18] J. Vanthienen, C. Mues and G. Wets, 1997, Inter-Tabular Verification in an Interactive Environment, Proceedings Eurovav 97, 155-165.
[19] M. Suwa, A.C. Scott and E.H. Shortliffe, 1982, An Approach to Verifying Completeness and Consistency in a Rule-Based Expert System, AI Magazine, Vol. 3, Nr. 4.
[20] W.A. Perkins, T.J. Laffey, D. Pecora and T. Nguyen, 1989, Knowledge Base Verification, Topics in Expert System Design, 353-376.
[21] C.L. Chang, J.B. Combs and R.A. Stacowits, 1990, A Report on the Expert Systems Validation Associate (EVA), Expert Systems with Applications, Vol. 1, Nr. 3, 217-230.
[22] O.J. Mengshoel and S. Delab, 1993, Knowledge Validation: Principles and Practice, IEEE Expert, Nr. 6, 62-68.