15th JISR-IIASA Workshop on Methodologies and Tools for Complex System Modeling and Integrated Policy Assessment

27 – 29 August, 2001

Abstracts

International Institute for Applied Systems Analysis, Laxenburg, Austria

Contents

M. Arakawa, T. Yagi, H. Nakayama, H. Ishikawa, Y. Yun, Measurement of the Technical Growth of Products By Using Data Envelopment Analysis  1
J. Bartnicki, J. Saltbones, A. Foss, Harmonisation and integration of national emergency management models for nuclear accidents in the frame of the ENSEMBLE project  2
A. Beulens, H. Scholten, Model solving software and modeling paradigms: A Challenge for developers  4
T. Bewszko, Multi-criteria analysis of options of energy supply for residential customers  9
S. Churkina, Selecting Effective Points by Means of Aspiration Levels Method  10
J. Climaco, J. Craveirinha, C. H. Antunes, On Multicriteria Routing Problems in Multiexchange Networks - from a Static to a Dynamic Formulation  11
T. Ermolieva, Y. Ermoliev, J. Linnerooth-Bayer, I. Galambos, The Role of Financial Instruments in Integrated Catastrophic Flood Management  13
M. Funabashi, K. Kawano, S. Sameshima, AYA: Autonomous Networking for Creating Services on Super Distributed Objects  14
J. Granat, Multi-objective modeling of the decision making problems using SAS software  16
M. Grauer, T. Barth, Distributed scalable optimization for intelligent sheet metal forming  17
J. Gu, X. Tang, Metasynthesis knowledge system for complex system  18
K. Hayashi, Mapping Environmental Effects of Agricultural Systems  19
A. Hiramatsu, T. Nozaki, A Support System for Requirement Extraction from Bulletin Board System on WWW  20
O. Hryniewicz, On some optimisation problems for imprecisely defined objective function  21
S. Kaden, M. Grauer, Decision making for groundwater protection and remediation planning  22
L. Kruś, Cost-benefit-risk analysis of innovation project - a case study  24
D. Loucks, A Multi-Objective Framework for Evaluation of Simulated Water Management Alternatives  25
M. Luptacik, Eco-Efficiency Analysis of an Economy  27
M. Makowski, Advanced Modeling Support: A Draft Requirement Analysis  28
W. Michalowski, S. Rubin, R. Slowiński, S. Wilk, Mobile DSS: A Medical Teletriage System for PalmPilot  32
Y. Nakamori, Y. Sawaragi, Knowledge Management for Complex Systems Modeling  34
H. Nakayama, T. Asada, Support Vector Machines using Multi-Objective Linear Programming  35
W. Ogryczak, M. Zawadzki, Conditional Center: A New Solution Concept for Location Problems  36
J. Ortega, J. Torres, R. Gasca, A new methodology for analysis of semiqualitative dynamic models with constraints  39
Z. Pawlak, Rough Set Theory - A New Approach to Reason From Imperfect Data  41
H. Scholten, S. Osinga, How to support GMP in model-based DSS  43
R. Steuer, Efficient Sets and Surfaces in Multiple Criteria Optimization, Data Envelopment Analysis, and Portfolio Theory in Finance  46
H. Tatano, Y. Shoji, N. Okada, A Multi-Regional General Equilibrium Model Taking Account of Disaster Risk  47
J. Wessels, N. Litvak, I. Adan, H. Zijm, A greedy heuristic for carousel problems  48
A. Wierzbicki, Information and knowledge society, the role of intuition in decision making and a rational theory of intuition  49
Y. Yun, H. Nakayama, M. Arakawa, H. Ishikawa, Multiple Criteria Decision Making using Generalized Data Envelopment Analysis  50
List of Participants  51

Note: The abstracts have been processed automatically using the abstract forms e-mailed by the authors. Only one substantive type of modification has been applied: in a few cases a co-author has been named first, if only he/she will participate in the Workshop.


Measurement of the Technical Growth of Products By Using Data Envelopment Analysis
Masao Arakawa (Kagawa University), Toshiro Yagi (Kagawa University), Hirotaka Nakayama (Konan University), Hiroshi Ishikawa (Kagawa University), Ye Boon Yun (Kagawa University)

Keywords: Data Envelopment Analysis, Estimation of Products, Measurement of Product Growth

In the design of products for public use, such as automobiles and PCs, the requirements have become widely spread. Thus, we have to evaluate products using multi-objective optimization, and the evaluation should be expressed as a scalar value. In such cases, Data Envelopment Analysis (DEA) works very well. In this study, we analyze the growth of the frontier of products and try to estimate the future requirements for product design. In order to demonstrate the proposed method, we apply it to the growth of Japanese cars over five years. From those results, we show the growth of the frontier for a specific class of cars, and we also show the shifts of requirements implied by shifts of the frontier for another specific class of cars.
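To make the DEA evaluation concrete, the sketch below computes input-oriented CCR efficiency scores by solving one small linear program per product; the product data (two inputs, two outputs) and the use of scipy are illustrative assumptions, not the study's actual data or tooling.

```python
# Input-oriented CCR DEA: each product is a decision making unit (DMU) with
# inputs X (e.g. price, weight) and outputs Y (e.g. speed, capacity).
# Efficiency of DMU o: min theta s.t. a nonnegative combination of all DMUs
# uses at most theta * inputs of o while producing at least the outputs of o.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 1.5], [3.0, 1.0], [4.0, 2.5]])  # inputs, one row per product
Y = np.array([[4.0, 2.0], [5.0, 3.0], [6.0, 2.0]])  # outputs, one row per product

def ccr_efficiency(o: int) -> float:
    n, m = X.shape                                  # number of DMUs, inputs
    s = Y.shape[1]                                  # number of outputs
    c = np.r_[1.0, np.zeros(n)]                     # variables: [theta, lambda_1..n]
    A_in = np.c_[-X[o], X.T]                        # sum_j l_j x_ij <= theta x_io
    A_out = np.c_[np.zeros(s), -Y.T]                # sum_j l_j y_rj >= y_ro
    res = linprog(c, A_ub=np.r_[A_in, A_out], b_ub=np.r_[np.zeros(m), -Y[o]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return float(res.fun)

for o in range(len(X)):
    print(f"product {o}: CCR efficiency = {ccr_efficiency(o):.3f}")
```

Products on the efficient frontier score 1; scores below 1 quantify how far inside the frontier a product lies, which is what makes tracking the frontier's growth over the years possible.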


Harmonisation and integration of national emergency management models for nuclear accidents in the frame of the ENSEMBLE project
Jerzy Bartnicki, Jorgen Saltbones and Anstein Foss
Norwegian Meteorological Institute

Keywords: Emergency, nuclear safety, modelling atmospheric transport

For the unfortunate case of a severe, large-scale nuclear accident, most countries have developed national models for simulating atmospheric transport and deposition of radioactive debris. However, when simulating emission, atmospheric transport and deposition of radionuclides from such an accident, there will be inevitable differences between the long-range national forecasts, due to different national weather prediction models and different structures of the national dispersion models. These differences may cause problems at the European level, as national emergency management strategies based solely on national forecasts may not be coherent with those in neighbouring countries. The ENSEMBLE project addresses harmonisation and coherence issues for emergency management and decision-making in relation to long-range dispersion modelling. The main objectives of the project are: (1) to develop new WEB-based communication procedures and software tools for real-time reconciliation and harmonisation of disparate dispersion forecasts coming from meteorological and other emergency centres across Europe during an accident; (2) to make software tools available to participating national emergency centres, which may choose to integrate them directly into operational emergency information systems, or possibly use them as a basis for future system development; (3) to build a database of experience to help modellers, national meteorological offices, decision makers and their advisers gain an intuitive understanding of 'normal' agreement/disagreement between forecasts and a feeling for the uncertainty. The ENSEMBLE products will be maintained and disseminated to their users via a consortium formed by a substantial number of national and overseas emergency and meteorological forecasting centres. The main goals of the project will be achieved by broad European implementation of easy-to-use, practical WEB tools which combine, in real time, all available long-range dispersion forecasts into a single and coherent European "ENSEMBLE" forecast. The list of participants of the ENSEMBLE project includes 16 institutes, mostly from European countries but also from the United States and Canada. Norway is represented by the Norwegian Meteorological Institute in Oslo. The project duration is three years, and during this period a series of 10 experiments will be performed. Each experiment will require a simulation of a nuclear accident in Europe by all project participants. To initiate a simulation, an ALERT message will be sent to all participants by e-mail at an arbitrary time. This message specifies: (1) geographical coordinates of the release point, (2) time and date of release, (3) release rate, (4) duration of release, (5) height of the emission source, (6) nature of release, (7) isotope released, and (8) time horizon of the forecast. In response to the ALERT message the results of each simulation will be submitted on-line to the project centre located in the Joint Research Centre of the European Commission in Ispra, Italy. The following information will be submitted: (1) the three-dimensional modelled concentration field at 5 levels (0, 200, 500, 1300 and 3000 meters above the ground), (2) cumulative concentration at the ground, (3) cumulative dry deposition at the ground, (4) cumulative wet deposition at the ground, and (5) precipitation. The domain of the experiment covers the entire area of Europe. The results of all simulations, as well as their analysis, will be available on-line on dedicated WEB pages at Ispra. In the paper we will describe the most interesting results of the first ENSEMBLE experiment, which will take place in April 2001. We will pay special attention to the contribution of the Norwegian Meteorological Institute.
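For illustration, the eight ALERT fields could be carried in a record like the following; the field names, units and types are our assumptions, not the project's actual message format.

```python
# Hypothetical container for the ALERT message fields (1)-(8) listed above.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class AlertMessage:
    latitude_deg: float           # (1) coordinates of the release point
    longitude_deg: float
    release_start: datetime       # (2) time and date of release
    release_rate_bq_s: float      # (3) release rate
    release_duration: timedelta   # (4) duration of release
    source_height_m: float        # (5) height of the emission source
    release_nature: str           # (6) nature of release
    isotope: str                  # (7) isotope released
    forecast_horizon: timedelta   # (8) time horizon of the forecast

alert = AlertMessage(48.2, 16.4, datetime(2001, 4, 10, 6, 0), 1.0e12,
                     timedelta(hours=6), 50.0, "aerosol", "Cs-137",
                     timedelta(hours=60))
```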


Model solving software and modeling paradigms: A Challenge for developers
Adrie J.M. Beulens and Huub Scholten
Wageningen University, Applied Computer Science Group, 6703 HB Wageningen, The Netherlands

Keywords: ASP, modeling paradigms, distributed systems, hybrid DSS

1. Introduction

Growing numbers of organizations in all sectors of industry, public administration, business consulting and knowledge institutes acknowledge the need for, and actually use, Model Based Decision Support Systems (MBDSS). Realizing such functionality next to, and integrated with, their Enterprise Resource Planning (ERP) systems, administrative systems and external data sources may support them in solving the complex problems they are confronted with. In the market place, names used for these types of applications encompass Advanced Planning and Optimization applications (APOs), Planning Systems, Data Mining and OLAP applications. For some examples of APOs, DSS and associated requirements and developments over time we may refer to Beulens (1988, 2001), Holsapple and Whinston (1995), Van der Vorst et al. (2000), and SAP (2001). The business problems addressed belong to a wide range of application domains and may include, but are not restricted to, (strategic) planning problems in finance, logistics, marketing, production, procurement, distribution, R&D, design and development of products and production systems, environmental systems analysis, and telecommunications. Supporting decision making often requires a comprehensive analysis of decision scenarios using a (set of) model(s) that adequately represent relations between possible decisions and the expected results of their implementation. As a consequence, each decision problem scenario may require the use of a specific model (or a set of models), each with an associated model type. In turn, these models can be generated, solved and/or evaluated using standard techniques that have been developed for the various model types and that are widely and frequently used in applications for model-based decision support. In order to use such MBDSS effectively and efficiently, a number of performance requirements must evidently be satisfied. These encompass that systems should:
• effectively and efficiently support the (planning) work of the user;
• be integrated with the information system infrastructure of the organization, or even a network of organizations, in order to allow the use and exchange of business data;
• be user friendly;
• be developed, upgraded and maintained in a cost-effective manner in an environment governed by the dynamics of the problem domain, the organization, ways of working and infrastructure. This is clearly a requirement that needs to be addressed by the developers of these MBDSS as well as by the people that design and build infrastructures and tools for MBDSS.

2. What is the practical situation?

There is a growing number of MBDSS that become available and are used in the market. Many organizations have modeling knowledge and develop models with associated software tools for their analysis. These allow users to find better solutions to real-life problems than can be found without model-based problem analysis. In what follows we use the word "solvers" for the various tools that support analysis of a model using techniques such as simulation, single-criterion optimization, multi-criteria model analysis, soft simulation, etc. Solvers are therefore core components of problem solving systems. As examples we may mention blending models with LP solvers that help organizations optimize recipes in the feed and food industry, route planning models with routing software for distribution companies, and production planning models with simulation software, LP software, or a combination of the two. There are many more in different functional areas, especially in the financial and marketing areas.

3. What is the problem?

We all know from our own and reported experience that it is still very difficult to build, upgrade, maintain and implement MBDSS that are well integrated into the (distributed) information system infrastructure of an organization and are really used to support actual decision making. For organizations in need of MBDSS, for service providers, and for users and developers, we describe some problems or challenges they are currently still confronted with:
• For the user of MBDSS it must often be recognized that the functional and performance requirements mentioned above are not really satisfied. This may be due to inadequate models and modeling support, data, and inadequate solvers. Many planning problems are for instance solved using spreadsheet models that get their data from corporate databases, where the quality of the support is minimal.
• For the developer the problem is to adequately build and maintain such dynamic problem solving systems. Current tools and techniques, model banks, solvers and data sources prohibit this for a number of reasons. First of all, transparent access to knowledge, model, solver and data sources is lacking. Secondly, adequate knowledge to use these sources is not readily accessible.
• In this context it must be mentioned that in practice, because of the nature of the applications, we often find that user and developer are the same person or that they closely cooperate. A production planner is active in the development of his planning support system; the researcher and the financial analyst build and use their own systems, etc.
• Technically, for the developer again, we have to deal amongst others with the following difficulties:
– For many types of problems there may be alternative models of different types, with their solvers, that can be employed. Then the problem arises whether the developer has adequate knowledge of, and access to, this knowledge in order to determine what to use best.
– Most of the models should be analyzed in a comprehensive way using various solvers that support different types of analysis, which is necessary for getting new insights and more effective results. But applying other tools to solve the models is a laborious and cumbersome process hindering decision making speed and quality, if these tools are accessible at all. Further, it requires a lot of expertise and substantial resources.
– Finally, solving many problems really requires the use of a set of interrelated models of possibly different types, where again these models, tools, techniques and expertise are distributed, hard to obtain, and hard to combine into applications that problem owners can use for decision support.

4. Reflection

In the previous sections we have reflected on experiences with MBDSS and the way these systems are currently being designed, built, maintained and used, and we have arrived at many problems and challenges associated with building and using these systems. Regardless of many developments during the last decades, we have concluded that it is still very difficult to build and implement good MBDSS: systems that are well integrated into the information system infrastructure of the organization and that are really used to support actual decision making processes. As a consequence of our increased insight into performance and infrastructure requirements in the area of MBDSS, and into the availability and usability of rapidly changing technologies, we feel it is appropriate to draw your attention to many current challenges concerning the design, construction, implementation and exploitation of successful hybrid, model based and, if necessary, distributed DSS. We have elaborated on these challenges in Beulens and Scholten (2001) along 4 dimensions: 1. Managing development and exploitation; 2. The human factor; 3. Modeling paradigms; 4. Hybrid DSS environment. In this paper we will only address some aspects of the third dimension.

5. Modelling paradigms

We have mentioned in previous sections that many developers have experienced the need to use a variety of types of models, and associated solvers, in MBDSS to allow them to solve a practical problem at hand. Examples are combinations of, for example, mixed integer linear programming models, simulation models and associated databases with their incorporated data models. In this context we have to deal with questions such as:
• Which paradigms to use? Each modeling paradigm is connected with a way of thinking, theories, an ontology, examples of use, tools and techniques. As a consequence such a paradigm has a certain representation power. See for instance Borst (1997) and Houba et al. (2000).
• What is the fitness for use of a paradigm in relation to the problem and the problem domain? In a practical problem situation we are confronted with the question of finding appropriate paradigms that provide us with the necessary modeling or representation power.
• Which alternative paradigms may be used and have to be combined in models and submodels in one system? What does this entail for the exchange of information between instantiated models used in systems?
• How to manage and control the relationships and data exchange between (sub-)models? In a network of sub-models we have to be able to work with a variety of precedence relations. Further, these may be based on different modeling paradigms.
• If there are paradigms with more or less equal representation power related to the problems to be modeled, then we have to make choices. This choice may depend for example on the familiarity of the modeler with the paradigms, the availability of easy to use generators and solvers, and the appropriateness of the paradigm in relation to the problem.
• How to deal with the body of semantic knowledge associated with the problem domain, which the paradigm and its ontology allow us to capture in models?
• How to use various data sets with their explicit data models as input and/or output of (sub)models and for process control in DSS? In this context it may be mentioned that each type of data model has its own associated ontology and semantics.

6. Combining models of different types in hybrid MBDSS

In previous sections it has been described extensively that many practical MBDSS may involve the need to use a variety of (sub)models, where these models may be of different types. Further, we have elaborated upon questions associated with actually using a variety of models in one system. Some things are evident in this context:
1. The combination of models of different types with their associated model generators and solvers leads to hybrid MBDSS environments that must give transparent access to models and tools per model type.
2. In a system, models must be able to share data sources and adequately exchange data, again in a transparent manner.
For this transparent access and exchange of models, tools and data we can adopt, generally speaking, two approaches:
1. The common representation format approach. A common model and data representation format is used for all models in one system. In turn these models of different types, which use the same model and data model representation format, have to be interfaced to their associated tools on the one hand and to the system databases on the other hand. We must hereby ensure that all models comprise a data model (DM) that is part of a reference DM of the whole system. This approach guarantees that data may be exchanged in a semantically correct manner. Further, we need to devise mechanisms to be incorporated in the database, models and/or process model of the system to ensure that precedence relationships between (sub)models are respected.
2. The pragmatic approach. Here we must ensure that all models comprise a data model (DM), or are extended with a DM, that is part of a reference DM of the whole system. This approach likewise guarantees that data may be exchanged in a semantically correct manner, and again mechanisms are needed in the database and/or process model of the system to ensure that precedence relationships between (sub)models are respected. The advantage of this latter approach over the first one is that relatively little effort is needed to incorporate existing models and their solvers into new MBDSS.

In a practical context, for both efficiency and effectiveness reasons, we may have to strive for systems that cater for both possibilities, so as to arrive at an environment that allows for:
1. efficient evolutionary development of new models, model bases and tools using the common representation format;
2. rapid incorporation of existing knowledge in the form of model bases and tools that become available.
In our presentation we will elaborate on the need for combining models of different types into one MBDSS and in particular pay attention to the pragmatic approach that allows us to do that; a small illustration of the pragmatic approach is sketched below.
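As a minimal sketch of the pragmatic approach, the fragment below wraps two stand-in models so that each exposes only reference-DM fields it reads and writes, and a simple scheduler respects the precedence relations; all class and field names are invented for illustration.

```python
# Pragmatic approach, sketched: every (sub)model is wrapped in an adapter whose
# inputs/outputs are fields of one shared reference data model (a plain dict
# here), so data exchange stays semantically consistent and precedence
# relations can be enforced generically.
from typing import Protocol

class ModelAdapter(Protocol):
    inputs: set[str]      # reference-DM fields the wrapped model reads
    outputs: set[str]     # reference-DM fields the wrapped model writes
    def run(self, dm: dict) -> None: ...

class DemandModel:
    inputs, outputs = {"population"}, {"water_demand"}
    def run(self, dm: dict) -> None:
        dm["water_demand"] = 0.2 * dm["population"]   # stand-in for a real model

class PumpingModel:
    inputs, outputs = {"water_demand"}, {"pumping_cost"}
    def run(self, dm: dict) -> None:
        dm["pumping_cost"] = 1.5 * dm["water_demand"]

def execute(adapters: list, dm: dict) -> None:
    """Run each sub-model once its inputs are available (precedence control)."""
    pending = list(adapters)
    while pending:
        ready = [a for a in pending if a.inputs.issubset(dm)]
        if not ready:
            raise RuntimeError("unsatisfiable precedence among sub-models")
        for a in ready:
            a.run(dm)
            pending.remove(a)

dm = {"population": 50_000.0}
execute([PumpingModel(), DemandModel()], dm)    # order of the list is irrelevant
print(dm)
```

The point of the sketch is that an existing model only needs a thin adapter onto the reference DM, which is why relatively little effort is needed to incorporate it.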

7. Concluding remarks

In this paper we briefly discussed some recent developments in the area of MBDSS. We have arrived at a variety of questions, problems and challenges facing users, developers and service providers in this area. Further, we discussed in particular the issue of combining different models, with associated different modeling paradigms, into one MBDSS, and some associated consequences for the developers of such systems. Finally, we paid attention to a pragmatic approach for the transparent sharing and exchange of data between models used in one system.

References
[1] Beulens, A.J.M. & van Nunen, J.A.E.E. (1988). The use of expert system technology in DSS. Decision Support Systems 4, 421-431.
[2] Beulens, A.J.M. & Scholten, H. (2001). Challenges to model solving software in the Internet era. EUROSIM conference, Delft.
[3] Holsapple, C.W. & Whinston, A.B. (1995). Decision Support Systems. West Publishing Company, New York.
[4] Houba, I.H.G., Hartog, R.J.M., Top, J.L., Beulens, A.J.M. & van Berkel, L.N. (2000). Using recipe classes for supporting detailed planning in food industry: A case study. European Journal of Operational Research, 122, 367-373.
[5] van der Vorst, J.A.G.J., Beulens, A.J.M. & van Beek, P. (2000). Modeling and simulating multi-echelon food systems. European Journal of Operational Research, 122, 354-366.
[6] SAP (2001). WWW.SAP.com.


Multi-criteria analysis of options of energy supply for residential customers
Tadeusz Bewszko
Faculty of Electrical Engineering and Computer Science, Rzeszow University of Technology, Poland

Keywords: Energy systems planning, energy resource allocations, modeling for multi-criteria analysis, model-based decision support

The presentation deals with local energy planning for new residential customers. The scope of the considered problem is a single house, a multifamily block, or a group of houses/blocks, which have to be supplied with energy. In an early stage of house planning a particular mix of energy carriers and technologies needs to be chosen. Every choice is connected with a specific investment cost, annual operation and maintenance cost, emissions of COx, NOx, SOx and PM, and a total system efficiency (including the efficiency of production, distribution, and end-use consumption). This problem is complex and the goals are conflicting: the higher the environmental protection, and the more comfortable the devices used, the higher the costs. The traditional approach to planning residential energy systems is based on a simple analysis of a finite number of alternatives; more detailed analyses are too complicated for individual customers, building designers and managers. Without the help of decision support tools, users are not able to analyze the consequences of using various types of energy supply systems. The presented method of multi-criteria analysis provides various Pareto-efficient solutions that correspond to various trade-offs among criteria, and can be used by individual customers, designers and managers, as well as planners and policy makers, to select the energy supply option that is closest to his/her preferences. The decision making process is then based on more complex analyses, and more outcomes are taken into account. Various solutions can be analyzed by different decision makers, although the core model is always the same. From the point of view of a single customer, the decision is usually a trade-off among economic criteria, but some users, for example those who are not indifferent to environmental protection, can also take into account environmental criteria. Local policy planners have to analyze the problem more broadly, from the point of view of the whole local (or regional) society, and make a decision as a trade-off among economic, ecological, and energy safety criteria. The presentation will start with the problem formulation. Then the model specification, the multi-criteria approach to model analysis, and a discussion of preliminary results will be outlined.
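For the simplest finite-alternatives variant of such an analysis, a Pareto filter over candidate supply options can be written in a few lines; the options and criterion values below are invented for illustration.

```python
# Keep only the Pareto-efficient energy supply options (all criteria minimized:
# investment cost, annual O&M cost, CO2 emissions). Numbers are illustrative.
options = {
    "gas boiler":      {"invest": 4.0, "o&m": 1.2, "co2": 2.0},
    "heat pump":       {"invest": 9.0, "o&m": 0.6, "co2": 0.9},
    "electric heater": {"invest": 4.5, "o&m": 2.5, "co2": 2.4},
    "district heat":   {"invest": 5.0, "o&m": 1.0, "co2": 1.1},
}

def dominates(a: dict, b: dict) -> bool:
    """a dominates b: no worse on every criterion and strictly better on one."""
    return all(a[k] <= b[k] for k in a) and any(a[k] < b[k] for k in a)

pareto = [name for name, crit in options.items()
          if not any(dominates(other, crit)
                     for o, other in options.items() if o != name)]
print(pareto)   # -> ['gas boiler', 'heat pump', 'district heat']
```

The electric heater drops out because the gas boiler is at least as good on every criterion; the remaining options represent genuine trade-offs for the decision maker.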

-9-

Aug-28-2001, 22:24

Selecting Effective Points by Means of Aspiration Levels Method
Svetlana Churkina
Faculty of Applied Mathematics and Cybernetics, Moscow State University, Moscow, Russia

Keywords: Multiple criteria optimization, decision maker, effective point, aspiration levels, parameterization, approximation, interactive procedure, decision support system

Multiple criteria optimization methods usually use weighting coefficients to introduce the preferences of a Decision Maker. This paper presents an interactive procedure where the Decision Maker's wishes are expressed in the form of aspiration levels. The aspiration levels are values of the objectives that the Decision Maker would like to achieve, taking the real situation into account. First, the Decision Maker specifies an aspiration point t. Second, the points nearest to t in the sense of the Chebyshev norm are sought in the set of attainable objectives; denote the set of all such points by F(t). It is proved in the paper that these points are half-effective, and a necessary condition of half-efficiency is also proved. If we maximize the sum of criteria over the set F(t) we arrive at an effective point (the only one). The Decision Maker analyzes this point and, if he is not satisfied, proposes another aspiration point, and so on. Questions of parameterization and approximation (in the sense of the Hausdorff norm) of Slater's and Pareto's sets are investigated in the paper with application to the presented procedure. Thus, it is shown that an aspiration point is a good parameter for examining half-effective and effective sets. At present I am working on the creation of a computer decision support system implementing the aspiration levels method. Auxiliary parametric single-objective problems for each aspiration point t are solved by means of Excel's Solver.
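For a finite set of attainable objective vectors, the two steps of the procedure can be sketched as follows; the points, the aspiration level and the assumption that all criteria are maximized are illustrative.

```python
# Aspiration levels method, finite case: find the attainable points nearest to
# the aspiration point t in the Chebyshev norm (the set F(t)), then maximize
# the sum of criteria over F(t) to obtain an effective point.
points = [(3.0, 5.0), (4.0, 4.0), (5.0, 2.0), (2.0, 2.0)]  # attainable objectives
t = (5.0, 5.0)                                             # aspiration point

def cheb(a, b):
    return max(abs(x - y) for x, y in zip(a, b))

d_min = min(cheb(p, t) for p in points)
F_t = [p for p in points if cheb(p, t) == d_min]   # half-effective candidates
effective = max(F_t, key=sum)                      # step 2: maximize criterion sum
print(F_t, effective)                              # -> [(4.0, 4.0)] (4.0, 4.0)
```

In the actual procedure the attainable set is given implicitly by constraints, so each step becomes a parametric single-objective problem (solved, in the described system, by Excel's Solver).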


On Multicriteria Routing Problems in Multiexchange Networks - from a Static to a Dynamic Formulation
Joao Climaco (FEUC and INESC-Coimbra), Jose Craveirinha (DEE-FCTUC and INESC-Coimbra), Carlos H. Antunes (DEE-FCTUC and INESC-Coimbra)

Keywords: Routing in telecommunication networks, multicriteria analysis

Routing is a key element of any multiexchange telecommunication network functional structure, with a decisive impact on network traffic performance and cost. A routing method is primarily concerned with the definition of a path, or set of paths, between a pair of exchanges (nodes), satisfying potentially conflicting criteria. The formulation of such criteria, expressed through appropriate objective function(s) and constraints, depends on the nature of the network in terms of provided services and information transfer modes. The increasing demand for a wide range of network services, namely multimedia applications, leads to the need for modern multiservice network functionalities dealing with multiple and heterogeneous grade of service (GoS) requirements. When applied to network routing mechanisms, this in turn leads to the need to select network paths satisfying certain GoS requirements while simultaneously seeking to optimize the associated metrics (or a single function of different metrics). Therefore we think that there are advantages in treating this class of routing problems as multicriteria routing problems. Note that in a multicriteria context involving multiple, potentially conflicting, incommensurate objective functions, the concept of optimal solution in single objective problems gives place to the concept of nondominated solutions (feasible solutions for which no improvement in any objective function is possible without worsening at least one of the other objective functions). Multicriteria routing models thus make it possible to grasp the trade-offs among distinct quality of service requirements by treating the comparison among different routing alternatives in a consistent manner. The simplest type of routing problem in multiexchange networks, which may be designated the static routing problem, involves the calculation of a path (or path set) between every pair of nodes for nominal stationary network conditions, that is, with fixed traffic offered between the exchanges and fixed coefficients of the optimization problem. A multicriteria formulation and an adequate algorithm for this type of routing problem were proposed in [1]. This formulation is a bicriteria shortest path problem which uses a particularly efficient algorithm based on a k-shortest path algorithm. On the other hand, the utilization of dynamic routing (which involves the calculation of time-variant paths or ordered path sets between pairs of nodes, as a function of relevant measurable network characteristics) in various types of networks is well known to have a quite significant impact on network performance and cost, namely for time variant traffic patterns, overload and failure conditions. A new type of dynamic routing method, based on a multicriteria formulation and dependent on periodic network parameter measurements, was developed in [2]. This method involved the development of a new version of the bicriteria shortest path algorithm, the definition of a routing control architecture, and models for estimating the time variant model coefficients, namely using the concept of implied costs on the arcs. In this communication we outline the evolution of this stream of research, emphasizing its main features and potential practical applications in telecommunication networks. A toy illustration of the underlying nondominance concept is given below.
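As a toy illustration, the sketch below enumerates the simple paths of a four-node network with two arc metrics and keeps the nondominated ones; the graph and metric values are invented, and this brute force is not the efficient k-shortest-path algorithm of [1].

```python
# Bicriteria routing, brute force: each arc carries (delay, cost); a path is
# nondominated if no other path is at least as good on both metrics and
# strictly better on one.
graph = {                           # arc: (delay, cost)
    "A": {"B": (1, 4), "C": (3, 1)},
    "B": {"D": (1, 4)},
    "C": {"B": (1, 1), "D": (2, 1)},
    "D": {},
}

def paths(u, dst, seen=()):
    if u == dst:
        yield (u,)
        return
    for v in graph[u]:
        if v not in seen:
            for rest in paths(v, dst, seen + (u,)):
                yield (u,) + rest

def metrics(p):
    d = c = 0
    for u, v in zip(p, p[1:]):
        d += graph[u][v][0]
        c += graph[u][v][1]
    return (d, c)

cand = [(p, metrics(p)) for p in paths("A", "D")]
nondom = [(p, v) for p, v in cand
          if not any(all(w[i] <= v[i] for i in (0, 1)) and w != v
                     for _, w in cand)]
print(nondom)   # A-B-D (2, 8) and A-C-D (5, 2) survive; A-C-B-D is dominated
```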

References
[1] C. H. Antunes, J. Craveirinha, J. Climaco and C. Barrico, "A Multiple Objective Routing Algorithm for Integrated Communication Networks". In P. Key and D. Smith, editors, ITC16 - Teletraffic Engineering in a Competitive World, vol. 3b, pp. 1291-1300, Elsevier Science B.V., June 1999.
[2] J. Craveirinha, L. Martins, T. Gomes, C. H. Antunes and J. Climaco, "Formulation of a Multiple Objective Dynamic Routing Method Using Implied Costs - Architecture and Algorithms", Research Report ET-N8-3, INESC-Coimbra, Feb. 2001.


The Role of Financial Instruments in Integrated Catastrophic Flood Management
Tatiana Ermolieva (IIASA, Laxenburg, Austria), Yuri Ermoliev (IIASA, Laxenburg, Austria), Joanne Linnerooth-Bayer (IIASA, Laxenburg, Austria), Istvan Galambos (VITUKI Consult, Budapest, Hungary)

Keywords: Catastrophic floods, dependent losses, GIS-based modeling, nondifferentiable stochastic optimization, insolvency constraint, VaR risk measure

This paper examines the specifics of the catastrophic risk management problem: human-made risks, highly mutually dependent losses, the lack of information, the need for long-term perspectives and geographically explicit models, and the involvement of various agents such as individuals, governments, insurers, reinsurers, and investors. As a concrete case we consider a pilot region of the upper Tisza river, Hungary. Traditionally, the insurance industry pools its exposures through reinsurance contracts written on the basis of rich historical data. This is not possible for rare catastrophic risks with dependent losses of high consequence. This calls for the use of sophisticated models, the investigation of alternative risk-spreading instruments, and the analysis of the complex interplay between different ex-ante and ex-post risk reduction and risk spreading measures. Special attention is given to the evaluation of a multipillar flood loss spreading program involving partial compensation by the central government, mandatory insurance on the basis of location-specific exposures, and a contingent credit. We discuss appropriate GIS-based catastrophe models and numerical experiments. To analyze the stability of the system, we use the strong connection between nondifferentiable (possibly nonconvex) stochastic optimization and such indicators as the value at risk and the probability of bankruptcy.
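As a toy numerical illustration of the indicators involved, the following Monte Carlo sketch estimates the value at risk and the probability of insolvency for mutually dependent location losses; the loss model, its parameters and the reserve level are illustrative assumptions, not the Tisza case study model.

```python
# Dependent catastrophic losses: one regional flood-severity factor multiplies
# the local losses of all insured locations, creating the dependence that makes
# simple pooling fail. We then read off VaR and P(insolvency).
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
severity = rng.lognormal(mean=0.0, sigma=1.0, size=n)       # regional factor
local = rng.lognormal(mean=0.0, sigma=0.5, size=(n, 3))     # 3 locations
losses = (severity[:, None] * local).sum(axis=1)            # total claims

reserves = 20.0                                      # premium fund of the pool
var_99 = np.quantile(losses, 0.99)                   # 99% value at risk
p_insolvency = (losses > reserves).mean()            # insolvency constraint check
print(f"VaR(99%) = {var_99:.1f}, P(insolvency) = {p_insolvency:.4f}")
```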


AYA: Autonomous Networking for Creating Services on Super Distributed Objects
Motohisa Funabashi, Katsumi Kawano and Shigetoshi Sameshima
Systems Development Laboratory, Hitachi, Ltd.

Keywords: Ubiquitous Computing, Super Distributed Objects, Middleware

1. Background

Presently we are surrounded by many computers. In particular, embedded processors outnumber desktop processors by more than ten to one. However, the embedded processors are not well linked together, even though their processing and storage capabilities will keep doubling every 18 months for the next 10 years and their linkage has great potential for providing us with new value. This is because in the past the cost of linking processors was not affordable; emerging technologies such as short-range radio communication, including Bluetooth, now permit very flexible communication among the scattered embedded processors. Innovative middleware technology is definitely desired for linking embedded processors and making them work collaboratively.

2. Basic Concept

AYA is the name of this type of middleware technology under development within SDL, Hitachi (originally, AYA is a Japanese word that means a fabric with twill weave for kimono). Conventionally, control systems in industrial as well as consumer areas are precisely defined before actual operation, so that all the players performing the necessary functions in the systems are usually determined beforehand. However, this predetermined nature of the systems limits the opportunity for getting benefits from organizing the scattered processors. For this reason, in AYA, the players in the systems are dynamically determined according to the context of the user as well as the resource conditions around the user (for example, the system might be expected to provide the user with video conference functionality by combining the TV and the mobile handset existing around the user, without requiring a dedicated conferencing facility). AYA provides emergence of system functions according to the user context and his/her environment.

3. The AYA Process and Architecture

It is assumed that the user always carries his/her agent (named e-Guardian) that works on the surrounding environment. At the first step, the agent discovers processors (resources) surrounding the user, and then it defines the immediate environment and identifies the context of the user. According to the context and the discovered resources, the agent selects the service scenario, which might be obtained from the Internet, and the players in the scenario are dynamically assigned to the resources. In this assignment, it is important to bridge the gap between the reality and the scenario, as well as to resolve possible conflicts among the agents existing within the immediate environment. Finally, the service scenario is executed. In order to realize the AYA process, a three-layer architecture is designed, comprising the communication layer, which encapsulates the existing variety of communication interfaces and protocols; the AYA layer, which works for entity access, scenario selection, and player assignment; and the application layer. These layers work collaboratively in the resources within the immediate environment.
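A toy version of the player-assignment step might look as follows; the scenario roles, resource capabilities and the greedy matching rule are all illustrative assumptions, not the actual AYA mechanism.

```python
# Assign the "players" (roles) of a selected service scenario to discovered
# resources by capability coverage, e.g. ad-hoc video conferencing from a TV
# and a mobile handset found in the user's immediate environment.
scenario = {"display": {"video_out"}, "camera": {"video_in"},
            "audio": {"mic", "speaker"}}
discovered = {
    "living-room TV": {"video_out", "speaker"},
    "mobile handset": {"video_in", "mic", "speaker"},
}

def assign(roles: dict, resources: dict) -> dict:
    assignment = {}
    for role, needs in roles.items():
        for name, caps in resources.items():
            if needs <= caps:                  # resource covers the role's needs
                assignment[role] = name
                break
        else:
            raise LookupError(f"no resource can play role {role!r}")
    return assignment

print(assign(scenario, discovered))
# -> {'display': 'living-room TV', 'camera': 'mobile handset',
#     'audio': 'mobile handset'}
```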

4. Current Status

It is believed that AYA provides basic middleware functionality for the coming ubiquitous computing era. A prototype system has already been developed to stimulate application development in the area of home appliances. International joint efforts have been initiated for the development: within the OMG (Object Management Group), a new SIG named SDO (Super Distributed Objects) was formed in 2000, chaired by Hitachi, Sun Microsystems, and the University of Tokyo, and is currently working towards issuing a White Paper and an RFP (Request for Proposal) [1].

Reference
[1] http://www.omg.org/


Multi-objective modeling of the decision making problems using SAS software
Janusz Granat
Institute of Control and Computation Engineering, Warsaw University of Technology, and National Institute of Telecommunications, Warsaw, Poland

Keywords: Decision-making support, multi-objective problem analysis, data analysis

The SAS system is broadly used for providing information to decision makers in industry. It integrates data at the corporate level and consists of various modules for information processing and statistical analysis. However, during the process of decision making we usually have to consider multiple objectives, and the SAS system does not support multi-objective decision analysis, an approach that is becoming more and more important in decision making processes. In multi-objective analysis we define a model of the decision situation which consists of two parts: the substantive model of the decision situation and the model of the decision maker's preferences. The substantive model describes the decision situation by analytical models. The modeling process is a complex task that might be supported by a modeling system like AMPL (A Modeling Language for Mathematical Programming). The model of preferences, however, cannot be built a priori; it is frequently constructed implicitly during several interactive iterations in a process of analysis of the decision situation. We will present a new module (developed for the SAS system) which allows the AMPL modeling language to be used in the SAS environment for building the substantive model of a decision problem. This module enhances the analysis capabilities of the SAS software for supporting decision making processes. Moreover, combining the information processing power of the SAS system with such new modeling capabilities makes it possible to use multicriteria model analysis at the corporate level. A hedged sketch of this kind of integration is given below.
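The sketch below uses plain Python as a stand-in for the SAS-side module: it writes an AMPL model to disk and calls the AMPL interpreter on a small run script. The file names, the toy model, and the availability of an `ampl` executable and a default solver on the PATH are all assumptions; this is not the presented module's implementation.

```python
# Host environment (stand-in for the SAS module) driving AMPL: write the
# substantive model, execute it, read the textual solution back.
import pathlib
import subprocess
import tempfile

model = """
var x >= 0; var y >= 0;
minimize cost: 3*x + 2*y;          # substantive model of the decision situation
subject to demand: x + y >= 10;
"""

tmp = pathlib.Path(tempfile.mkdtemp())
(tmp / "m.mod").write_text(model)
(tmp / "m.run").write_text(f'model "{tmp / "m.mod"}";\nsolve;\ndisplay x, y;\n')
out = subprocess.run(["ampl", str(tmp / "m.run")],
                     capture_output=True, text=True)
print(out.stdout)   # in the interactive loop, preference information would
                    # steer repeated runs with modified scalarizing objectives
```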


Distributed scalable optimization for intelligent sheet metal forming
Manfred Grauer and Thomas Barth
Institute for Information Systems, University of Siegen, D-57068 Siegen, Germany

Keywords: distributed optimization, simulation-based optimization, optimum process design, multi-stage metal forming

The economic necessity to reduce the overall design cycle time raises a growing demand for techniques supporting virtual prototyping in manufacturing. Virtual prototyping mainly aims at the reduction of cost and time by applying numerical simulation techniques to the design of parts as well as to their manufacturing process; simulation thereby replaces the costly manufacturing of tools and the performing of test runs. Sheet metal forming by deep drawing is one of the dominating metal forming technologies applied in the automotive supplier industry. Therefore, realistic simulation of the deep drawing process is essential for virtual prototyping leading to optimum process design in sheet metal forming. In order to get an optimal design of a product or its production process, techniques from simulation-based optimization must be utilized. Due to the excessive runtime of the numerical simulation of every single deep drawing process, an adequate optimization procedure has to distribute the workload of many hundreds or even thousands of simulations during the course of the optimization to resources in a parallel computation environment. For economic reasons, existing networks of workstations (NOW) are the favorable environment compared to expensive, poorly scalable supercomputers; the basic farming pattern is sketched below. In this paper, the mathematical formulation of the problem of an optimal multi-stage deep drawing process and an adequate solution concept are given. The distributed Polytope algorithm designed for the solution of simulation-based optimization problems in engineering is briefly introduced. Certain software engineering aspects important for the design of a software environment supporting virtual prototyping are discussed subsequently. Results obtained from the application of various distributed optimization approaches to design optimization problems in engineering are presented, and first results of a prototypical optimization environment for deep drawing are provided.
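The essential farming pattern behind such a distributed procedure can be sketched in a few lines; here a trivial stand-in function replaces the deep-drawing simulation and a random batch of candidate designs replaces the Polytope algorithm's proposals, so everything except the pattern itself is an illustrative assumption.

```python
# Farm expensive simulation runs out to a pool of workers (processes here,
# workstations of a NOW in the real setting) and collect the objective values.
from concurrent.futures import ProcessPoolExecutor
import random

def simulate(params):
    """Stand-in for one deep-drawing simulation (minutes to hours in practice)."""
    blank_holder_force, draw_depth = params
    return (blank_holder_force - 3.0) ** 2 + (draw_depth - 1.5) ** 2

def evaluate_batch(candidates):
    with ProcessPoolExecutor() as pool:          # one simulation per worker
        scores = list(pool.map(simulate, candidates))
    return min(zip(scores, candidates))          # best (score, design) so far

if __name__ == "__main__":
    batch = [(random.uniform(0, 6), random.uniform(0, 3)) for _ in range(16)]
    best_score, best_design = evaluate_batch(batch)
    print(best_score, best_design)
```

A real Polytope (Nelder-Mead-type) driver would generate successive batches from reflections and contractions of the current simplex instead of sampling at random.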


Metasynthesis knowledge system for complex system
Jifa Gu (Japan Advanced Institute of Science and Technology, Ishikawa, Japan), Xijin Tang (Institute of Systems Science, Academy of Mathematical Science and System Science, Beijing, China)

Keywords: Metasynthesis knowledge system, complex system

Prof. Qian, Yu and Dai proposed the so-called metasynthesis system approach for dealing with open giant complex systems in 1990. We wish to develop it in connection with knowledge systems. Usually we may classify all knowledge into three systems: natural science, social science and engineering science. If we simply put them together, we may call the result multi-disciplinary; if the three sciences interact with each other, we may call it interdisciplinary; finally, we may not only let them interact with each other but wish them to be metasynthesized, i.e., integrated and synthesized. Knowledge can take the form of data and information (fact knowledge); it can also take the form of different models, which transform data and information into new data and information, i.e., new knowledge. All the knowledge mentioned so far is explicit, but some knowledge comes as experience, which is often tacit. Finally, in the course of solving problems people may create or innovate new knowledge; we call this knowledge creation and innovation. The metasynthesis knowledge system aims to synthesize all kinds of data and information, models and experience and, if necessary, to create new knowledge, combining data, information and experience to solve complex problems. In this paper we emphasize model integration and opinion synthesis. There are three main approaches to model integration: top-down architecture, bottom-up, and the system approach. Some examples of implementing model integration are mentioned, such as DOME (Distributed Object-based Modeling and Evaluation) by MIT, SWARM by the Santa Fe Institute, and DecisionNet by NPS, and some applications of model integration in China are illustrated. For solving some complex system problems the data, information and models alone are not enough; we then directly ask the decision makers or experts to attend discussions, synthesize the different opinions, and try to build consensus. Thus we must investigate the consensus building process and provide approaches to support consensus building, especially computerized support. Some of the research works mentioned in this paper are surveys of the literature both in China and abroad; others come from our ongoing research projects.


Mapping Environmental Effects of Agricultural Systems
Kiyotada Hayashi
National Agricultural Research Organization, Japan

Keywords: agricultural practices, concept maps, environmental indicators, health and ecological risks, multicriteria analysis

Agri-environmental interactions are causing major public concern over the appropriateness of rural policy making and farm management practices. There are, however, difficulties in understanding agri-environmental issues, since agriculture has both negative and positive impacts on the environment. Moreover, the environmental impacts are related to a wide range of scientific fields. In this situation, visually displaying the environmental effects of agriculture is a useful approach because it can represent the complicated relationships of agricultural practices to various indicators of human health and ecosystems. The purpose of this study is to enhance the understandability of agri-environmental interactions by applying a visual representation technique: concept mapping, or cognitive mapping, is used for the analysis. First, its characteristic features are outlined by comparing the maps with influence diagrams and Bayesian networks. Second, the impacts of agricultural practices are depicted on a concept map through a bibliographical survey; the concepts of human health risks and ecological risks play an important role in reviewing the impacts. Third, methodological implications are presented. That is, by mapping the great diversity of environmental impacts of agriculture, it becomes clear that in many cases the multiple attributes used for evaluating farm management practices, as well as environmental indicators developed for assessing agricultural policy decisions, can be considered intermediate criteria, and that this intermediacy may be a source of bias when the attributes or indicators are applied to the evaluation of decision alternatives. Although the current scientific knowledge necessary for building quantitative relationships between agricultural decisions and the environment is insufficient, qualitative relationships represented by concept maps will be useful for developing a prototype for integrated evaluation and for communicating with the public.


A Support System for Requirement Extraction from Bulletin Board System on WWW
Ayako Hiramatsu (Osaka Sangyo University), Takafumi Nozaki (Osaka University)

Keywords: Requirement Extraction, Bulletin Board System, Electronic Commerce

On interactive EC (Electronic Commerce) sites, to reflect consumers' requirements in goods planning, planners collect consumers' opinions through a BBS (Bulletin Board System). Because of the active conversation among a large number of people on the BBS, topics spread out like chain reactions and include ingenious ideas. However, the opinions written on the BBS amount to enormous text data, and it is necessary to help goods planners extract consumers' requirements. We propose a support system for requirement extraction from the BBS. In the proposed system, opinions on a BBS are classified by topic and arranged in a 3-dimensional space that consists of a time axis and a topic plane. The topic plane presents the relevance among topics in 2 dimensions; the time axis shows the progression of the conversation. With this system, planners can easily see how topics change through conversation. Furthermore, the process of classifying opinions involves a problem: opinions written in colloquial language contain word errors and cannot be accurately classified from word information alone. Therefore, a method for correcting the classification is proposed, which uses features of the conversation structure on the BBS. Comparing the topics classified by this correcting method with manual classification shows the appropriateness of the method. To avoid a complex graphical user interface, we adopt a fish-eye view with 3 hierarchies: in the top hierarchy it is applied to balls that show topic groups; in the second, to the sizes of cards that show opinions; in the third, to the letters in the cards. To realize the third hierarchy, we propose a method for deciding which letters should be shown on a card, based on the structure of the conversation on the BBS. To evaluate the proposed system, requirements extracted from a BBS with this system and with a usual Web browser are compared. This examination shows that the proposed system can support the extraction of ingenious ideas.


On some optimisation problems for imprecisely defined objective function
Olgierd Hryniewicz
Systems Research Institute, Warsaw, Poland

Keywords: fuzzy objective function, possibility measures, fuzzy preferences

In many practical optimisation problems the objective function has parameters that are imprecisely defined. Moreover, there exist additional preference requirements, defined over the set of possible solutions, that can be included neither in the objective function nor in the constraints. In the paper we consider a simple optimisation problem

$$\max f(x;\, p_1, p_2, \ldots, p_k), \qquad x \in X_i,\ i = 1, \ldots, m \tag{1}$$

where $(p_1, p_2, \ldots, p_k)$ is the set of parameters and $X_i$, $i = 1, \ldots, m$, are the sets of admissible values of $x$. Imprecise information about the values of $(p_1, p_2, \ldots, p_k)$ is presented in the form of fuzzy sets; thus, the objective function becomes fuzzy. In the proposed optimisation procedure we first find a certain reference value

$$f(x^*;\, p_1, p_2, \ldots, p_k) \tag{2}$$

where $x^*$ maximises $f(x;\, p_1, p_2, \ldots, p_k)$ over the set of all admissible values of $x$ and over the set of all possible values of $(p_1, p_2, \ldots, p_k)$. Then, we find the set of those values of $x$ for which the value of the objective function cannot be distinguished (using some possibility measures) from the reference value. In the final step we look for the most preferred value of $x$ belonging to the set defined in the previous step, arriving at the most preferred solution of the initial optimisation problem.
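A toy instance of the three-step procedure, for a finite decision set and a single triangular fuzzy parameter, might look as follows; the objective, the tolerance and the threshold are illustrative assumptions, not the paper's formulation.

```python
# Step 1: optimistic reference value over all x and all possible p.
# Step 2: keep the x whose fuzzy objective value cannot be distinguished from
#         the reference (possibility of reaching it, up to a tolerance).
# Step 3: pick the most preferred x in that set.
a, m, b = 2.0, 3.0, 4.0                   # triangular fuzzy parameter p
X = [0.5, 1.0, 1.5, 2.0, 2.5]             # admissible decisions
pref = {0.5: 1, 1.0: 4, 1.5: 5, 2.0: 3, 2.5: 2}   # extra preferences over X

def f(x, p):
    return p * x - x ** 2                 # objective; increasing in p for x > 0

ref = max(f(x, p) for x in X for p in (a, m, b))   # vertices suffice here

def possibility(x, target):
    """Possibility that f(x; p) reaches `target` under the fuzzy p."""
    if f(x, b) < target:
        return 0.0                        # unreachable even at the largest p
    p_needed = (target + x ** 2) / x      # smallest p with f(x, p) = target
    return 1.0 if p_needed <= m else (b - p_needed) / (b - m)

indist = [x for x in X if possibility(x, ref - 0.5) >= 0.15]
print(indist, max(indist, key=pref.get))  # -> [1.5, 2.0] 1.5
```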


Decision making for groundwater protection and remediation planning
Stefan Kaden (WASY Institute for Water Resources Planning and Systems Research Ltd., Berlin, Germany), Manfred Grauer (University of Siegen, Siegen, Germany)

Keywords: Groundwater modeling, groundwater remediation, decision support, distributed optimization

Groundwater is an important resource for water supply and at the same time an important component of our aquatic environment. Many countries are entirely dependent on groundwater resources. The threat to the sustainable use and management of groundwater is immense: it is frequently at risk from depletion and pollution. Groundwater systems, as hidden and "slow" components of the hydrological cycle, are usually characterized by large time gaps between the causes and consequences of system changes. This is especially true for negative impacts and for remediation measures. Consequently there is a need for long-term, preventive monitoring, groundwater protection and remediation. In many areas there is a lack of information on the position and potential effect of dominant pollution sources within the catchment area of waterworks. Many of the effects are first noticed in the future, when large quantities of water are already contaminated and remediation measures have become extremely cost-ineffective and cumbersome. Therefore it is highly important to gain information on the potential risk from different contaminated sites as early as possible, in a structured and comprehensive way, and to forecast potential risk. In the case of contaminated sites with environmental risks, strategies and technologies for remediation have to be developed and implemented. The range of such solutions is wide, depending on local conditions. In the case of long-lasting contaminations, remediation or active groundwater protection will in general be long-lasting as well. Consequently cost-effective (optimal) solutions are required. For risk assessment, groundwater protection, remediation planning etc., numerical groundwater modeling is state of the art. In the last decade groundwater modeling has changed from an expert tool, applied mainly by specialists to special problems, to a widely distributed instrument in groundwater hydrology, water management and environmental protection. This development was driven by advances in computer systems and modeling techniques, as well as by new challenges in practical applications. The most obvious example of new achievements is the 3D modeling of coupled groundwater flow, mass and, in part, heat transport processes. At the same time new tools for pre- and post-processing have been developed and implemented in order to handle large, complex tasks. Consequently, 3D groundwater modeling has become almost common practice. One of the most advanced simulation systems is FEFLOW.¹ In planning processes, simulation models are used in the framework of scenario analysis in a kind of "trial-and-error" mode. In the case of complex problems such an approach is time consuming and does in general not guarantee cost-effective optimal technical solutions. Consequently the application of mathematical optimization methods would be desirable. In the past such methods have had little practical relevance in groundwater modeling because of the extremely high computing requirements of real-world complex problems. Jointly with the University of Siegen (see e.g. Grauer et al., 2000) a new optimization technology for groundwater problems has been developed, coupling the groundwater modeling system FEFLOW with the OpTix optimization system via the FEFLOW interface manager IFM and employing a distributed solution concept. The actual state of this technology will be briefly introduced. The major focus of the paper is a case study concerning an industrial area in former East Germany (Brandenburg). Before German reunification, the chemical industry here caused extensive pollution of the soil and partial pollution of the groundwater. Relevant chemical parameters are heavy metals, ammonia and organic pollutants (DMA). Close to the industrial area a waterworks (groundwater) for public water supply is located, and the water protection zone of the waterworks comes into contact with the industrial area. In order to reduce the risk of pollution for the waterworks, a hydraulic barrier (extraction wells) has been established between the industrial area and the waterworks. The hydrogeological conditions are relatively complicated. In the first stage of the study a coupled 3D groundwater flow and mass transport model was developed and applied for risk analysis and for the design of the groundwater monitoring system. The objective of the second stage was the optimization of groundwater protection and remediation strategies; for that purpose the coupling of the groundwater model FEFLOW with OpTix introduced above is used. Practical results and experiences will be presented.

¹ FEFLOW is a registered trademark of WASY Ltd., see http://www.wasy.de/english/produkte/feflow/

References
[1] Grauer, M., Barth, Th., Kaden, S. and Michels, I. (2000): A scalable algorithm for distributed solution of simulation-based optimization in groundwater management. Paper presented at the Konan-IIASA Joint Workshop, 6-8 September, Laxenburg, Austria.


Cost-benefit-risk analysis of innovation project - a case study
Lech Kruś
Systems Research Institute, Polish Academy of Sciences, 01-446 Warsaw, Poland

Keywords: modeling, decision support, innovations, risk, financial analysis

The paper deals with problems of financial analysis of risky innovation projects. Each project requires resources concentrated in an investment time period. The resulting investment cost is compared to the benefits obtained in a given time horizon after successful accomplishment of the project. The time of project accomplishment is considered an important control parameter in the analysis: a project finished in a relatively short time can give a high financial return, but there is also a high risk that it will fail. The innovative projects are analyzed and evaluated with respect to the expenditures required, the expected financial benefits, and the estimated risk. Some numerical algorithms and an experimental system for quantitative estimation of the above parameters are proposed. The proposed ideas have been tested on an example of a real research project realized at the waterworks in the city of Rzeszow (Poland). The project resulted in the elaboration and implementation of new (first in Poland) software supporting control of a municipal water supply system. The considered project consists of several stages, such as: preparation of a waterworks net map in a GIS system, construction of a hydraulic model of the net, monitoring of the real system, model calibration, elaboration of software for optimization and control, system implementation, and others. The structure of the project takes the form of a network programming graph, and all the stages have to be finished with success to accomplish the overall project. It is assumed that each stage takes the form of some number of trials (tests); each trial takes a basic period of time and is characterized by a perceived probability of success. Using the Bernoulli scheme, the probability that the stage will be finished successfully after a given time can be calculated. An algorithm is proposed for calculating the probability of success of the overall project for a given project structure and an assumed project accomplishment time. Using data referring to the discussed project and some experts' estimations, several quantities such as investment cost, rate of financial return, and some measures of risk are calculated in relation to the time of project accomplishment. A decision support system is considered, aiding cost-benefit analysis including risk measures.
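The stage and project success probabilities described above reduce to a few lines of arithmetic; the per-trial probabilities and time budgets below are invented for illustration, not the Rzeszow project's estimates.

```python
# Bernoulli scheme per stage: a stage succeeds if at least one of its trials
# (one basic period each) succeeds; the project succeeds only if all stages do.

def p_stage_done(p_trial: float, n_periods: int) -> float:
    """P(at least one success in n_periods independent trials)."""
    return 1.0 - (1.0 - p_trial) ** n_periods

stages = [("GIS net map", 0.6, 4), ("hydraulic model", 0.5, 5),
          ("monitoring", 0.7, 3), ("model calibration", 0.4, 6),
          ("control software", 0.5, 5), ("implementation", 0.8, 2)]

p_project = 1.0
for name, p, n in stages:
    p_stage = p_stage_done(p, n)
    p_project *= p_stage                 # all stages in series must succeed
    print(f"{name:18s} P(done within {n} periods) = {p_stage:.3f}")
print(f"P(project success) = {p_project:.3f}")
```

Shortening the allotted periods lowers each stage's success probability and hence the project's, which is exactly the cost-benefit-risk trade-off against a faster financial return.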

A Multi-Objective Framework for Evaluation of Simulated Water Management Alternatives. Daniel Loucks Cornell University, USA

Keywords: ecology, Everglades, Great Lakes, hydrology, modeling, restoration, social issues

A number of major water resource ecosystem restoration projects are currently underway in the United States. One, and arguably the longest and most expensive, is focused on the Everglades Region, a unique ecosystem in southern Florida. Another is focused on the Lake Ontario and St. Lawrence River Basin, shared by Canada, the US, and a number of native tribes. The Everglades restoration project is estimated to cost some $8 billion over a period of about 50 years. It is a multi-agency effort in a State of considerable interest to the current administration in Washington. The question is how better to manage the hydrology in the region so as to restore the unique ecosystem while at the same time providing reliable water supplies and flood protection to communities located along the Southeastern Florida coast. The Lake Ontario and St. Lawrence River restoration project is in the first of its 5 years. It is funded by both the Canadian and US governments through the International Joint Commission, which oversees the management and use of international waters along the Canadian-US border. It is motivated by the increasing recognition of the damage caused to the environment and ecology of the Basin if those aspects are ignored in a quest to satisfy all other economic interests. These interests include recreational boating, navigation, hydropower production, and the prevention of shoreline erosion. These two projects challenge not only the scientists attempting to understand the physical and biological processes affecting the unique hydrology and ecology, but also the planners and decision makers dealing with the social dynamics of the people living in the area. The outcomes of these ecosystem restoration efforts will largely be determined by the land use decisions and social activities of these people over the next several decades. They may also be influenced by political decisions made far outside the region, such as in Ottawa and Washington, and by factors such as climate change (e.g., sea level rise). In this complex physical and social environment, scientists from private and public agencies are working together with all concerned stakeholders to plan and manage these two restoration projects. Models are being developed and used to estimate the various impacts that may result from any plan or management policy. As expected, there exist conflicts among the various stakeholders. If there was ever a challenge for those involved in building models and associated decision support systems for impact prediction, and for communicating information to multiple stakeholders having quite different interests and concerns, all in an effort to obtain some consensus or shared vision of what should be done and why, these projects provide one. Successful ecosystem restoration will depend upon an integrated approach that recognizes and understands the interrelationships and interactions between healthy sustainable natural ecosystems and the social and economic systems that impact them. These systems are dynamic, uncertain and unpredictable. As the natural system restoration activities increase in the near future, planners and managers must take into account the biophysical as well as the social and political systems that operate in the basins. The human communities, with their multiple cultures, beliefs, attitudes, institutions, economies, land uses and histories, are as complex and interdependent as the ecological or hydrological systems. A scientific framework that collects, analyzes, disseminates, and integrates cultural and socio-economic data with ecological and hydrological modeling is needed to provide the basis for future research, funding and eventual policy decisions. This paper will discuss the approach adopted in both studies: involving all stakeholders in the process of identifying and then evaluating the multiple performance indicators associated with alternative water management policies and practices.

Eco-Efficiency Analysis of an Economy Mikulas Luptacik Vienna University of Economics and Business Administration, Department of Economics, A-1090 Wien, Austria

Keywords: macroeconomic production function, data envelopment analysis, eco-efficiency, Pareto-Koopmans efficiency, goal programming

In the paper the efficiency of an economy is defined by a linear program, based on the input-output model with make and use tables, that maximizes the level of domestic final demand (keeping its proportions fixed) for given amounts of the primary factors, labour and capital. For the notion of eco-efficiency, this concept is extended by taking into account the pollutants or undesirable outputs produced in the economy. The degree by which a net output vector, for given stocks of capital and labour and for given environmental standards, could be expanded is a measure of the eco-efficiency of an economy. In order to obtain Pareto-Koopmans efficiency, a so-called slack-based measure of eco-efficiency is proposed and illustrated by a numerical example.
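A stylized version of the underlying linear program (a sketch only, ignoring the make/use-table detail; the notation is chosen here for illustration and is not taken from the paper) is

maximize   λ
subject to (I − A)x ≥ λf,   l'x ≤ L,   k'x ≤ K,   p'x ≤ P,   x ≥ 0,

where f is the observed final-demand vector (fixing its proportions), A the matrix of input coefficients, l and k the labour and capital coefficients with endowments L and K, and p the pollutant coefficients bounded by the environmental standards P. A maximal λ greater than 1 indicates that final demand could still be expanded, i.e. the economy is not eco-efficient.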

Advanced Modeling Support: A Draft Requirement Analysis Marek Makowski International Institute for Applied Systems Analysis, A-2361 Laxenburg, Austria Keywords: modeling paradigms, decision support systems, object-oriented programming, robustness, multi-criteria model analysis, model management, distributed systems.

1. Background A growing number of organizations in industry, public administration, business consultancy and knowledge institutes need and use various modeling techniques. Many commonly used models can be classified as Algebraic Models (AM), i.e. sets of relations (such as equations or inequalities) between quantitative inputs (decisions) and outputs (performance indices) that measure the consequences of implementing the decisions. Such models are used for model-based decision support, which makes it possible to find better solutions to complex problems than those that could be found without model-based problem analysis. The relations between the basic concepts pertinent to using models for decision-making support, namely decision variables, external decisions, outcome variables, and a mathematical model, are illustrated in Figure 1.

Figure 1: An algebraic model y = F(x, z) used for model-based decision-making support represents relations between decisions (inputs) x, external decisions (inputs not controlled by the user) z, and measures of consequences (outcomes) y.
AMs are used whenever rational decisions require various analyses of a large amount of data and logical relations, in a wide range of application domains in industry, policy-making, science, research and education, where analyses cannot (or should not) be based only on the experience or intuition of a decision maker or his/her advisors. While each problem requires a specific model, there are many common analytical features of AMs, and many commonly applicable methods and tools for model analysis, which justify the development of methods and tools that can be shared and reused across a wide range of applications. The modeling needs of AMs are supported by general-purpose modeling environments such as GAMS, AMPL, AIMMS, and object-oriented modeling systems (e.g. ASCEND). Expertise and tools have also been developed with a focus on various modeling paradigms, either specific to a preferred type of analysis
(e.g. optimization- or simulation-based) or specialized for a type of problem, e.g. APS (Advanced Planning and Scheduling Systems). These tools have been developed over the years and will continue to be developed and used for applications that can be adequately supported by the corresponding modeling paradigm. However, the rapidly growing complexity of decision problems (caused by globalization and increasing competitiveness) results in a growing demand for comprehensive analyses of systems. Models can potentially provide better solutions for such problems, but this potential cannot be met by incremental improvements in modeling methods and tools.

2. Modeling paradigms
Various methods and tools, often referred to as modeling paradigms, are applied to different but mutually related modeling activities, which are elements of the modeling cycle composed of:
• model specification (i.e. a symbolic definition of the underlying AM),
• data collection,
• model and data verification,
• model analysis,
• model documentation,
• model maintenance.
For the sake of the following we combine the above activities into two groups:
• model development (composed of all modeling activities but model analysis),
• model analysis.
From the user's point of view the model development activities are of a technical nature, as long as the user can trust that the relations between input and output variables (outlined in Fig. 1) are adequately represented and all pertinent methods of model analysis are supported. However, the model development activities for any complex model require a huge amount of resources, and therefore improving the efficiency of the model development process is of critical importance. Model analysis paradigms can be grouped into the following sets of approaches:
• Simulation-based, where decision variables are inputs and goals are outcomes. This technique is therefore good for exploring the intuition of a DM, not only for verification of the model, but also for providing a DM with information about the consequences – typically represented by values of goals and constraints – of applying certain decisions. One can also consider simulation as an alternative-focused method of analysis that is oriented towards examining given alternatives.
• Optimization-based, which can be considered a goal-oriented (value-focused) approach directed towards creating alternatives. Optimization is driven by formulating a single objective in single-criterion optimization, or several objectives in multi-criteria optimization, and looking for values of decision variables that optimize the value of the specified objective(s). Therefore, goals are the driving force and the values of decision variables are the outcomes.
Traditional approaches to model analysis have been based either on simulation or on classical formulations of single-criterion optimization. A summary of these approaches and their limitations is helpful for understanding the advantages of modern model analysis methods, which extend and combine them into multi-criteria model analysis paradigms including several useful techniques such as:
• soft simulation,
• soft constraints (one common realization is sketched at the end of this section).
Unfortunately, the use of various model analysis techniques on a particular model is hardly possible, because modeling technology is now at the stage where data processing technology was before the development of DBMS. Data processing was revolutionized by the transition from file processing to DBMS. The data management revolution occurred in response to severe problems with data reusability associated with file-processing approaches to application development. The need to share data resources resulted in the development of DBMS, which separated the data
from the applications that used the data. Advances in database technology have been propelled in the past decades primarily by the development, refinement, and eventual implementation of the relational data model. DBMSs make it possible to efficiently share not only databases but also tools and services for data analysis that are developed and supplied by various providers and made available on computer networks. The challenge of creating advanced modeling support is to achieve a major breakthrough in modeling technology that will make it possible to efficiently satisfy the rapidly growing need for advanced modeling support that exploits the shareable knowledge contained in data, models and various tools for model development, analysis and maintenance.
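As an aside on the soft-constraint technique mentioned above, one common realization (a sketch only; the formulation is chosen here for illustration and is not taken from the paper) replaces a hard constraint g(x) ≤ b by a penalized violation:

minimize   f(x) + β · max{0, g(x) − b},

so that b acts as an aspiration level that may be violated at a price controlled by the coefficient β > 0; reference-point methods generalize the same idea to several criteria at once.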

3. Advanced modeling support
Advanced modeling support should have the following basic features:
• Support the full modeling cycle, including specification, generation, validation, analysis, maintenance, and documentation.
• Use only one source of model specification for the whole modeling cycle, i.e. for model generation, maintenance, various types of analysis, for producing human-readable documentation, and for providing solvers with information that allows exploiting the model structure for more efficient problem solving.
• Couple model documentation with version control, enabling automatic generation of human-readable documentation of the history of changes, the data used, various views on the model structure, and the results of various methods of model analysis.
• Facilitate the separation of model data, specification and analysis, which is a prerequisite of good modeling practice.
• Support using a model as part of a system of models, and coupling independently developed models, thus allowing for efficient integrative and multidisciplinary studies and for the re-use of model elements of proven quality. This requires checking not only syntactic correctness but also at least basic semantic consistency.
• Provide access to various tools that support model analysis (solving) based on all relevant advanced modeling paradigms (e.g. simulation, optimization, multicriteria model analysis, soft simulation), thus enabling more complete model analysis than can be provided by any single paradigm.
• Adapt proven DBMS technology, familiar to many analysts, to integrate modeling activities with the management of the data used by models.
• Support efficient collaborative work by providing reliable and secure access to shared resources (models, data bases, software tools, hardware resources) on heterogeneous hardware at distant locations, seamlessly integrated with the information system infrastructure of the organization or even a network of organizations.
In order to provide such features of an advanced modeling environment one needs a unifying structured model representation that provides the necessary functionality. The existing formats of model representation focus on specific modeling paradigms. The example of the almost 50 years old MPS standard for the representation of LP problems clearly shows the advantages of an agreed representation of a problem: there are dozens of tools (usable on heterogeneous hardware) for the analysis of problems specified in the MPS standard. Such problems come from many areas of application, and therefore the many various tools for the analysis of LP problems can be widely reused. However, the MPS standard is restricted to a specific type of model and to the single-criterion optimization-based approach to model analysis. Similar limitations apply to the other commonly used formats for model specification used by various modeling environments. Therefore a new common model representation is needed to provide all the information necessary for the features of a modeling environment outlined above. Such a structured unifying model representation can be designed and built using the object-oriented approach, as a collection of classes representing the needed types of objects.
The inheritance principle makes it possible to efficiently treat particular types of objects while handling common tasks in base classes. To illustrate the concept, let us consider one type of the needed objects, namely the Variable. The basic type of such an object should contain the following attributes: name (a short name used in the generated model), description (used for the documentation), lower and upper bounds, units, and various needed classifications of the variable (e.g. its type from the user's point of view: decision, outcome, auxiliary, etc.; and its type from the mathematical programming point of view: continuous, binary, linear, non-linear, etc.). Vectors of objects (variables or their aggregates) can be used for handling aggregates of variables (e.g. representing trajectories or various types of collections). The base class will be equipped with functions providing functionality common to all variables, while inherited classes will contain the functions needed for each specialized type. The definition of the relations between variables can be implemented in a similar way. Such an implementation will assume a symbolic definition of the relations, with appropriate links to the data stored in data bases. Hence, a model definition will be composed of two interlinked parts:
• Model specification, which defines the attributes of variables and uses algebraic formulas for defining the relations between variables.
• Data (stored in data bases) needed for defining the parameters used in the model specification.
Such an approach will not only make it possible to meet the requirements listed above, but will also allow using proven DBMS tools both for the model specification and for the data used for model instantiation.
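To make the Variable example concrete, the following sketch (in Python, with names invented here for illustration; the paper does not prescribe a language or an API) shows a base class carrying the common attributes listed above and a specialized subclass obtained by inheritance:

class Variable:
    """Base class: attributes and functionality common to all model variables."""
    def __init__(self, name, description, lower=None, upper=None, units=None,
                 role="auxiliary", math_type="continuous"):
        self.name = name                  # short name used in the generated model
        self.description = description    # used for the documentation
        self.lower, self.upper = lower, upper   # lower and upper bounds
        self.units = units
        self.role = role                  # user view: decision / outcome / auxiliary
        self.math_type = math_type        # MP view: continuous / binary / linear / ...

    def document(self):
        # functionality common to all variables: one human-readable line
        return "%s [%s]: %s" % (self.name, self.units, self.description)

class BinaryVariable(Variable):
    """Inherited class: only the specialized parts are defined here."""
    def __init__(self, name, description, **kwargs):
        super().__init__(name, description, lower=0, upper=1,
                         math_type="binary", **kwargs)

# A vector of objects can represent an aggregate such as a trajectory:
trajectory = [Variable("x_%d" % t, "state in period %d" % t, units="GW")
              for t in range(10)]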

4. Concluding remarks This paper outlines a requirement analysis for an advanced modeling environment and discusses a possible approach to the design and implementation of a key element of such an environment, namely a unifying common model representation. The author will appreciate comments on the completeness of the presented list of desired features of an advanced modeling environment and on the outlined idea of a structured model representation.

Acknowledgment Several ideas presented here have resulted from many discussions and joint activities of the author with A. Beulens, A. Geoffrion, J. Granat, H. Scholten, H-J. Sebastian and A.P. Wierzbicki.

Mobile DSS: A Medical Teletriage System for PalmPilot
Wojtek Michalowski, University of Ottawa
Steven Rubin, Children's Hospital of Eastern Ontario
Roman Słowiński, Poznań University of Technology
Szymon Wilk, Poznań University of Technology

Keywords: triage, medical decision making, wireless communication, PDA devices

Using data collected at the emergency room of the Children's Hospital of Eastern Ontario and applying a variety of methodologies, we have identified the most relevant clinical attributes and signs for the triage of children with abdominal pain. This information was used to develop a clinical algorithm for evaluating abdominal pain patients presenting in the emergency room of a teaching hospital. After consulting with medical professionals, the algorithm was used to develop a teletriage system implemented on a PDA device such as the PalmPilot. The prototype system was designed according to the principles of client-server architecture. The client module runs on PDA devices with PalmOS 3.5 (or later), such as the 3Com family of Palm PDAs, and the server is implemented on PCs under Windows NT/2000. Communication between the PDA devices and the personal computer is performed using a wireless infrared (IrDA) port or a cradle adapter. The PDA client is responsible for the tasks described below.
Collecting patient data: The client is used for entering data about examined patients. The data is stored in a local database.
Supporting triage decisions: The client uses the clinical algorithm (generated on the server side) to make triage decisions using the patient information stored in the local database.
Synchronizing data with other clients: The client sends data from its local database to other clients using the wireless IrDA port. It is also capable of receiving data from other clients and storing them in the local database.
Transferring data to the server: The client is able to transfer the contents of its local database to the server using the cradle adapter.
The PC server is responsible for the following tasks.
Managing and synchronizing a centralized database: The server manages a centralized database containing all the patient records collected and sent by the PDA clients.
Periodic analysis of the data stored in the centralized database: The server periodically analyzes the data stored in the centralized database so that the clinical algorithm can be further refined.
Sending the updated clinical algorithm to the PDA clients: The server updates the clinical algorithm residing on the PDA clients through the HotSync function of a PDA.
A teletriage process supported by the system starts with entering patient data (including a unique PIN) at the moment of the patient's admission to the emergency room. The system uses
information available at that particular stage of patient management to suggest a triage. It may also be used for entering any additional information about the patient's condition as deemed necessary by the attending health care professional. Once care of a patient is transferred to another member of the medical staff, all information gathered so far (including the triage recommendation) can be beamed to another PDA via the wireless IrDA port. As new information about a patient becomes available, the system can be consulted again and will use the most current information to re-evaluate the latest triage decision. At the end of the process, all pertinent patient records can be transferred to the PC server, thus either creating or updating the patient's record in the centralized database. There are several advantages associated with using portable computing devices such as the PalmPilot as a platform for the system's implementation. The PDA is a mobile device that can be carried by a user and thus does not restrict the location or time at which the teletriage system can be used. Moreover, it "moves" with the nurse or physician, allowing patient data to be collected and advice to be provided at the bedside. Currently we are designing a series of pilot tests of the teletriage system in teaching hospitals in Ottawa and Hamilton (Ontario), in Calgary (Alberta), and in the community health center in the Ottawa-Carleton region in Ontario.

Knowledge Management for Complex Systems Modeling Yoshiteru Nakamori Japan Advanced Institute of Science and Technology, Tatsunokuchi, Japan Yoshikazu Sawaragi Japan Institute of Systems Research, Kyoto, Japan

Keywords: Systems approach, systems integration, systems modeling, knowledge management and creation.

Most systems thinkers today would agree with the methodological pluralism or complementarism discussed in the fields of sociology and organizational studies. Pluralism means that it is necessary to develop and employ a wide range of heterogeneous methodological devices in order to investigate and deal with complexities. Complementarism suggests that these devices can communicate with and support each other, as some methods are good at tackling hard problems while others are good at soft issues, given that human problems are conditioned and constituted by both hard problems and soft issues. Knowledge science is a science that develops methodologies and methods with which we can convert subjective ideas into justifiable or, hopefully, reliable ones. In the course of this converting process, the education of people and the refinement of information channels play a crucial role. The methods and ideas of knowledge science are those which guarantee justifiable trans-disciplinary knowledge exchange, utilizing information and communication technologies in addition to face-to-face interactions among people. The most reliable knowledge source is scientific investigation, which produces public knowledge. This is objective, unique, universal, and repeatable. On the other hand, knowledge obtained in social science inevitably includes meanings given by people: wisdom-based knowledge, insight-based knowledge, and experience-based knowledge. These kinds of knowledge are subjective, vague, ambiguous, and circumstantial. We introduce a research activity that aims at developing a knowledge systems methodology for the integration, management and creation of these different types of knowledge. This is research in knowledge science that creates justified true belief, or systemic knowledge of complex systems. The study uses approaches from the social and natural sciences complementarily. The first is the scientific approach that uses physical laws, data analysis, etc. The second is information science, especially large-scale computer simulation and networking technology. The third is a method from social science related to forming partnerships among social members. The fourth is knowledge science, which integrates, transforms, and creates knowledge. Finally, systems science is used to manage these different approaches. In total, this is a support system for all relevant people to create a sustainable society. In this presentation, after introducing some methodological aspects, we will discuss the possibility of agent-based simulation for dealing with complex systems.

Support Vector Machines using Multi-Objective Linear Programming Hirotaka Nakayama and Takeshi Asada Department of Information Science and Systems Engineering, Konan University, Kobe 658, Japan

Keywords: Support Vector Machines, Linear Programming, Multi Objective Programming, Machine Learning

Support Vector Machines (SVMs) are attracting many researchers' interest as a powerful method for pattern recognition. SVMs are usually formulated as Quadratic Programming (QP) problems. Using other distance functions, SVMs can also be formulated as Linear Programming (LP) problems. In general, SVMs tend to overlearn. In order to overcome this difficulty, the notion of a soft margin is introduced. In that case, however, it is difficult to decide the weight for the slack variables reflecting the soft margin. In this paper, the soft margin method is extended to Multi-Objective Linear Programming (MOLP). It will be shown through several examples that SVMs reformulated as MOLP can give good performance in pattern classification.
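For orientation, a generic LP formulation of a soft-margin linear classifier reads (a sketch only, in notation chosen here for illustration; the paper's own formulation may differ):

minimize   ||w||_1 + C Σ_i ξ_i
subject to y_i(w'x_i + b) ≥ 1 − ξ_i,   ξ_i ≥ 0,   i = 1, ..., l,

where the single weight C trades the margin-related term against the slack variables ξ_i reflecting the soft margin. The MOLP extension alluded to above would instead treat these terms as separate objectives, e.g. minimizing the vector (||w||_1, Σ_i ξ_i) in the multi-objective sense, so that no a priori choice of C is required.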

Conditional Center: A New Solution Concept for Location Problems
Wlodzimierz Ogryczak, Warsaw University of Technology, Institute of Control & Computation Eng., Warsaw, Poland
Mariusz Zawadzki, Warsaw University, Institute of Informatics, Warsaw, Poland

Keywords: Location, efficiency, equity, median, center, conditional center

The generic location problem that we consider may be stated as follows. There is given a set I = {1, 2, . . . , m} of m clients (service recipients). Each client is represented by a specific point in the geographical space. There is also given a set Q of location patterns (location decisions). For each client i (i ∈ I) a function fi(x) of the location pattern x has been defined. This function, called the individual objective function, measures the outcome (effect) yi = fi(x) of the location pattern for client i [5]. In the simplest problems an outcome usually expresses the distance. We emphasize, however, that we do not restrict our considerations to the case of outcomes measured as distances. They can be measured (modeled) as travel time or travel costs, as well as in a more subjective way as relative travel costs (e.g., travel costs related to clients' incomes) or ultimately as the levels of clients' dissatisfaction (individual disutility) with locations. In typical formulations of location problems related to desirable facilities, a smaller value of the outcome (distance) means a better effect (higher service quality or client satisfaction). This remains valid for the location of obnoxious facilities if the distances are replaced with their complements to some large number. Therefore, without loss of generality, we can assume that each individual outcome yi is to be minimized. Frequently, one may be interested in introducing into the location model some additional client weights wi > 0 to represent the service demand. Integer weights can be interpreted as numbers of unweighted clients located at exactly the same place (with distances 0 among them). For initial theoretical considerations we will assume that the problem is transformed (disaggregated) to the unweighted one (that is, all the client weights are equal to 1). Note that such a disaggregation is possible for integer as well as rational client weights, but it usually dramatically increases the problem size. Therefore, we consider solution concepts which can be applied directly to the weighted problem. A host of operational models has been developed to deal with facility location optimization. Most classical location studies focus on the minimization of the mean (or total) distance (the median concept) or the minimization of the maximum distance (the center concept) to the service facilities. Both the median and the center solution concepts are well defined for aggregated location models using client weights wi > 0 to represent several clients (service demand) at the same geographical point. Specifically, for the weighted location problem, the median solution concept is defined by the minimization of the objective function expressing the mean (average) outcome, which is equivalent to the minimization of the total outcome. The center solution concept is defined by the minimization of the objective function representing the maximum (worst) outcome, and it is not affected by the client weights at all.

The median solution concept is primarily concerned with spatial efficiency. As it is based on averaging, it often provides solutions where remote and low-population-density areas are discriminated against in terms of accessibility to facilities, as compared with centrally situated and high-population-density areas. For this reason, when locating public services the center solution concept is usually applied to minimize the maximum distance (travel time) between any consumer and the closest facility. As the minimax objective primarily addresses geographical equity issues, this approach is of particular importance in the spatial organization of emergency service systems, such as fire, police, medical ambulance services, civil defense and accident rescue. The center approach is consistent with the Rawlsian [9] theory of justice, especially when additionally specified as the lexicographic center [6]. On the other hand, locating a facility at the center may cause a large increase in the total distance, thus generating a substantial loss in spatial efficiency. This has led to a search for some compromise solution concept. Halpern [1] introduced the λ–cent–dian as a parametric solution concept based on the convex combination of the two objectives representing the minisum and the minimax approaches. Unfortunately, due to the lack of convexity, the λ–cent–dian may fail to provide a compromise location in the case of discrete problems [7]. In this paper we introduce an alternative compromise solution concept, the conditional center. It is a parametric generalization of the center concept taking into account the number of clients (the portion of demand) related to the maximum outcomes (distances). Namely, for a specified number of clients k (or portion of demand β) we take into account the entire group of the k (β portion of) maximum outcomes, and we consider their average as the conditional maximum outcome. We call a conditional center every location pattern which minimizes the corresponding conditional maximum outcome. According to this definition, the concept of the conditional center is based on averaging restricted to the group of the worst outcomes. As the number k decreases to 1 (or β approaches 0), the conditional maximum tends to the standard maximum outcome and the conditional center becomes the standard center. On the other hand, as k approaches the number of all clients m (or β approaches 1), the corresponding conditional center tends to the median. One of the disadvantages of the minimax approach to location problems is that it is too crude: many quite different feasible solutions may be optimal with respect to the minimax criterion. When standard algorithmic tools are used to identify the minimax solution, one of many solutions is selected randomly. As a result, centers are highly unstable. Furthermore, it often turns out that the distribution of spatial units in relation to the location of facilities may make the minimax criterion partially passive. This arises, for instance, when an isolated spatial unit is located at a considerable distance from all the locations of facilities. Minimization of the maximum distance is then reduced to the minimization of the distance of that single isolated spatial unit, leaving other location decisions unoptimized. The concept of the conditional center, due to averaging within the group of the worst outcomes, reduces this flaw of the center approach. In locating public facilities, the issue of equity is becoming important.
Equity is, essentially, an abstract socio-political concept that implies fairness and justice. Nevertheless, equity can be quantified, and the notion of equitable multiple criteria optimization is well defined [3]. Several equitable approaches to location problems have been developed and analyzed [2,5,8]. The center concept is an equitable approach [6]. The concept of the conditional center preserves this property while simultaneously allowing for wider modeling of equitable preferences through the parameter. The paper gives a formal definition of the conditional center solution concept. We show that, similarly to the standard center, the conditional center may be found by solving an optimization problem with a linear objective and a number of auxiliary linear inequalities. Further, we discuss the equitable properties of the conditional center. The conditional maximum outcomes turn out to be closely related to the (absolute) Lorenz curve, which implies the equitable properties of the corresponding solution concept. Finally, we report some results of our initial computational experience with the conditional center concept in comparison to the classical solution concepts. The conditional center is shown to be much more effective in modeling various compromise
location preferences than the classical cent-dian approach [1] (especially in the case of discrete location problems). Minimization of the conditional maximum, similarly to the standard minimax approach, may be modeled with a number of simple linear inequalities. Our limited experiments with a simple general-purpose MIP code show that the conditional center usually needs a computational effort larger than that for the median but smaller than that for the center. Certainly, large-scale real-life location problems will require specialized algorithms. Therefore, research on efficient specialized algorithms for conditional centers of various specific types of location problems should be continued, or rather initiated. This paper has focused on location problems. However, the location decisions are analyzed from the perspective of their effects on individual clients. Therefore, the general concept of the proposed conditional maximum outcome can be used for the optimization of various systems which serve many users. In particular, it offers a new promising approach to equitable resource allocation problems [4]. Moreover, uniform individual objectives may be associated with events rather than physical users, as in many dynamic optimization problems where uniform individual criteria represent the same outcome for various periods.
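To make the linear-inequality modeling concrete, one standard way of minimizing the average of the k worst outcomes (a sketch consistent with the description above, though not necessarily the paper's exact notation) is

minimize   t + (1/k) Σ_{i∈I} di
subject to di ≥ fi(x) − t,   di ≥ 0   for i ∈ I,   x ∈ Q,

where, at an optimum, t settles near the k-th largest outcome and the auxiliary variables di collect the excesses above it. For k = 1 the problem reduces to the classical minimax (center) problem, and for k = m it yields the median.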

References
[1] J. Halpern, Finding minimal center-median convex combination (cent-dian) of a graph, Management Science 24 (1978) 534–544.
[2] M.M. Kostreva and W. Ogryczak, Equitable approaches to location problems, in: Spatial Multicriteria Decision Making and Analysis: A Geographic Information Sciences Approach, ed. J.-C. Thill, Ashgate, Brookfield, 1999, pp. 103–126.
[3] M.M. Kostreva and W. Ogryczak, Linear optimization with multiple equitable criteria, RAIRO Recherche Opérationnelle 33 (1999) 275–297.
[4] H. Luss, On equitable resource allocation problems: a lexicographic minimax approach, Operations Research 47 (1999) 361–378.
[5] M.T. Marsh and D.A. Schilling, Equity measurement in facility location analysis: a review and framework, European Journal of Operational Research 74 (1994) 1–17.
[6] W. Ogryczak, On the lexicographic minimax approach to location problems, European Journal of Operational Research 100 (1997) 566–585.
[7] W. Ogryczak, On cent-dians of general networks, Location Science 5 (1997) 15–28.
[8] W. Ogryczak, Inequality measures and equitable approaches to location problems, European Journal of Operational Research 122 (2000) 374–391.
[9] J. Rawls, The Theory of Justice, Harvard University Press, Cambridge, 1971.

A new methodology for analysis of semiqualitative dynamic models with constraints
Juan A. Ortega, Jesus Torres and Rafael M. Gasca
Department of Informatics, University of Seville, Spain

Keywords: Methodology, Analysis of systems, Semiqualitative approaches

For real systems studied in science and engineering, it is difficult to find mathematical models that represent them in an appropriate way. Modeling techniques necessarily leave out certain aspects of the system. The simulation of these models helps us to study the evolution of the real system. On the other hand, it is not always possible to obtain a mathematical model of a system, and it is then necessary to apply other techniques in order to carry out its study. One possibility is placing data sensors in the real system; the analysis of these data allows us to study the system's evolution. Knowledge about dynamic systems may be quantitative, qualitative, or semiqualitative, and when such models are studied all this knowledge should be taken into account. Different levels of numeric abstraction have been considered: purely qualitative, semiquantitative and quantitative. Different approximations have been developed in the literature for taking qualitative knowledge into account: distributions of probability, transformation of non-linear to piece-wise linear relationships, the Monte Carlo method, fuzzy sets, causal relations, and combinations of all levels of qualitative and quantitative abstraction.

Figure 2: The proposed structure of semiqualitative modeling (real system → modeling → semiqualitative model with constraints → transformation techniques → quantitative models → time-series database → classification, queries and answers → labelled database → learning → system behaviour).

In this paper, we are interested in time-series databases corresponding to the evolution of semiqualitative dynamic systems. Databases theories and tools provide the necessary infrastructure to store, access, and manipulate data. In this paper, a new way to study dynamic systems that evolve in the time is proposed merging data mining, time-series and databases engine. The proposed perspective tries to discover the underlying model in the database by means of a query/classification language. The semiqualitative behaviour of a system is expressed by means of hierarchical rules obtained by means of machine learning algorithms. The completeness property of the proposed methodology is characterized by means of statistical means. A theoretical study about the reliability of the obtained conclusions is carried out. The methodology is applied to a logistics growth model with a delay.

Rough Set Theory – A New Approach to Reason From Imperfect Data
Zdzislaw Pawlak, University of Information Technology and Management, 01-447 Warsaw, Poland

Keywords: Rough Sets, data analysis

Rough set theory can be viewed as a new, non-standard mathematical tool for imperfect data analysis. The theory has proved its usefulness in many domains, such as decision support, engineering, environment, banking, medicine and others. The rough set philosophy is founded on the assumption that with every object of the universe of discourse we associate some information (data, knowledge). Objects characterized by the same information are indiscernible (similar) in view of the available information about them. The indiscernibility relation generated in this way is the mathematical basis of rough set theory. Any set of all indiscernible (similar) objects is called an elementary set and forms a basic granule (atom) of knowledge about the universe. Any union of elementary sets is referred to as a crisp (precise) set; otherwise the set is rough (imprecise, vague). Each rough set has boundary-line cases, i.e., objects which cannot be classified with certainty, by employing the available knowledge, as members of the set or of its complement. Obviously rough sets, in contrast to precise sets, cannot be characterized in terms of information about their elements. With any rough set a pair of precise sets, called the lower and the upper approximation of the rough set, is associated. The lower approximation consists of all objects which surely belong to the set, and the upper approximation contains all objects which possibly belong to the set. The difference between the upper and the lower approximation constitutes the boundary region of the rough set. Approximations are the two basic operations of rough set theory. Rough-set-based data analysis starts from a decision table, which is a data table whose columns are labeled by attributes, whose rows correspond to the objects of interest, and whose entries are attribute values. The attributes of the decision table are divided into two disjoint groups, called condition and decision attributes, respectively. Each row of a decision table induces a decision rule, which specifies a decision (action, result, outcome, etc.) if some conditions are satisfied. If a decision rule uniquely determines a decision in terms of conditions, the decision rule is certain; otherwise the decision rule is uncertain. Decision rules are closely connected with approximations, the basic concepts of rough set theory. Roughly speaking, certain decision rules describe the lower approximation of decisions in terms of conditions, whereas uncertain decision rules refer to the upper approximation of decisions. With every decision rule two conditional probabilities, called the certainty and the coverage coefficient, are associated. The certainty coefficient expresses the conditional probability that an object belongs to the decision class specified by the decision rule, given that it satisfies the conditions of the rule. The coverage coefficient gives the conditional probability of reasons for a given decision. It turns out that the certainty and coverage coefficients satisfy Bayes' theorem. This gives a new look at the interpretation of Bayes' theorem, showing that it can be used for drawing conclusions in a way different from that offered by the classical Bayesian inference philosophy. In the lecture the rudiments of the theory will be outlined, and its basic concepts will be illustrated by a simple tutorial example. Real-life applications require more advanced extensions of the theory, but we will not discuss these extensions in the lecture. Rough set theory has an overlap with many other theories dealing
with imperfect knowledge, e.g., evidence theory, fuzzy sets and Bayesian inference. Nevertheless, the theory can be regarded as an independent, complementary rather than competing, discipline in its own right.
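As a minimal illustration of the lower and upper approximations and of the certainty coefficient (a sketch with a toy decision table invented here, not an example from the lecture):

from collections import defaultdict

def indiscernibility_classes(table, condition_attrs):
    # Group objects into elementary sets: objects with identical values of
    # all condition attributes are indiscernible.
    classes = defaultdict(list)
    for obj in table:
        classes[tuple(obj[a] for a in condition_attrs)].append(obj)
    return classes.values()

def approximations(table, condition_attrs, decision_attr, decision_value):
    lower, upper = [], []
    for elementary in indiscernibility_classes(table, condition_attrs):
        decisions = {o[decision_attr] for o in elementary}
        if decisions == {decision_value}:
            lower.extend(elementary)   # surely belong to the decision class
        if decision_value in decisions:
            upper.extend(elementary)   # possibly belong to the decision class
    return lower, upper

# Toy decision table: condition attributes 'temp', 'cough'; decision 'flu'.
table = [
    {"temp": "high", "cough": "yes", "flu": "yes"},
    {"temp": "high", "cough": "yes", "flu": "no"},   # boundary-line case
    {"temp": "low",  "cough": "no",  "flu": "no"},
]
lower, upper = approximations(table, ["temp", "cough"], "flu", "yes")

# Certainty coefficient of the rule (temp=high, cough=yes) -> flu=yes:
matching = [o for o in table if o["temp"] == "high" and o["cough"] == "yes"]
certainty = sum(o["flu"] == "yes" for o in matching) / len(matching)  # 0.5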

How to support GMP in model-based DSS Huub Scholten and Sjoukje A. Osinga Wageningen University, The Netherlands

Keywords: Good Modelling Practice, Modelling and Simulation, Knowledge Base, Simulation Model Calibration

1. Introduction
Over more than 30 years, modelling and simulation has changed from an experimental, academic instrument into a frequently used tool in DSS. In the first decades of this period, quality assurance was treated merely in a theoretical way, without direct support for modellers and other stakeholders. In the last couple of years, several national initiatives have aimed to improve the quality of modelling and simulation in general (Scholten and Udink ten Cate 1999), and especially for Model-Based DSS (MBDSS) in water management (Scholten et al. 2000, Van Waveren et al. 2000). The use of models in water management is widespread, and models are imperative standard tools in this area. Models in water management belong to the Modelling & Simulation paradigm. Nowadays these models couple high spatial and process complexity with a predefined accuracy, which requires sound model calibration by means of automatic parameter optimisation. Three of these national projects will act as knowledge sources for a European project, HarmoniQuA (Harmonising Quality Assurance in model based catchment and river basin management), funded by the European Commission. These three national projects will first be discussed briefly; the remainder of this paper will sketch the outlines of HarmoniQuA.
In The Netherlands, a Good Modelling Practice (GMP) working group has started a project for the development of a GMP Handbook. The objective of this project was to stimulate the proper manner of dealing with models. The Dutch Department of Public Works, STOWA and DLO Staring Centrum financed the project within the AQUEST program. The project was executed by Wageningen Agricultural University, NITG-TNO and DLO Staring Centrum under the management of WL | Delft Hydraulics, and was supervised by a broad group of representatives of water managers, universities, scientific institutes and engineering offices. The Dutch GMP project started with an inventory, in which the relevant literature was consulted and the experiences of the organisations involved were charted. This formed the basis for the first draft of the 'Good Modelling Practice' Handbook. Next, both inexperienced and experienced modellers at various water management institutions verified the usability of the draft Handbook. The test period was concluded with a workshop at which the testers' experiences were discussed. The final version of the Handbook (Van Waveren et al. 2000) is based on the findings of the test period. The Handbook is primarily intended to support the modeller. It deals with all major steps in the modelling process and is therefore very suitable for use as a checklist. Recording the procedures of the checklist (for instance in the forms appended to the Handbook) creates a model journal, which renders the model study reproducible and transferable, facilitating
the involvement of other parties. In this sense, the Handbook is explicitly not intended as a compulsory straitjacket for the modeller, but rather as a technical tool (Scholten et al. 2000).
In Denmark, a process has been initiated towards the establishment of formal guidelines for good practice in groundwater modelling. The activities have comprised the writing of a comprehensive handbook as course material and the preparation of the first material for a guidance document. The partners actively involved in this process are research organisations (Geological Survey of Denmark and Greenland; DHI Water & Environment) on the one side and the Environmental Protection Agency and the regional water authorities (counties) on the other. Furthermore, a first draft version of the guidelines was sent for comments to universities and consulting firms as well. This process will be guided in the same direction as the HarmoniQuA project during the course of the project. The results of this initiative will be published in a book.
In the United Kingdom, informal discussions on model quality assurance topics have taken place between members of government science organisations (CEH Wallingford, formerly the Institute of Hydrology), representatives of the Environment Agency, representatives of the Ministry of Agriculture, Fisheries and Food, water industry personnel, consultants and academics. This involvement will be moved to a more formal basis in the course of the project and widened to include further stakeholder experience on the part of model users, professional societies and model producers in universities and the commercial sector.

2. Supporting GMP
Three major groups of stakeholders can be distinguished, namely (water) managers, model developers and model users, each with their own responsibilities and their own tasks in modelling and simulation in general, and specifically in Good Modelling Practice. Any attempt to support these stakeholders in their tasks is based on the following three activities:
1. say what you will do,
2. do what you promised,
3. prove that you did what you promised.
In HarmoniQuA, we try to support all stakeholders in each of these activities, with emphasis on the manager/problem owner and on the model user. The overall goal of HarmoniQuA is to improve the quality of model-based river basin management and to enhance the confidence of all stakeholders in the use of models, within single domains as well as in integrated assessments. An important precondition for achieving this goal is to develop a methodology and tools that facilitate an active dialogue between water managers and model users in such a way that they are guided through a proper quality assurance process. HarmoniQuA has four specific objectives:
1. To develop a generic, scientifically based methodology and a set of guidelines for the modelling process. The development of a generic methodology will be achieved on the basis of collected existing methodologies and guidelines (both generic and domain specific) and their subsequent analysis, harmonisation, integration and improvement. Subsequently, the generic methodology will be translated into guidelines for seven specific domains: groundwater models, precipitation-runoff models, hydrodynamic models (including sediment and morphology models), flood forecasting models, surface water quality models, biota (ecological) models, and socio-economic models. Widespread acceptance of the methodologies will be achieved through an active dialogue with the professional water management community.
2. To develop a knowledge base containing the methodology and guidelines (of objective 1), and tools using this knowledge base to support modellers and water managers throughout the quality assurance process. This will be achieved along two lines. First, a knowledge base will be designed and filled with the knowledge collected and generated in the activities within the
first objective. Second, tools (stand-alone and plug-in) will be designed and developed that guide the model user and the water manager through the modelling process. These tools provide advice based on the established methodology and guidelines, differentiated by the specific domain(s) and by the type of user/stakeholder relation. There will also be tools for recording/monitoring the modelling process activities in databases for reuse, and tools to reuse experience gained from previous modelling projects within the organisation or from third parties.
3. To test the methodology/guidelines and the support tools in real-life cases. The strengths and weaknesses of the developed functionality will be tested on a large number of real-life test cases covering a range of regimes and management conditions, both for single-domain and multi-domain/integrated models. The tests will be carried out in connection with ongoing modelling studies and will focus on the applicability and consistency as perceived both by the modellers and by the managers involved in the particular studies.
4. To disseminate the results of the project to users in the academic education sector, to water managers and model users, and to other interested stakeholders such as planners, policy-makers and concerned members of the public. This will be achieved through web-based information, written material and a number of open workshops targeted towards different stakeholders.

3. Conclusion and discussion
HarmoniQuA will supply part of a modelling & simulation infrastructure for model based decision support (MBDS) in water management. The project is not a 'stand-alone' initiative, but part of a cluster of projects that are all developing pieces of this infrastructure. The other projects will provide the remaining pieces, i.e. a framework for developing new and coupling existing model modules, support for efficient and effective model set-up, tools to facilitate end-user participation in developing conceptual models, generic database structures for the wide range of water management models, and so on. This cluster of projects will not solve all problems of MBDS, but it will enable MBDS to be applied at a significantly higher level. Other European initiatives will fit well into this infrastructure and extend it to other model solving paradigms. In the future, the HarmoniQuA project will further act as a stepping stone towards international standards for MBDS. Until then, parts of the (Dutch) GMP Handbook will be transformed into the core of national standards for MBDS for water management in the Netherlands.

References
[1] Scholten, H., and A. J. Udink ten Cate. 1999. Quality assessment of the simulation modeling process. Computers and Electronics in Agriculture 22:199-208.
[2] Scholten, H., R. H. Van Waveren, S. Groot, F. Van Geer, H. Woesten, R. D. Koeze, and J. J. Noort. 2000. Good Modelling Practice in water management. In Proceedings HydroInformatics2000. International Association for Hydraulic Research, Cedar Rapids, Iowa, USA.
[3] Van Waveren, R. H., S. Groot, H. Scholten, F. Van Geer, H. Woesten, R. Koeze, and J. Noort. 2000. Good Modelling Practice Handbook, 1.0 edition. STOWA, Utrecht, The Netherlands.


Efficient Sets and Surfaces in Multiple Criteria Optimization, Data Envelopment Analysis, and Portfolio Theory in Finance Ralph E. Steuer Terry College of Business, University of Georgia, Athens, USA

Keywords: Efficient Frontiers, Efficient Surfaces, DEA, Portfolio Theory, Multicriteria Optimization, Genetic Algorithms

The nuances and ramifications of efficient sets, frontiers, and surfaces are integrally involved in the study of multiple objective programming, data envelopment analysis (DEA), and portfolio analysis in finance. While the three areas have emerged from different origins, the usage of efficient sets and surfaces in the respective areas has shown increasing levels of commonality. Stressing the growing similarity in the tools necessary for conducting research on efficient sets in the three areas, the paper begins with a generalization of the traditional mean-variance Markowitz portfolio optimization model that also includes skewness, social responsibility, and the minimization of the number of securities in the portfolio. With this causing the efficient set to turn into a surface, an explanation can be provided, from a multiple criteria perspective, as to why, contrary to theory, market portfolios in practice reside deep below the efficient frontier. If what has traditionally been considered a frontier is actually a surface, perhaps even one with dents and depressions in it, new research questions arise. In response to some of these questions, the resulting efficient surfaces are investigated for size, idiosyncrasies, and distinguishing properties, along with methods for discretely characterizing efficient surfaces using genetic algorithms. Moving on to DEA, the "best practice" frontier is actually an efficient set, and when more than two output measures are employed, this frontier turns into an efficient surface as well. In this case, because the application is different, the size, idiosyncrasies, and distinguishing properties of efficient surfaces in DEA are investigated for comparison with the efficient surfaces occurring in portfolio applications and in multicriteria optimization problems in general.
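To make the generalization concrete, the extended model can be sketched as a multiple objective program. The notation below is ours, not the author's (Σ denotes the covariance matrix of returns, γ(x) the portfolio skewness, ρ a vector of per-security social responsibility scores, and ‖x‖₀ the number of nonzero holdings); it serves only to show how the classical bi-criterion model grows into one whose nondominated set is a surface:

\begin{align*}
\max\ & \mu^{\top}x            && \text{(expected return)}\\
\min\ & x^{\top}\Sigma x       && \text{(variance of return)}\\
\max\ & \gamma(x)              && \text{(skewness of return)}\\
\max\ & \rho^{\top}x           && \text{(social responsibility)}\\
\min\ & \lVert x \rVert_{0}    && \text{(number of securities held)}\\
\text{s.t.}\ & \mathbf{1}^{\top}x = 1, \qquad x \ge 0 .
\end{align*}

With five criteria instead of two, the image of the efficient set in criterion space is generically a surface rather than a frontier, which is what gives rise to the research questions discussed above.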


A Multi-Regional General Equilibrium Model Taking Account of Disaster Risk Hirokazu Tatano Disaster Prevention Research Institute, Kyoto University Yasuaki Shoji JR East Co. Ltd. Norio Okada Disaster Prevention Research Institute, Kyoto University

Keywords: Multi-Regional General Equilibrium Model, Disaster Risk, Welfare Analysis, Non-Linear Complementarity Problem

The paper examines the long-term effects of disaster mitigation investment in infrastructure upon a regional economy. A multi-regional general equilibrium model taking account of disaster risk is formulated to conduct the analysis. The model is formulated as a non-linear complementarity problem and is solved by a projection method. A natural disaster brings catastrophic economic losses to the regional economy it affects. In the short run, individuals and firms cannot change their locations; in the long run, they can. Disaster mitigation of infrastructure not only reduces the damage caused by a disaster but also attracts more individuals and firms to safer regions. The effects of disaster mitigation upon the regional economy are analyzed in the long run, and it is shown that disaster mitigation investment does not always improve the welfare of households in the affected regions.
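As a sketch of the solution machinery (the abstract does not specify the equilibrium mapping, so the formulation below is the generic one): a non-linear complementarity problem asks for a vector z, collecting equilibrium prices and activity levels, such that

\[
z \ge 0, \qquad F(z) \ge 0, \qquad z^{\top}F(z) = 0,
\]

and a basic projection method iterates

\[
z^{k+1} = \Pi_{\mathbb{R}^{n}_{+}}\!\bigl(z^{k} - \tau F(z^{k})\bigr) = \max\bigl(0,\; z^{k} - \tau F(z^{k})\bigr),
\]

where τ > 0 is a step size, the maximum is taken componentwise, and Π denotes Euclidean projection onto the nonnegative orthant.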


A greedy heuristic for carousel problems Jaap Wessels EURANDOM, Eindhoven, The Netherlands Nelly Litvak EURANDOM, Eindhoven, The Netherlands Ivo Adan Eindhoven University of Technology, Eindhoven, The Netherlands Henk Zijm University of Twente, Enschede, The Netherlands Keywords: carousel system, heuristics, performance statistics, performance bounds

A carousel is a computer-controlled warehousing system, widely used to store small and medium sized goods with slow-mover characteristics. One of the most important performance indicators of such systems is the pick time of an order consisting of several order lines. The order pick time depends mostly on the travel time of the carousel. In this paper we consider some reasonable heuristics for order picking. In particular, we establish properties of the Nearest Item (NI) heuristic, which is frequently used in practice. We derive tight upper bounds for the travel time under the NI heuristic. The keys to this result are (1) the recurrent character of the NI heuristic and (2) the fact that the NI heuristic is always faster than the simpler Shorter Direction heuristic. We also derive closed form expressions for the mean, variance and distribution of the travel time under the NI heuristic, given uniformly distributed positions of the items. The stochastic analysis is based on the property that the spacings of the pick positions are distributed as normalized exponentials. A simple two-moment approximation for the distribution of the travel time is also presented. The material of this presentation may be found in [1], while some theoretical extensions based on the theory of spacings may be found in [2] and [3].
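A minimal Monte Carlo sketch of the NI heuristic may help fix ideas. It is our illustration, not the authors' code: it assumes a carousel of unit circumference with pick positions drawn uniformly, and it estimates by simulation the mean and variance that the paper derives in closed form:

```python
import random

def ni_travel_time(positions, start=0.0):
    """Total carousel rotation (unit circumference) needed to pick all items
    under the Nearest Item heuristic: always rotate, in either direction,
    to the closest remaining pick position."""
    remaining = list(positions)
    here = start
    total = 0.0
    while remaining:
        def dist(p):
            # circular distance between p and the current position
            d = abs(p - here) % 1.0
            return min(d, 1.0 - d)
        nxt = min(remaining, key=dist)
        total += dist(nxt)
        here = nxt
        remaining.remove(nxt)
    return total

def simulate(n_items=10, n_runs=20000, seed=1):
    """Monte Carlo estimate of the mean and variance of the NI travel time
    for n_items pick positions drawn uniformly on the carousel."""
    rng = random.Random(seed)
    samples = [ni_travel_time([rng.random() for _ in range(n_items)])
               for _ in range(n_runs)]
    mean = sum(samples) / n_runs
    var = sum((s - mean) ** 2 for s in samples) / (n_runs - 1)
    return mean, var

if __name__ == "__main__":
    m, v = simulate()
    print(f"NI heuristic, 10 uniform items: mean ~ {m:.4f}, variance ~ {v:.4f}")
```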

References
[1] N. Litvak, I. Adan, J. Wessels, H. Zijm: Order picking in carousel systems under the nearest item heuristic. Probability in the Engineering and Informational Sciences (to appear in the Spring of 2001).
[2] N. Litvak, I. Adan: The travel time in carousel systems under the nearest item heuristics. Journal of Applied Probability (to appear in Vol. 38, No. 1).
[3] N. Litvak: Some peculiarities of exponential random variables (submitted; now available as a EURANDOM Technical Report).


Information and knowledge society, the role of intuition in decision making and a rational theory of intuition Andrzej P. Wierzbicki National Institute of Telecommunications, 04-894 Warsaw, Poland

Keywords: Information society, knowledge society, decision making, intuition

1. Introduction. Information society and knowledge economy. Possible definitions of knowledge. The increasing role of decision support. Knowledge and intuition. The necessity of better understanding intuition. Intuitive decisions: everyday and strategic. Analytical decision support versus unsupported intuition. High level decisions: why politicians do not like decision support. Examples of the effectiveness of intuitive high level decisions. The Japanese tradition of decision making. Philosophical aspects of intuition: Bergson, hermeneutics.
2. Rationality in economics versus rationality in philosophy. The impact of economic rationality on decision theory. Various schools of criticism. General and soft systems theory versus "Mind over Machine". The change in the way of decision making with increasing levels of expertise. Rationality in the philosophy of science. Historicism, changing paradigms, falsificationism, evolutionary epistemology - versus relativism. The need for a rational theory of intuition.
3. Elements of a rational theory: a thought experiment. Hemispheric asymmetry of the brain. Cognitive science, inner representation, psychology of cognition. The necessity of clarifying basic concepts: the role of thought experiments. A selected thought experiment: when did we start to think logically and analytically? The role of language in describing and observing the world. Two essential methods of thinking: verbal and pictorial. The role of fuzzy logic and the value of fuzzy definitions.
4. A rational theory of intuition. Conscious, subconscious and semiconscious thinking and decision making. A rational definition of intuitive decision making. The role of training in achieving good intuitive decisions. Phases of strategic intuitive decision processes. The test of rationality of a theory: what practical conclusions can we derive, and are they testable? Example: how to structure negotiation processes?
5. Practical aspects of intuitive decisions. More about the Japanese tradition. Zen philosophy, the tea ceremony, knowledge management in organisations. How to support one's own intuition? How to combine analytical decision support with intuitive decisions? How to support strategic, high-level intuitive decisions?


Multiple Criteria Decision Making using Generalized Data Envelopment Analysis Yeboon Yun Faculty of Engineering, Kagawa University, Kagawa 761-0396, Japan Hirotaka Nakayama Faculty of Science and Engineering, Konan University, Kobe 658-8501, Japan Masao Arakawa Faculty of Engineering, Kagawa University, Kagawa 761-0396, Japan Hiroshi Ishikawa Faculty of Engineering, Kagawa University, Kagawa 761-0396, Japan

Keywords: Generalized data envelopment analysis, Multiple criteria decision making, Aspiration level method, Optimal design problem

In multi-objective optimization problems there exist a number of Pareto optimal solutions, which are regarded as candidates for the final decision. A central issue is how decision makers choose one solution from the set of Pareto optimal solutions as the final one. Especially when the number of objective functions is greater than three, it is difficult not only to visualize all Pareto optimal solutions in the objective space, but also to select a final solution from among them. In this paper, we suggest an aspiration level approach based on generalized data envelopment analysis (GDEA) combined with genetic algorithms for multiple criteria decision making problems such as engineering design. We show that the proposed method can list several Pareto optimal solutions closest to the aspiration level of the decision makers as candidates for the final solution. Finally, through numerical examples, we demonstrate that the aspiration level method using GDEA is very useful for supporting decision making in complex management systems.
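The listing step can be illustrated with a small sketch. This is our illustration only: the paper's actual ranking is based on the GDEA efficiency measure computed with genetic algorithms, whereas the sketch below substitutes a standard augmented Chebyshev achievement function to rank a given set of Pareto optimal points by their closeness to the aspiration level:

```python
def achievement(f, aspiration, weights, rho=1e-4):
    """Augmented Chebyshev achievement value of objective vector f
    (all objectives to be minimized) relative to an aspiration level;
    smaller values mean f sits closer to, or better than, the aspiration."""
    terms = [w * (fi - ai) for fi, ai, w in zip(f, aspiration, weights)]
    return max(terms) + rho * sum(terms)

def closest_to_aspiration(pareto_points, aspiration, weights, k=3):
    """Rank Pareto optimal points by achievement value and return the k
    best as candidates for the final decision."""
    return sorted(pareto_points,
                  key=lambda f: achievement(f, aspiration, weights))[:k]

# toy example with three objectives, all minimized
pareto = [(1.0, 4.0, 2.0), (2.0, 2.0, 2.5), (3.0, 1.0, 3.0), (1.5, 3.0, 1.5)]
asp = (1.5, 2.5, 2.0)   # the decision maker's aspiration level
print(closest_to_aspiration(pareto, asp, weights=(1.0, 1.0, 1.0), k=2))
```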


List of Participants

Prof. Masao Arakawa Dept. RISE, Kagawa University 2217-20 Hayashicho Takamatsu, Kagawa 761-0396 Japan email: [email protected] URL: http://www.eng.kagawa-u.ac.jp/~arakawa telephone: (81-87)-8642223 fax: (81-87)-864223

Dr Jerzy Bartnicki Norwegian Meteorological Institute Niels Henrik Abels vei 40 N-0313 Oslo Norway email: [email protected] URL: www.dnmi.no telephone: (47-22)-963000.3315 fax: (47-22)-696355

Prof. Adrie J.M. Beulens Applied Computer Sciences Group, Social Sciences Department, Wageningen University Dreijenplein 2 Wageningen Netherlands email: [email protected] URL: www.info.wau.nl telephone: (31-(0)317)-484460 fax: (31-(0)317)-483158

Mr Tadeusz Bewszko Rzeszow University of Technology Faculty of Electrical Engineering and Computer Science Department of Power Electronics and Electrical Engineering Pola 2 35-959 Rzeszow Poland email: [email protected] URL: http://www.zee.prz-rzeszow.pl/~tbewszko/ telephone: (48-17)-852 44 07 fax: (48-17)-854 20 88

Ms Svetlana Churkina Department of Operations Research Faculty of Applied Mathematics and Cybernetics Moscow State University Vorobiovy Gory 117234 Moscow Russia email: [email protected] telephone: (7-095)-3303610 fax: (7-095)-9247464

Prof. Joao Climaco Faculdade de Economia Universidade de Coimbra Av. Dias da Silva 165 3000 Coimbra Portugal email: [email protected] telephone: (351-239)-790500.595 fax: (351-239)-824692

Dr Tatiana Ermolieva International Institute for Applied Systems Analysis Schlossplatz 1 A-2361 Laxenburg Austria email: [email protected] URL: www.iiasa.ac.at telephone: (43-2236)-807.581 fax: (43-2236)-71.313

Dr Motohisa Funabashi Systems Development Laboratory, Hitachi, Ltd. 1099 Ohzenji, Asao-ku, Kawasaki 215-0013 Japan email: [email protected] telephone: (81-44)-959-0215 fax: (81-44)-966-4673

Dr Robert Genser IFAC-Beirat Austria Malborghetgasse 27-29,6/6 1100 Vienna Austria email: [email protected] telephone: (+43-1)-6074187 fax: (+43-1)-6074187


Dr Janusz Granat National Institute of Telecommunications Szachowa 1 04-894 Warsaw Poland email: [email protected] telephone: (48-22)-6607640 fax: (48-22)-8253719

Prof. Manfred Grauer University of Siegen Institute of Information Systems Hoelderlinstr. 3 57068 Siegen Germany email: [email protected] URL: http://www-winfo.uni-siegen.de telephone: (49-271)-740.3269 fax: (49-271)-740.2372

Dr Jifa Gu School of Knowledge Science Japan Advanced Institute of Science and Technology 1-1, Tatsunokuchi, Ishikawa, 923-1292 Japan email: [email protected] telephone: (81-761)-511725

Dr Kiyotada Hayashi National Agricultural Research Organization Natl. Agr. Res. Ctr. (Tohoku) 4 Akahira, Shimo-Kuriyagawa Morioka, Iwate 020-0198 Japan email: [email protected] telephone: (81-19)-643-3491 fax: (81-19)-641-7794

Dr Ayako Hiramatsu Osaka Sangyo University 3-1-1 Nakagaito Daito Osaka 574-8530 Japan email: [email protected] telephone: (81-72)-8753001.7633 fax: (81-72)-8701401

Prof. Olgierd Hryniewicz Systems Research Institute Newelska 6 PL-01-447 Warszawa Poland email: [email protected] URL: www.ibspan.waw.pl/~hryniewi/ telephone: (48-22)-8364414 fax: (48-22)-8372772

Dr Stefan Kaden WASY Ltd. Waltersdorfer Strasse 105 D 12526 Berlin Germany email: [email protected] URL: www.wasy.de telephone: (49-30)-6799980 fax: (49-39)-67999899

Dr Lech Krus Systems Research Institute, Polish Academy of Sciences Newelska 6, 01-447 Warsaw, Poland email: [email protected]

Prof. Daniel (Pete) Loucks Civil and Environmental Engrg. Cornell University 311 Hollister Hall Ithaca, New York 14853 USA email: [email protected] telephone: (1-607)-255 4896 fax: (1-607)-255 9004

Prof. Mikulas Luptacik Department of Quantitative Economics Vienna University of Economics and Business Administration Augasse 2-6 1090 Vienna Austria email: [email protected] URL: www.wu-wien.ac.at/wwwu/institute/vw6/tafel.html telephone: (+43-1)-31336.4543 fax: (+43-1)-31336.755

Dr Marek Makowski IIASA Schlossplatz 1 A-2361 Laxenburg Austria email: [email protected] URL: www.iiasa.ac.at/~marek telephone: (43-2236)-807.561 fax: (43-2236)-71.313

Prof. Wojtek Michalowski Faculty of Administration University of Ottawa 136 Jean-Jacques Lussier St. Ottawa, Ont. K1N 6N5 Canada email: [email protected] URL: http://www.admin.uottawa.ca/wojtek/ telephone: (1-613)-562-5800.4955 fax: (1-613)-562-5164

Prof. Yoshiteru Nakamori School of Knowledge Science JAIST 1-1 Asahidai, Tatsunokuchi Ishikawa, 923-1292 Japan email: [email protected] URL: http://www.jaist.ac.jp/~nakamori telephone: (81-761)-511755 fax: (81-761)-511149

Prof. Hirotaka Nakayama Konan University Department of Information Science and Systems Engineering 8-9-1 Okamoto, Higashinada Kobe 658-8501 Japan email: [email protected] telephone: (81-78)-435.2534 fax: (81-78)-435.2540

Prof. Wlodzimierz Ogryczak Warsaw University of Technology Institute of Control & Computation Eng. Nowowiejska 15/19 00-665 Warsaw Poland email: [email protected] telephone: (48-22)-660 7862 fax: (48-22)-825 3719

Dr Juan A. Ortega Departamento de Lenguajes y Sistemas Informaticos University of Seville Avda. Reina Mercedes s/n 41012 - Seville Spain email: [email protected] URL: http://www.lsi.us.es/~ortega/ telephone: (34-95)-4552773

Prof. Zdzislaw Pawlak University of Information Technology and Management Newelska 6 01-447 Warsaw Poland email: [email protected] telephone: (48-22)-8345659 fax: (48-22)-8251635

Dr Mina Ryoke School of Knowledge Science Japan Advanced Institute of Science and Technology 1-1 Asahidai, Tatsunokuchi Ishikawa 923-1292 Japan email: [email protected] URL: http://www.jaist.ac.jp/~ryoke telephone: (+81-761)-51-1757 fax: (+81-761)-51-1149

Mr Huub Scholten Wageningen University Applied Computer Science Group Building 313 Dreijenplein 2 6703 HB Wageningen Netherlands email: [email protected] URL: http://www.info.wau.nl/people/huub_scholten/huub.htm telephone: (+31-317)-484631 fax: (+31-317)-483158

Dr Ralph E. Steuer Department of Banking & Finance Terry College of Business University of Georgia Athens, Georgia 30602-6253 USA email: [email protected] URL: www.terry.uga.edu/~rsteuer/bio-1.htm telephone: (706)-5423782 fax: (706)-5429434

Dr Hirokazu Tatano Disaster Prevention Research Institute Kyoto University Goka-sho Uji, 611-0011 Japan email: [email protected] telephone: (81-774)-384308 fax: (81-774)-384044


Dr Jesus Torres Departamento de Lenguajes y Sistemas Informaticos University of Seville Avda. Reina Mercedes s/n 41012 - Seville Spain email: [email protected] URL: http://www.lsi.us.es/~jtorres/ telephone: (34-95)-4552769

Prof. Jaap Wessels EURANDOM P.O. Box 513 NL 5600 MB Eindhoven Netherlands email: [email protected] URL: eurandom.tue.nl telephone: (31-40)-2478110

Prof. Andrzej P. Wierzbicki National Institute of Telecommunications Szachowa 1 04-894 Warsaw Poland email: [email protected] URL: www.itl.waw.pl telephone: (48-22)-5128448 fax: (48-22)-512.8726

Dr Yeboon Yun Department of Reliability-based Information System Engineering Faculty of Engineering Kagawa University 2217-20 Hayashicho, Takamatsu, Kagawa 761-0396 Japan email: [email protected] telephone: (81-87)-864.2246 fax: (81-87)-864.2246
