Requirements Engineering for Complex Collaborative Systems

Alistair Sutcliffe
Centre for HCI Design, Department of Computation, UMIST
e-mail: [email protected]

Abstract. A method for analysing requirements for complex socio-technical systems is described. The method builds on the i* family of models by explicitly modelling communication between agents as discourse act types. System (i*) models and use cases are developed that describe the dependencies between human and computer agents in terms of a set of discourse acts characterising the obligations on agents to respond and act. For human-computer communication, the discourse acts indicate functional requirements to support operators; for human agents, the acts specify their obligation to act and constraints on action. The method provides analytic techniques and heuristics to assess agents' workloads in terms of the tasks and communication they have to perform. Scenarios are run against the system model by walking through the chain of operator tasks and communication links to produce time estimates and failure probabilities where the demands of scenarios impose excessive loads on human operators. The method is illustrated with a case study of a naval command and control system.

Keywords: requirements analysis, CSCW, communication, discourse acts

1. Introduction

Requirements engineering has paid little attention to the problem of analysing requirements in large-scale systems involving people and technology (i.e. socio-technical systems), even though such systems frequently fail [1]. Ethnographic techniques have been applied to gather data on social issues, and requirements do emerge from this process [2]. However, little generalisable knowledge, and few models or analytic methods, can be gleaned from ethnography, so the quality of requirements analysis depends on the practitioner's experience. The Inquiry Cycle [3] uses scenarios to investigate barriers to effective use (called obstacles) that may arise in the social domain, while stakeholder analysis methods [2], [4] advise modelling requirements according to different user categories or viewpoints. More formal models in the i* family do provide semantics for describing agents, goals and tasks and the dependencies between them, and support reasoning about socio-technical system implications such as the allocation of responsibilities to agents [5]. However, i* does not address inter-agent communication in the sense of spoken discourse. Modelling how communication is integrated with action is a key concern for understanding the requirements for collaborative computing systems. This paper explores the problem of dependency analysis in socio-technical systems by proposing a method for modelling and analysing event flows between users and the intended system in order to derive high-level requirements. We extend the i* method by addressing workflow problems via a coupling analysis derived from concepts in organisational theory [6] and linguistics [7]. The paper is organised in four sections. The next section introduces the method, with a more detailed description of the inter-agent coupling analysis. A case study of a command and control system illustrates the method. The paper concludes with a brief discussion.

2. Method description

The coupling analysis method compares scenarios with requirements specifications and models, focusing on events or information flows between agents in the system and its environment. Scenarios contain descriptions of events that the system has to respond to, with environmental details gathered from real-life examples. The method is iterative, so once the system model has been created, all the other stages proceed concurrently.

Stage 1. Use case and system modelling

Use cases specify interactions between agents following standard OO procedures (see [8], [9]) to map out the sequential dependencies of event flows between user agents and the system. System models use an extended version of the i* notation [10] to represent agents, tasks, goals, non-functional requirements (soft goals), resources and the dependency relationships between them. The coupling analysis extensions add further relationships to model authority, responsibility and trust.

Stage 2. Communication and task analysis

Communication analysis: Event flows between agents are classified using discourse act types (see table 1), with an associated weighting that expresses the dependency imposed on the receiver (i.e. the dependee in i*) in terms of responding to the communication. The pattern of message flows inbound/outbound to each agent is investigated to determine the agent's ability to respond to the demands of others.

Table 1. Discourse acts for analysing dependencies in human-human communication.

Discourse act from sender        | Wt | Implication for receiver
Inform (give information)        | 1  | Acknowledge, process or discard
Report: optional                 | 1  | Respond/process only if interesting
Report: mandatory                | 2  | Acknowledge and process
Request: information             | 4  | Retrieve information, reply (inform)
Check: fact or belief            | 3  | Confirm or correct fact being checked
Confirm: fact or belief          | 2  | Continue (response to a check)
Correct: fact or belief          | 3  | Reassess beliefs
Agree with argument or proposal  | 2  | Continue with proposal/suggestion
Disagree with argument/proposal  | 4  | Reassess proposal, change position
Propose action                   | 4  | Reaction required (agree/disagree)
Propose idea                     | 4  | Response required (agree, augment, disagree)
Warn                             | 5  | Reaction necessary
Command: simple order            | 6  | Recipient must obey
Command + constraints            | 8  | Obey and follow instructions
Command + multiple constraints   | 10 | Obey following complex instructions

The weightings represent the dependency between the originator of the message and the receiver in terms of the implications for action and decision making by the receiver, following concepts in Clark's theory of common ground [7] and discourse action theories [11]. Weightings are specialised for different domains from the generic defaults given in table 1.
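To make the bookkeeping concrete, the weighting scheme can be applied mechanically to a log of message flows. The sketch below is illustrative only: the act names, the message format and the `coupling` helper are our own, with the weights taken from the table 1 defaults.

```python
# Sketch of the discourse-act weighting scheme (table 1) applied to message
# flows between agents. Names and message format are illustrative, not part
# of the published method.

DISCOURSE_WEIGHTS = {
    "inform": 1, "report_optional": 1, "report_mandatory": 2,
    "request_information": 4, "check": 3, "confirm": 2, "correct": 3,
    "agree": 2, "disagree": 4, "propose_action": 4, "propose_idea": 4,
    "warn": 5, "command": 6, "command_constraints": 8,
    "command_multiple_constraints": 10,
}

def coupling(messages):
    """Sum weighted inbound and outbound coupling per agent.

    messages: iterable of (sender, receiver, act) triples.
    Returns {agent: {"in": total, "out": total}}.
    """
    totals = {}
    for sender, receiver, act in messages:
        w = DISCOURSE_WEIGHTS[act]
        totals.setdefault(sender, {"in": 0, "out": 0})["out"] += w
        totals.setdefault(receiver, {"in": 0, "out": 0})["in"] += w
    return totals

flows = [("TPS", "PWO", "report_mandatory"),
         ("PWO", "EWD", "command"),
         ("PWO", "WDB", "command_constraints"),
         ("CMD", "PWO", "command")]
print(coupling(flows)["PWO"])  # {'in': 8, 'out': 14}
```

Agents whose inbound or outbound totals stand out from the rest are the candidates for the high-coupling review described below.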

The discourse acts differ in the degree of force or obligation they impose on the receiver. Requests, checks, informs and reports generally impose only the obligation to provide or process information; disagreements, corrections and warnings necessitate reasoning; proposals require the receiver to consider the proposition and then agree or disagree. Commands have the highest degree of force because they necessitate both reasoning and action; furthermore the receiver has less choice, and hence both agents have a higher mutual dependency. These dependencies are reflected in the weightings. The discourse acts are aggregated into patterns that describe a goal-related exchange between two agents. The command pattern for strict hierarchies is illustrated in figure 1. The command should be followed to the letter, and the only responses expected are acknowledgement of the command and reporting back on completion; the more loosely coupled command pattern, in contrast, allows the subordinate more freedom of action and implies more trust in their abilities.

[Figure 1 shows two command patterns as discourse-act sequences. (a) Strict hierarchy: Command; Acknowledge; carry out actions; Report back. (b) Decentralised, command by objectives: Command by objectives; Inform: success criteria; Check (progress); Acknowledge/clarify; carry out actions (observed/monitored); Report back: goals achieved.]
Figure 1. Command patterns: (a) strict hierarchy, and (b) decentralised.

Command patterns not only carry different forces and levels of trust in the subordinate but also impose constraints on action. The recipient may be told to carry out an action in some detail, or may be given a higher-level goal with considerable freedom to interpret the plan of action on their own. Command couples are rated for the dependency they impose on the recipient agent (optional, mandatory, commands with constraints) and the restriction on freedom of action imposed on the recipient. Where high levels of command coupling are apparent, these indicate areas of inflexibility and possible system failure. Examples are one commander having too many subordinates, or one subordinate with several commanders (high fan-out or fan-in command structures). The acts and their patterns are used to evaluate dependencies between human agents, and to locate areas where communication may fail or system responses may not meet the demands imposed by a particular scenario.
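The command patterns of figure 1 can likewise be treated as reusable sequences of discourse acts whose summed weights compare the dependency each variant imposes. A hedged sketch, in which the act encodings and the acknowledge/report-back weights are our own illustrative choices rather than published values:

```python
from collections import defaultdict

# Illustrative encoding of figure 1's command patterns as discourse-act
# sequences. Table 1 supplies the command/check weights; the acknowledge and
# report-back weights are assumptions made for this sketch.
WEIGHTS = {"command": 6, "command_by_objectives": 8, "inform": 1,
           "check": 3, "acknowledge": 1, "report_back": 2}

STRICT = ["command", "acknowledge", "report_back"]            # fig. 1a
LOOSE = ["command_by_objectives", "inform", "check",
         "acknowledge", "report_back"]                        # fig. 1b

def pattern_load(pattern):
    """Total dependency weight a command pattern imposes on the exchange."""
    return sum(WEIGHTS[act] for act in pattern)

def command_fan_out(messages):
    """Distinct subordinates each commander addresses with command acts;
    high fan-out flags a potentially inflexible command structure."""
    subordinates = defaultdict(set)
    for sender, receiver, act in messages:
        if act.startswith("command"):
            subordinates[sender].add(receiver)
    return {s: len(r) for s, r in subordinates.items()}

print(pattern_load(STRICT), pattern_load(LOOSE))   # 9 15
print(command_fan_out([("PWO", "EWD", "command"),
                       ("PWO", "WDB", "command"),
                       ("PWO", "WDV", "command")]))
```

The loosely coupled pattern carries more communication weight per exchange but distributes reasoning to the subordinate; the fan-out count is one way to mechanise the "too many subordinates" heuristic.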

Requirements implications: Classification of human inputs and environmental events indicates functional requirements to process information; in contrast, computer output imposes obligations on users to take decisions (information outputs), act in response to messages (alerts, warnings) or obey explicit instructions (commands). The discourse acts are specialised for assessing human-computer interaction and indicating functional requirements for the system, as illustrated in table 2. If intelligent software is required, discourse acts for proposing ideas, agreeing/disagreeing, etc. are added.

Table 2. Mappings from discourse acts to generic functional requirements.

Human/environment-to-computer   | Generic functional requirements
Routine event                   | Validate input, record or ignore
Significant event               | Validate, record, interpret, process, respond
Discretionary input             | Set parameters, update profiles
Mandatory input                 | Validate input, ensure capture, log input, halt process if no response
Hazardous event                 | Record, interpret, notify user, decision support, respond
Request for info.               | Check request, information search/retrieval
Command                         | Execute system function indicated by the command, display results
Command + constrained actions   | Interpret command parameters, execute process, display results
Command + multiple constraints  | Interpret command parameters, send commands to other computers, monitor action, report back

Computer-to-human               | Implications for human users/stakeholders
Discretionary information       | Discretionary use, no dependencies
Decision-related information    | Agent needs information to take decisions
Essential info.                 | Necessary for task or user action
Report                          | Task complete, display status; note message
Alert                           | Message that requires attention, and possibly action
Proposal                        | Assess proposal, agree/disagree; can be ignored
Warning                         | Interpret hazard and respond to warning
Request                         | Response necessary: data input
Command                         | Agent must carry out an action
Command + constrained actions   | Agent's way of working is controlled by the system; must respond

Table 3. Task complexity weightings (Wt) for typical command and control tasks.

Task                | Wt | Notes
Monitor             | 1  | Detect significant events, tracking objects
Interpret           | 3  | Attach meaning to event, identify objects
Analyse             | 5  | Analysis of situation
Plan (action/task)  | 7  | Strategy, tactics or operations, resulting in a command
Decide              | 4  | Take decision and take action; may be more complex with more options
Direct (individual) | 4  | Prepare instructions for a command
Operate             | 2  | Take action with equipment
Liaise              | 6  | Collaborative decision making
Coordinate (team)   | 8  | Involves directing and monitoring
Supervise (team)    | 9  | Planning, monitoring and decision making; depends on size of team

The discourse acts necessary for communication between agents are counted to determine the level of inbound and outbound coupling for each agent. Agents with high coupling indicate potential problems. The motivation for avoiding high coupling is twofold. First, it decreases the flexibility of inter-operator or user-system interaction and decreases autonomy. Secondly, too much control centred on any one agent is likely to increase human error because of the cognitive burden of issuing commands and monitoring progress. When coupling scores are high, commands imposed by the system on the user should be reduced, for instance by reallocating the work so that only the user is responsible. Increasing the autonomy of agents and decomposing the system into sub-systems can also reduce coupling. Closely coupled areas within the system are reviewed either to change the design for more automation (computer autonomy) or to increase human control and design the computer system for an advisory rather than a controlling role.

Task analysis: The task weightings, given in table 3, were derived from a survey of the task and work analysis literature (e.g. Hollnagel 2000). Tasks are assessed independently of communication, although dependencies are taken into account when system models are created; e.g. a command may lead to execution of an automated function (requirements) or a human task (monitor, operate equipment, etc.). The weightings are judged from estimates of complexity and skill, so simple skilled tasks receive low weights whereas complex, knowledge-based tasks that require reasoning receive higher weights. These reflect both the inherent complexity of the task and the level of operator training. Cognitive complexity is higher in conscious problem solving when we are unfamiliar with the domain (i.e. new situations we have not encountered before). Complexity is reduced by training, which creates skilled behaviour [12]. Simple skilled actions (understanding a stop sign, steering a car) have low complexity because we run these tasks as precompiled procedures. Tasks that cannot be completely learned are more complex, involving rule-based event interpretation or decision making (e.g. understanding a road hazard and slowing down). The most complex tasks require knowledge-based reasoning, planning, problem solving and conscious effort (e.g. deciding which route to take, planning how to reach a destination via the shortest path).

The generalised command and control task cycle used to analyse operational scenarios starts with an event being detected (monitor) and interpreted. The event is then analysed in the context of the situation, before the commander decides what action to take. If the situation is complex or unfamiliar, the response will have to be planned before instructions can be formulated to take appropriate action (Direct, which leads to Operate). The front end of the cycle tends to be the responsibility of junior personnel who are trained rigorously, and hence complexity is low. Senior officers tend to be responsible for the middle tasks (analyse, plan and direct), which are more knowledge-intensive and complex. Group working in coordination and supervision adds more loading because reasoning and communication are involved.

Stage 3. Dynamic (scenario) analysis

The system model is run against one or more scenarios using a walkthrough approach. Each inbound event from a human agent or from other sources in the system environment indicates a requirement for a system response. Events are traced through the system model from input to output. The implications for each agent are assessed by counting the tasks they have to carry out and the messages sent or received to process the event. This produces a cognitive workload analysis for each agent in terms of communication and tasks. The scenario is segmented into time intervals, either of equal duration or irregular intervals determined by key events within the scenario. Within each scenario interval the workload of each agent is summed to give:
• Task loading, in terms of task complexity and estimated completion times.
• Communication loading, expressed as the discourse act weightings.
These outputs enable potential failure points in the system model to be pinpointed, where human operators may not be able to carry out their duties because there is insufficient time to perform the task, or because the complexity demands of the task and communications will probably exceed their capabilities. Since many systems may have to respond to more than one event concurrently, scenario walkthroughs are traced to investigate the loading of multithreaded interaction on each agent within set time intervals.
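The per-interval summation can be sketched as a small calculation over phase data. The numbers below are the PWO's phase 1 and phase 3 figures from table 4 (scenario 1); the load-per-second ratio and its use as a stress indicator are our own illustrative additions, since the method compares phases relatively rather than against a fixed threshold.

```python
# Sketch of the Stage 3 walkthrough summation: per phase, sum task complexity
# weights (table 3) and communication weights (table 1) for one agent, and
# compute an illustrative load-per-second ratio to compare phases.

def phase_loads(phases):
    """phases: list of (duration_seconds, task_weights, comm_weights).
    Returns (task_load, comm_load, load_per_second) per phase."""
    results = []
    for duration, tasks, comms in phases:
        task_load, comm_load = sum(tasks), sum(comms)
        results.append((task_load, comm_load,
                        round((task_load + comm_load) / duration, 2)))
    return results

pwo_phases = [
    (410, [5, 7, 4, 4], [3, 5, 6, 4]),  # phase 1 (0-6:50): detect aircraft
    (26,  [5, 7, 8, 5], [3, 5, 6]),     # phase 3 (9:30-9:56): missile homing
]
print(phase_loads(pwo_phases))  # [(20, 18, 0.09), (25, 14, 1.5)]
```

Although the two phases carry similar absolute loads, the ratio is roughly sixteen times higher in phase 3, which is where the walkthrough predicts a stress point.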

3. Case study

The system is an early design for a naval offshore patrol vessel. Preliminary scoping of the requirements is represented in the system model illustrated in figure 2. Dependencies have been analysed from the PWO's viewpoint; for instance, the PWO (Principal Weapons Officer, directly subordinate to the Captain) depends on the TPS (Tactical Picture Compiler) and the APC/SPC (Air and Surface Picture Compilers) to supply interpreted information about threats detected by radar. Other subordinate officers, WDB, WDV and EWD (Weapons Director Blind: missiles; Weapons Director Visual: guns; Electronic Weapons Director), depend on the PWO for instructions, while they are depended on to carry out their allocated duties of engaging or evading hostile units. The command and control event flows commence with an external event being detected by the APC/SPC, who monitor the ship's radars and notify the TPS. The TPS interprets significant events (i.e. friendly unit or potential threat), integrates the input from several operators and passes the information on to the PWO, who is responsible for analysing the situation and planning appropriate action in consultation with the Captain (CMD). Once the PWO has decided how to respond, orders are passed down the line of command to one or more of the Weapons Directors: WDB, who is in charge of missiles; WDV, who commands the ship's guns; and EWD, who controls radar jamming and electronic counter-measures.

3.1. Static analysis

This shows that the PWO has a high task load and coupling factor because of this role's responsibility for several input report streams and many output command streams, as well as the load of liaising with the Captain and the other officers responsible for navigating the ship (Officer on Watch: OOW). The PWO's high task loading is a consequence of the knowledge-based tasks of analysing the tactical situation, planning responses, taking tactical decisions and directing subordinates, while also liaising with the Captain and supervising the whole combat operations room (total complexity loading 35). The WDB also has high coupling because the line of command to the WDV goes through this role. However, these officers are selected for their ability to cope with complex multitasking and stressful decision making. The question for the dynamic analysis is whether too much is being demanded of their abilities.

[Figure 2 shows the i* strategic dependency model: Radar supplies an accurate display to the APC (detected threat), which feeds the TPS (interpreted threat) and then the PWO, who analyses the situation and plans action with CMD (maintain safety) and the OOW (avoid threat), and directs the WDB and WDV (engage hostile) and the EWD (jam hostile).]

Figure 2. Strategic dependency model of the ship's combat sub-system expressed in i* notation.

3.2. Dynamic analysis

Space precludes reporting a complete analysis of every role, so only the PWO role, identified as potentially problematic in the static analysis, is analysed in more depth. Two scenarios were run against the system model. The scenario narratives, with their implications for tasks and dialogue exchanges (illustrated as TPS→PWO), are as follows:

(a) Air-launched missile threat. An enemy aircraft launches a missile, which is detected by the ship's radar. The threat is reported by the APC to the TPS, who assesses the threat in context (i.e. other friendly or hostile units in the vicinity). The TPS reports the incoming missile threat to the PWO. The PWO decides on an initial response of jamming the missile's radar (PWO→EWD), in the hope that it will fail to lock on to the ship, while informing other friendly ships in the area of the threat via the communications yeoman (PWO→CMY). The missile keeps homing (APC→TPS→PWO), so the next decision is to increase speed and take evasive manoeuvres (PWO→OOW) as well as firing decoy chaff (PWO→WDB). Unfortunately the missile still keeps homing (APC→TPS→PWO), so the PWO finally decides to destroy the hostile missile with an air defence missile (PWO→WDB). Luckily this is successful.

The walkthrough data for the PWO role in this scenario is shown in table 4. The scenario is divided into five phases that correspond to key events in the sequence. Most tasks are trained procedures, so their complexity is low; however, situation analysis and planning the response in phase 1 require conscious reasoning, as does deciding what to do in phases 2 and 3; consequently these knowledge-based tasks receive high weightings. This pattern continues in the following phases. The PWO has a high task load for planning and decision making as the situation changes. Moreover, the PWO has to monitor the situation on computer displays at the same time. The communication pattern shows several paths that the PWO has to deal with concurrently, e.g. receiving reports from the TPS/APC and issuing commands to the EWD and WDB, while liaising with the Captain and Officer on Watch. Having to issue and monitor several lines of command as well as dealing with complex tasks already suggests role overloading for the PWO.

(b) Surface terrorist threat. The second scenario commences with an unknown small boat approaching the ship at high speed. The unknown vessel is detected by the surface radar monitored by the Surface Picture Compiler (SPC→TPS→PWO). The PWO decides to alert other friendly units of the situation by radio and requests intelligence information on possible hostile forces in the area. The unknown vessel keeps approaching (SPC→TPS→PWO), so the PWO requests radio contact with the vessel to ascertain its identity and warn it not to come any closer (PWO→Communications Yeoman: CMY). In this phase the PWO also liaises with the Captain and Officer on Watch about changing course to avoid the threat. No response is received from the unknown vessel, which keeps on closing (SPC→TPS→PWO), so after further consultation with the Captain the PWO decides to order warning shots to be fired (PWO→WDB→WDV→guns). This produces no reaction from the unknown vessel (SPC→TPS→PWO), leading to the final phase, when the Captain orders the PWO to destroy the assumed terrorist target (PWO→WDB→WDV→guns).

Table 4. Task and communication loading for the PWO role, scenario 1 (missile attack). The time segments in this scenario are irregular because they were set by the customer (i.e. navy personnel).

Phase 1 (0–6:50). Detect aircraft + EW jam.
  PWO tasks: Analyse tactical situation (5); Plan reaction (7); Direct APC IFF (4); Direct EWD jam target (4).
  PWO communications: TPS-PWO (3); PWO-CMD (5); PWO-EWD (6); TPS-PWO (4).
Phase 2 (9:30). Missile launch + lock on + ECM.
  PWO tasks: Analyse tracks (5); Coordinate tactical communications (8); Decide ECM (4); Direct EWD (4); Liaise OOW: avoiding action (6).
  PWO communications: TPS-PWO (×2) (6); PWO-CMY (6); PWO-EWD (6); PWO-OOW (2+4).
Phase 3 (9:56). Missile homing; decide launch decoys.
  PWO tasks: Analyse tracks (5); Plan reaction (7); Coordinate warfare plan (8); Direct EWD (5).
  PWO communications: TPS-PWO (3); PWO-CMD (5); PWO-EWD (6).
Phase 4 (10:05). Missile homing; decide kill options.
  PWO tasks: Analyse tracks (5); Plan reaction (7); Liaise with CMD: options (5); Direct WDB engage with missiles (4).
  PWO communications: PWO-CMD (3); CMD-PWO (8); PWO-WDB (8); WDB-PWO (3).
Phase 5 (10:31). Hard kill.
  PWO tasks: Monitor progress (1); Liaise with OOW (5); Reports to CMD.
  PWO communications: WDB-PWO (3); PWO-CMD (3).

The second scenario also shows a high cognitive load on the PWO role. The histogram plots depicted in figure 3 indicate the loading spread over time. In the early phases of each scenario the high loads are spread over several minutes, so the overall load may be sustainable; however, in phases 3 and 4 of both scenarios the load hardly decreases yet the time intervals are much shorter. This is where stress points would occur and errors are probable.

[Figure 3 plots task loading and communications loading (task weighting/coupling, 0–30) against time (minutes 5–12), marked by phases 1–5, for scenario 1 (air-launched missile threat) and scenario 2 (terrorist surface threat).]

Figure 3. Task and communications loading on the PWO role for each scenario by phase.

If the two scenarios were to happen concurrently they would impose an intolerable load on the PWO, especially in phases 3 and 4, where this agent would have to analyse, plan and direct responses to two concurrent threats while trying to liaise with other officers. The command coupling with both the WDB and EWD roles, as well as two concurrent analyse-plan-direct task cycles, points towards an increased probability of human error [13], especially considering that decisions have to be made under time pressure. Furthermore, the WDB (Weapons Director Blind) also has a high loading because all the communication between the PWO and WDV is routed through the WDB. If the two scenarios were to happen concurrently the WDB role becomes untenable, because it is acting as a communication router between the PWO and WDV while also having to handle a high load of communication with several other personnel in the missile attack. The lines of communication need to be restructured so the PWO and WDV can communicate directly while liaising with the WDB. The PWO's role also needs to be simplified to offload some of the decision making from one person.
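The concurrency argument can be illustrated by adding per-phase loads across scenarios. The scenario 1 totals below are summed from table 4; the scenario 2 figures are assumptions (figure 3 plots them but they are not tabulated here), so this is a sketch of the reasoning rather than the actual data.

```python
# Hedged sketch: element-wise combination of per-phase (task + communication)
# loads when two scenarios run concurrently against the same agent.

def combined_load(*scenarios):
    """Element-wise sum of per-phase total loads across scenarios."""
    return [sum(phase) for phase in zip(*scenarios)]

scenario1 = [38, 51, 39, 43, 12]   # PWO totals per phase, summed from table 4
scenario2 = [30, 40, 38, 36, 20]   # assumed values for the surface threat
print(combined_load(scenario1, scenario2))  # [68, 91, 77, 79, 32]
```

Any phase whose combined total far exceeds the worst single-scenario phase, here phases 2 to 4, marks where the overloaded role becomes untenable.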

4. Change impact analysis

Coupling analysis also supports reasoning about the possible impact of change in socio-technical systems design. One trend in complex systems engineering is to reduce the complexity of operator tasks by increasing automation. While the impact of automation on task loading is obvious, coupling analysis also highlights the implications of changing communication from human-to-human to human-to-computer. To illustrate this we take a scenario of replacing the APC role with an intelligent identify-friend/foe (IFF) analyser. This will identify all radar contacts, interpret them as possible friend or foe, and notify the TPS of any significant contacts. The Monitor and Interpret tasks are automated, so the TPS now has a human-computer interface. The coupling analysis heuristics indicate that a check-clarify discourse pattern is appropriate unless the human operator can completely trust the computer. Given that the identification of friend or foe is never 100% reliable, complete trust is unlikely. The discourse patterns for the TPS-intelligent radar are given in figure 4a (manual system) and 4b (automated system).

[Figure 4a, original manual system: a radar event (2) is monitored by the APC, who reports an alert (5) to the TPS; the TPS interprets, acknowledges and warns (6) the PWO, who checks (2) with the TPS (confirm), then analyses, plans, directs and commands (8) the EWD to act. Figure 4b, automated system: the intelligent IFF device monitors and interprets radar events and warns (6) the TPS, who interprets its explanation (3), checks (3) and receives confirmation (2), then warns (6) the PWO, who analyses (5), plans (7), directs (4) and commands (8) the EWD to act (7).]

Figure 4. Discourse patterns for the TPS-intelligent radar: (a) original manual system; (b) automated system with intelligent identify friend/foe.
When intelligent technology is introduced, the communication load of the TPS actually increases because of the need to check and clarify the interpretations made by the smart radar device; furthermore, the TPS's task of interpreting input data does not go away. The APC role is eliminated, so the running cost in personnel is reduced; however, the TPS role becomes more complex unless absolute trust is placed in the smart radar. This is undesirable because it increases the dependency of users on the accuracy and reliability of the device. In this manner coupling analysis provides a tool for thought to investigate the implications of automation as a trade-off between automation and, inter alia, functional requirements, non-functional requirements, and the socio-technical dependencies between computers and people.
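The before/after comparison can be quantified by summing the TPS's communication acts in each pattern of figure 4. The act lists below are read off the figure; the acknowledge weight is an illustrative assumption.

```python
# Illustrative comparison of the TPS role's communication load before and
# after automating the APC role (figure 4). Numbers follow the weights shown
# in the figure; acknowledge (1) is an assumed default.

tps_manual = {"acknowledge_report": 1, "warn_pwo": 6, "confirm_check": 2}
tps_automated = {"interpret_explanation": 3, "check_device": 3,
                 "receive_confirm": 2, "warn_pwo": 6}

print(sum(tps_manual.values()), sum(tps_automated.values()))  # 9 14
```

Under these assumptions the load rises from 9 to 14, matching the observation that checking and clarifying the device's interpretations adds communication work rather than removing it.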

5. Lessons learned

The above analysis was carried out by the author in collaboration with domain experts from BAE Systems. The results were presented to BAE Systems personnel, who found that the method provided insights they would not have uncovered by their conventional task analysis approach; hence the method was judged potentially useful. They noted that the implications for role overloading, while intuitively guessable, were made very clear by the coupling analysis, and that the scenario assessment was important for validating overall system effectiveness. They commented that the weightings and task complexity metrics needed to be tuned with data they could provide from experience. Overall, they wanted to progress the study to the next stage: learning to use the method themselves and evaluating it on other projects. This work is now underway in the SIMP project.

6. Discussion

The coupling analysis method has advanced requirements engineering in the key area of complex systems. We acknowledge our debt to i*; however, coupling analysis goes further by providing a scenario analysis of socio-technical system designs. Analysis of task complexity and workload has been practised in the HCI/systems engineering field for many years [14], but explicit consideration of discourse is novel. We believe modelling discourse is important because it can account for a considerable proportion of the cognitive activity in collaborative systems; furthermore, the software requirements implications of supporting human communication are rarely considered in requirements engineering. While some may argue that ethnographic approaches are necessary to capture the subtleties of complex socio-technical interactions [2], we believe that strong model approaches [10], coupled with ethnographic insights, are the answer. Scenario-based analysis in safety-critical systems has been described by Wright et al. [15], who propose a heuristic-based investigation that points to critical design features that are then modelled more formally (e.g. aircraft engine fire control procedures). Whereas their approach addresses the analysis of an existing design in some detail, coupling analysis takes a wider-ranging view of collaborative systems at an early design stage. It extends the concept of obstacle analysis in the Inquiry Cycle [3] by providing a more systematic technique for identifying obstacles in communication and in the functional allocation of tasks to people instead of computers. The discourse analysis also helps to identify, from the users' point of view, potential problems with trust in technology. Coupling analysis fits within the tradition of distributed cognition and is an attempt to formalise the intuitions of such work into a more useful analytic instrument.

There are many areas for future development. First, the weightings and complexity metrics need to be customised to particular domains to improve their predictive power, although we argue that it is not the absolute values but the differences between scenario runs that provide productive insights. Coupling analysis is currently a paper-based, labour-intensive process, so we intend to develop a modelling tool that will perform the static analysis and scenario walkthroughs automatically. The CREWS-SAVRE tool [16], which produces scenario variations, is one starting point. In spite of these limitations, the first version of the method has proved its worth in diagnosing complex socio-technical system designs and demonstrating weak points therein. It has also made a start on the more complex problem of defining requirements for intelligent agents and the impact these may have on socio-technical system performance.

7. References

[1] HMSO. Report of the Inquiry into the London Ambulance Service. London: HMSO, 1993.
[2] Sommerville, I. & Sawyer, P. Requirements engineering: a good practice guide. London: Wiley, 1997.
[3] Potts, C., Takahashi, K. & Anton, A.I. Inquiry-based requirements analysis. IEEE Software 11(2), 1994, pp. 21-32.
[4] Macaulay, L.A. Requirements engineering. Berlin: Springer-Verlag, 1996.
[5] Mylopoulos, J., Chung, L. & Yu, E. From object-oriented to goal-oriented requirements analysis. Communications of the ACM 42(1), 1999, pp. 31-37.
[6] Clegg, C., Axtell, C., Damodaran, L., et al. Information technology: a study of performance and the role of human and organisational factors. Ergonomics 40(9), 1997, pp. 851-871.
[7] Clark, H.H. Using language. Cambridge: Cambridge University Press, 1996.
[8] Jacobson, I., Christerson, M., Jonsson, P., et al. Object-oriented software engineering: a use-case driven approach. Reading MA: Addison Wesley, 1992.
[9] Rational Corporation. UML: Unified Modelling Language method. Available online; accessed 1999.
[10] Yu, E.S.K., Mylopoulos, J. & Lesperance, Y. AI models for business process reengineering. IEEE Expert, August 1996, pp. 16-23.
[11] Winograd, T. & Flores, F. Understanding computers and cognition: a new foundation for design. Reading MA: Addison Wesley, 1986.
[12] Rasmussen, J. Information processing and human-machine interaction: an approach to cognitive engineering. Amsterdam: North Holland, 1986.
[13] Reason, J. Human error. Cambridge: Cambridge University Press, 1990.
[14] Bailey, R.W. Human performance engineering: a guide for system designers. Englewood Cliffs NJ: Prentice Hall, 1982.
[15] Wright, P., Fields, R. & Harrison, M.D. Modelling human-computer interaction as distributed cognition. Department of Computer Science, University of York, UK, http://www.cs.york.ac.uk. Accessed 1999.
[16] Sutcliffe, A.G., Maiden, N.A.M., Minocha, S. & Manuel, D. Supporting scenario-based requirements engineering. IEEE Transactions on Software Engineering 24(12), 1998, pp. 1072-1088.

8. Acknowledgments

The author would like to thank personnel at BAE Systems, Broadoak; in particular, thanks to David Corrall (manager of the EPSRC SIMP project), John Potter, Colin Clark and Leroy Osbourne. Thanks also to Jae-Eun Shin and Andreas Gregoriades for their help in preparing this paper. This work was partially funded by the EPSRC Systems Integration Programme, SIMP project (Systems Integration for Major Projects).
