EDITORIAL BOARD

Editor in Chief
- Dr. Muhammad Imran Babar, Head of Department/Assistant Professor, Department of Computer Science, Army Public College of Management & Sciences, Rawalpindi, Pakistan. [email protected], [email protected], +92-51-8444555 Ext: 138, +92-321-5890896

Co-Editors in Chief
- Dr. Masitah Ghazali, Senior Lecturer, Department of Software Engineering, Faculty of Computing, University Technology Malaysia, Skudai, Johor Bahru, Malaysia. [email protected]
- Dr. Dayang N.A. Jawawi, Associate Professor, Department of Software Engineering, Faculty of Computing, University Technology Malaysia, Skudai, Johor Bahru, Malaysia. [email protected]

Editors
- Dr. Rafa E. Al-Qutaish, Associate Professor, Ecole de Technologie Superieure, Montreal, Quebec, Canada. [email protected]
- Dr. Zeljko Stojanov, Assistant Professor, University of Novi Sad, Serbia. [email protected]
- Dr. Mustafa Bin Man, Associate Professor, School of Informatics and Applied Mathematics, Universiti Malaysia Terengganu, Kuala Terengganu, Malaysia. [email protected]
- Dr. Basit Shahzad, Assistant Professor, King Saud University, Saudi Arabia. [email protected], [email protected]
- Dr. Farukh Zeeshan, Assistant Professor, COMSATS, Lahore, Pakistan. [email protected]
- Dr. Muhammad Siraj, Assistant Professor, College of Engineering, King Saud University, Saudi Arabia. [email protected]
- Dr. Khalid Mehmood Awan, Assistant Professor, COMSATS, Attock, Pakistan. [email protected]
- Dr. Noreddine Gherabi, Assistant Professor, University Hassan 1er, National School of Applied Sciences, Department of Computer Science, Settat, Morocco
- Dr. Abid Mehmood, Assistant Professor, King Faisal University, Al-Hada, Saudi Arabia. [email protected]
- Dr. Nadir Omer Fadi Elssied Hamed, Assistant Professor, University of Khartoum, Sudan.
- Dr. Sheikh Muhammad Jehanzeb, Assistant Professor, Army Public College of Management & Sciences, Rawalpindi, Pakistan. [email protected]
- Dr. Ghufran Ullah Yousafzai, Assistant Professor, Department of Computer Science, City University of Science & Information Technology, Peshawar, Pakistan. [email protected]
- Dr. Sim Hiew Moi, Assistant Professor, Department of Computer Science, Southern University College, Johor Bahru, Malaysia. [email protected]
- Dr. Ashraf Osman, AAU, Khartoum, Sudan. [email protected]
- Dr. Awad Ali Abdur Rehman, Head of Department/Assistant Professor, Department of Computer Science, University of Kassala, Sudan. [email protected]

JOURNAL OF SOFTWARE ENGINEERING & INTELLIGENT SYSTEMS ISSN 2518-8739 31st December 2016, Volume 1, Issue 2, JSEIS, CAOMEI Copyright © 2016 www.jseis.org

Modification of RC4 algorithm to increase its security by using mathematical operations

1 Ali M. Sagheer, 2 Sura M. Searan, 3 Rawan A. Alsharida

1, 2 College of Computer Sciences and Information Technology, University of Anbar, Anbar, Iraq.
3 College of Computer Sciences and Mathematics, University of Tikrit, Tikrit, Iraq.
Email: 1 [email protected], 2 [email protected], 3 [email protected]

ABSTRACT
RC4 is one of the most widely used stream ciphers. It is fast, simple, and suitable for both software and hardware, and is used in many applications. However, it has a weakness in the distribution of its keystream bytes: the first few keystream bytes of the PRGA are biased toward, or related to, some secret key bytes, so analysis of the keystream bytes makes it possible to attack RC4. Moreover, correlations between keystream bytes make the cipher breakable by single-byte and double-byte bias attacks. This work proposes a new algorithm that uses the factorial of the initial state to remove the correlation between publicly known outputs of the internal state, making the algorithm robust against attack; it adds a second state table, of the same length as the state, that holds the factorials of the initial state elements. It also presents a single-byte bias attack on RC4 using a newly designed attack algorithm. The results show that the proposed single-byte bias attack algorithm retrieves the first 32 bytes of the plaintext of standard RC4 with a probability of 100%, while the proposed cipher is robust against this attack. Additionally, the developed algorithm resists other attacks such as the distinguishing attack.
Keywords: RC4; KSA; PRGA; single byte bias; double byte bias; single byte bias attack;

1. INTRODUCTION

Encryption is a process that transforms plaintext into ciphertext in order to hide its meaning and to prevent unauthorized parties from retrieving the plaintext [1]. Cryptographic algorithms are designed to provide small size, high speed of implementation, low complexity, and a large degree of security for resource-constrained devices [2]. The strength of a stream cipher is the random keystream, which guarantees secure computation of the cipher [3]. Cryptanalysis of stream ciphers essentially focuses on identifying non-random behavior [4]. When the key size is small, the cipher must be very efficient and the encryption time very fast; many encryption schemes used in wireless devices are based on symmetric key encryption, such as the RC4 algorithm [5]. RC4 is an effective and very popular stream cipher algorithm. It is used in Oracle, SQL, the Secure Sockets Layer, and the Wired Equivalent Privacy protocol [6]. An attack on this algorithm was presented by Fluhrer, Mantin, and Shamir; RC4 uses a symmetric key and is an important member of the family of encryption algorithms [7]. The algorithm includes two main components to generate the key: the first is the Key Scheduling Algorithm (KSA) and the other is the Pseudo-Random Generation Algorithm (PRGA) [8]. RC4 starts with the identity permutation and uses the secret key, with a variable length from 1 to 256 bytes, to initialize a 256-byte state table [9]. The key was historically limited to 40 bits because of export restrictions, but it is sometimes used with 128 key bits [10]. Symmetric encryption can be classified into stream and block ciphers [11]. RC4 has been analyzed by different people and different weaknesses have been detected [12]. The KSA is the more problematic component, although it was prepared to be simple [13]. At the beginning, a few bytes of the output of the PRGA are biased toward, or related to, some key bytes [14].
There are different types of attack, classified by the amount of information available to the adversary for cryptanalysis based on the available resources [15]. The aim of this work is to remove the correlation between publicly known outputs of the internal state of RC4.

2. LITERATURE REVIEW

Several researchers in information security have analyzed the RC4 algorithm, identified its weaknesses, and suggested different solutions, but the existing ways of calculating the biases were slow, inefficient, and required a huge amount of data. This section reviews the previous studies related to this work. Mantin I. and Shamir A. (2001) showed an essential statistical weakness in the RC4 keystream by analyzing the algorithm. This weakness makes it trivial to distinguish between random strings and short outputs of RC4 by analyzing the second bytes: the second output byte of RC4 has a very strong bias, taking the value 0 with twice the expected likelihood (1/128 instead of 1/256 for n = 8). Their main result is a distinguisher between RC4 and random ciphers that needs only two output words under several hundred unrelated and unknown keys to make a robust decision [1]. Al-Fardan N. J. et al. (2013) measured the security of RC4 in TLS and WPA, analyzed its single-byte and double-byte biases, and attacked it using a plaintext recovery attack based on those biases. Their results show that there are biases in the first 256 bytes of the RC4 keystream that can be exploited by passive attacks to retrieve the plaintext using 2^44 random keys [12]. Hammood M. M. et al. (2015) presented research to enhance RC4 security and speed, proposing several developments of RC4. The first, RRC4 (RC4 with a random initial state), makes RC4 more secure by increasing its randomness. The second, RC4 with two state tables (RC4-2State), increases the randomness of the key sequence and executes faster than RC4. The last, RC4-2State+, produces 4 keys in every cycle to improve the randomness of the key sequence. The output sequences of all suggested algorithms provide more randomness [13].

3. RC4 CONCEPT

Many stream cipher algorithms are based on the use of Linear Feedback Shift Registers (LFSRs), particularly in hardware, but the design of the RC4 algorithm avoids the use of LFSRs [16]. The algorithm consists of two main components to generate the key: the first is the Key Scheduling Algorithm (KSA) and the second is the Pseudo-Random Generation Algorithm (PRGA), which are executed sequentially [17]. The KSA is the more problematic component, although it was prepared to be simple. At the beginning, a few bytes of the output of the PRGA are biased toward, or attached to, some bytes of the secret key; therefore, analyzing these bytes makes it possible to attack RC4 [7]. The internal state of RC4 is a permutation of N bytes derived from a key. The length of the private key is typically between 5 and 32 bytes and is repeated to form the final keystream. The KSA produces the initial permutation of RC4 by scrambling the identity permutation using the key. This permutation (state) is then used as the input to the second step (PRGA), which generates the final keystream [11]. RC4 thus starts with the identity permutation and uses a secret key to produce a random permutation with the KSA; the next stage, the PRGA, generates keystream bytes which are XOR-ed with the plaintext bytes to get the ciphertext [12]. The concept of RC4 is to permute the elements by swapping them to accomplish higher randomness. RC4 uses a variable-length key of 0-255 bytes to initialize the 256 bytes of the initial state array (State[0] to State[255]) [13]. The algorithms below show the KSA and PRGA steps of RC4:

Algorithm 1. KSA
INPUT: Key
OUTPUT: S
1. For (x = 0 to 255)
   1.1 S[x] = x
2. Set y = 0
3. For (x = 0 to 255)
   3.1 y = (y + S[x] + Key[x mod key-length]) mod 256
   3.2 Swap(S[x], S[y])
4. Output: S

The second step is the pseudo-random generation algorithm, which generates the output keystream:

Algorithm 2. PRGA
INPUT: S, Plaintext x
OUTPUT: Key sequence (K sequence)
1. Initialization:
   1.1 x = 0
   1.2 y = 0
2. Generation loop (repeated for each plaintext byte):
   2.1 x = (x + 1) mod N
   2.2 y = (y + S[x]) mod N
   2.3 Swap(S[x], S[y])
   2.4 K sequence = S[(S[x] + S[y]) mod N]
3. Output: K sequence

The output key sequence K is XOR-ed with the plaintext: Cipher x = K x ⊕ Plaintext x [4].
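Algorithms 1 and 2 can be sketched compactly as follows (the paper's implementation is in C#; this is an illustrative Python version of the standard RC4 description, checked against the well-known "Key"/"Plaintext" test vector):

```python
def ksa(key: bytes) -> list:
    """Algorithm 1: build the initial 256-byte state permutation."""
    s = list(range(256))
    y = 0
    for x in range(256):
        y = (y + s[x] + key[x % len(key)]) % 256
        s[x], s[y] = s[y], s[x]
    return s

def prga(s: list, n: int) -> bytes:
    """Algorithm 2: generate n keystream bytes from state s."""
    s = s.copy()
    x = y = 0
    out = bytearray()
    for _ in range(n):
        x = (x + 1) % 256
        y = (y + s[x]) % 256
        s[x], s[y] = s[y], s[x]
        out.append(s[(s[x] + s[y]) % 256])
    return bytes(out)

def rc4_encrypt(key: bytes, plaintext: bytes) -> bytes:
    """XOR the keystream with the plaintext (decryption is identical)."""
    keystream = prga(ksa(key), len(plaintext))
    return bytes(p ^ k for p, k in zip(plaintext, keystream))

# Well-known RC4 test vector: key "Key", plaintext "Plaintext"
print(rc4_encrypt(b"Key", b"Plaintext").hex())  # bbf316e8d940af0ad3
```

Because encryption is a plain XOR with the keystream, applying `rc4_encrypt` twice with the same key returns the original plaintext.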

4. THE WEAKNESS OF RC4

Several weaknesses have been found in the RC4 algorithm. Some of them are minor and can be resolved, but others are dangerous and attackers can exploit them. One of these weaknesses, in the initialization state, is a statistical bias in the distribution of the first output words [7]. The key scheduling algorithm swaps each entry of the S-box exactly once (when the pointer i points to that entry); for low values it is probable that S[j] = j during the initialization [18]. Roos [19] also found an RC4 weakness: a high correlation between the first state table values and the generated keystream values. The essential cause is that the state table begins as the sequence (0, 1, 2, ..., 255), and for at least one out of every 256 possible keys, the first generated key byte is highly correlated with a few key bytes. Thus, these keys allow prediction of the first bytes of the PRGA output. To reduce this problem, it was proposed to discard the first bytes of the PRGA output [19]. The goal of an attack is to retrieve the original key, the internal state, or the output keystream in order to gain access to the original messages. From the previous studies based on the KSA and PRGA, RC4 has weaknesses such as biased bytes, distinguishers, key collisions, and key recovery from the state [20]. The PRGA is reversible in nature, so it is very simple to work back from the state toward the secret key [21]. Mantin and Shamir found the main weakness of RC4 in the second round: the likelihood that the second output byte is zero. Paul and Maitra recovered the private key by using the initial state table, generating equations based on the initial state, selecting some of the secret key bytes by assumption, and completing the key recovery using these equations. Thus, the security of RC4 depends on the security of the private key and the internal states, and various attacks focus on obtaining the private key from the internal states [22].
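The Mantin-Shamir second-byte bias is easy to reproduce empirically. The short experiment below (illustrative Python, not part of the paper) counts how often the second keystream byte is zero over many random 16-byte keys; the frequency comes out near 1/128 ≈ 0.0078 rather than the 1/256 ≈ 0.0039 expected of a truly random generator:

```python
import random

def rc4_keystream(key, n):
    """Plain RC4: KSA followed by n PRGA output bytes."""
    s = list(range(256))
    y = 0
    for x in range(256):
        y = (y + s[x] + key[x % len(key)]) % 256
        s[x], s[y] = s[y], s[x]
    x = y = 0
    out = []
    for _ in range(n):
        x = (x + 1) % 256
        y = (y + s[x]) % 256
        s[x], s[y] = s[y], s[x]
        out.append(s[(s[x] + s[y]) % 256])
    return out

rng = random.Random(1)            # fixed seed so the experiment is repeatable
trials = 30000
zeros = sum(rc4_keystream([rng.randrange(256) for _ in range(16)], 2)[1] == 0
            for _ in range(trials))
print(zeros / trials)             # close to 1/128 ≈ 0.0078, not 1/256 ≈ 0.0039
```

With 30,000 trials the expected zero count under the bias is about 234 versus about 117 for a uniform generator, so the two hypotheses are separated by many standard deviations.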

5. THE MODIFIED RC4 BY USING FACTORIAL (RC4-FACT)

RC4 has weaknesses in the KSA and PRGA that make the algorithm vulnerable. This section presents a new enhancement of the RC4 algorithm that improves it and solves the weak key problem by using two arrays: one of them holds the factorials of the other state's contents and has the same length, reducing the weakness that is exploited by the attacks. The algorithm consists of an initialization step (KSA) and a generation step (PRGA), as shown in Algorithms 3 and 4. All addition operations are performed mod the state length (N). The first step (KSA) takes a secret key k with a variable length between 1 and 256 n-bit words. In the first step of the KSA, one state table is filled with the factorials of the contents of the other state table, which is generated by the sender and filled with the numbers 0 to N-1. The input is the secret key, used as a seed for the state table. After the KSA, the state becomes the input to the next step (PRGA). In the PRGA step, additional operations are used to permute the state table. This phase generates the keystream that is XOR-ed with the plaintext to get the ciphertext.

Algorithm 3. KSA of Modified RC4-Fact
INPUT: Key[x].
OUTPUT: S[x].
1. For (x = 0 to 255)
   S[x] = x
2. For (x = 255 to 0)
   2.1 S_Fact[x] = 1
   2.2 For (c = 1 to x)
       S_Fact[x] = (S_Fact[x] * c) mod 256
3. y = 0
4. For (x = 0 to 255)
   4.1 y = (S_Fact[x] + S[x] + Key[x mod key-length]) mod 256
   4.2 Swap(S[x], S[y])
5. Output: S[x].

The second step is the PRGA, which generates the output keystream:

Algorithm 4. PRGA of Modified RC4-Fact
INPUT: S[x], Plain x.
OUTPUT: Key sequence (Key seq.)
1. Set x = 0, y = 0
2. Output generation loop:
   2.1 x = (x + 1) mod 256
   2.2 y = (S[(y + S[x]) mod 256]) mod 256
   2.3 Swap(S[x], S[S_Fact[y] mod 256])
   2.4 Z = (S[(x + y) mod 256] + S[(y + S[S[x]]) mod 256]) mod 256
   2.5 Key sequence = S[Z]
3. Output: Key sequence.
Cipher x = Key seq. ⊕ Plain x.
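Algorithms 3 and 4 can be sketched as follows (illustrative Python; the paper's code is C#). Note two assumptions made here: the factorial table is initialized to 1 before the products are accumulated, and step 2.3 is read as swapping S[x] with the entry at index S_Fact[y] mod 256 — one plausible reading of the listing:

```python
import math

N = 256

def ksa_fact(key):
    """Algorithm 3 sketch: state S plus a table of factorials mod 256."""
    s = list(range(N))
    s_fact = [math.factorial(x) % N for x in range(N)]    # step 2
    y = 0
    for x in range(N):
        y = (s_fact[x] + s[x] + key[x % len(key)]) % N    # step 4.1
        s[x], s[y] = s[y], s[x]                           # step 4.2
    return s, s_fact

def prga_fact(s, s_fact, n):
    """Algorithm 4 sketch (step 2.3 interpretation is an assumption)."""
    s = s.copy()
    x = y = 0
    out = []
    for _ in range(n):
        x = (x + 1) % N
        y = s[(y + s[x]) % N] % N                         # step 2.2
        t = s_fact[y] % N
        s[x], s[t] = s[t], s[x]                           # step 2.3
        z = (s[(x + y) % N] + s[(y + s[s[x]]) % N]) % N   # step 2.4
        out.append(s[z])                                  # step 2.5
    return out

s, s_fact = ksa_fact(list(b"Key"))
ks = prga_fact(s, s_fact, 16)
```

One observable property of the factorial table: since 10! already contains the factor 2^8, S_Fact[x] mod 256 is 0 for all x >= 10, so only the low entries of the table vary.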

6. IMPLEMENTATION

The algorithm was implemented in the C# language. The inputs to the algorithm are an initial state filled with the values 0 to 255, a secret key with a length between 1 and 256, and another 256-byte state table containing the factorials of the initial state elements. The implementation of the proposed algorithm required less time than that of RC4 when run with secret keys of the same size, showing that the proposed algorithm is faster than RC4, as shown below.

Table. 1 Key generation time for RC4 and developed RC4-Fact.

Key size    | RC4 Time (ms) | RC4-Fact Time (ms)
1 kilobyte  | 4185          | 4091
2 kilobytes | 4184          | 4110
3 kilobytes | 4703          | 4191
5 kilobytes | 6421          | 6295

Figure. 1 Key generation time of RC4 and developed RC4-Fact (time in ms versus keystream size).

7. RESULTS AND DISCUSSION

The produced keystream was examined with the NIST (National Institute of Standards and Technology) test suite, a statistical package for random number generator testing composed of 16 statistical tests for measuring the randomness of the output series of pseudo-random or true random number generators [23]. The tests of this PRNG were done using NIST STS-1.6. The quality of a random number generator is represented by the P-value. Some tests accept large sequence sizes but fail on small sequence sizes, and other tests accept both large and small sizes. In our program, a large sequence (2,000,000 bits) was generated from each secret key. These sequences were tested, and the average p-values calculated from these tests are shown in Table 2. In each test, the P-value is compared to 0.01: a test is passed when its p-value is greater than 0.01, in which case the produced series is considered random and uniformly distributed. If a test gives a p-value equal to 1, the series is taken to have complete randomness. A p-value of zero

means that the sequence is fully non-random. SUCCESS (PASS) means that the sequence is acceptable and has good randomness, whereas FAILURE indicates that it is not acceptable and non-random.

Table. 2 Result of running NIST on the keys generated by RC4 and RC4-Fact.

Test No. | Statistical Test Name      | RC4 P-VALUE | Conclusion | RC4-Fact P-VALUE | Conclusion
1        | Approximate Entropy        | 0.805578    | PASS       | 0.189928         | PASS
2        | Block Frequency            | 0.742455    | PASS       | 0.892817         | PASS
3        | Cumulative Sums (Forward)  | 0.739164    | PASS       | 0.838256         | PASS
4        | Cumulative Sums (Reverse)  | 0.854066    | PASS       | 0.808187         | PASS
5        | FFT                        | 0.279715    | PASS       | 0.574878         | PASS
6        | Frequency                  | 0.898580    | PASS       | 0.911358         | PASS
7        | Lempel-Ziv Compression     | 0.889521    | PASS       | 0.821626         | PASS
8        | Linear Complexity          | 0.407918    | PASS       | 0.799356         | PASS
9        | Longest Runs               | 0.767817    | PASS       | 0.909724         | PASS
10       | Non-periodic Templates     | 0.5407084   | PASS       | 0.608033         | PASS
11       | Overlapping Template       | 0.497550    | PASS       | 0.626771         | PASS
12       | Random Excursions          | 0.528198    | PASS       | 0.593594         | PASS
13       | Random Excursions Variant  | 0.525591    | PASS       | 0.472056         | PASS
14       | Rank                       | 0.610871    | PASS       | 0.354270         | PASS
15       | Runs                       | 0.115965    | PASS       | 0.494192         | PASS
16       | Serial                     | 0.646168    | PASS       | 0.6234565        | PASS
17       | Universal Statistical      | 0.380374    | PASS       | 0.520501         | PASS
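As a concrete example of the kind of check the NIST suite applies, the frequency (monobit) test can be sketched as follows (illustrative Python, not the paper's C# code). It maps the bit balance of a sequence to a p-value through the complementary error function; a sequence passes when the p-value exceeds 0.01:

```python
import math
import random

def monobit_p_value(bits):
    """NIST frequency (monobit) test: p = erfc(|sum(+/-1)| / sqrt(2n))."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)   # 1 -> +1, 0 -> -1
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))

rng = random.Random(0)
random_bits = [rng.randrange(2) for _ in range(100000)]  # balanced sequence
biased_bits = [1] * 100000                               # degenerate sequence

p_random = monobit_p_value(random_bits)   # typically well above 0.01
p_biased = monobit_p_value(biased_bits)   # essentially zero
```

A balanced pseudo-random sequence yields a p-value in the acceptance region, while the constant sequence fails decisively, matching the PASS/FAILURE interpretation above.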

Figure. 2 NIST statistical test results (P-values) for RC4 and RC4-Fact.

8. THE INTRODUCED SINGLE BYTE BIAS ATTACK ALGORITHM

Isobe et al. [24] suggested efficient plaintext recovery attacks on the RC4 algorithm that can retrieve all bytes of the plaintext from the ciphertexts in the broadcast setting, where the same plaintext is encrypted with different keys. AlFardan et al. [12] and Hammood et al. [25], at around the same time, used the same concept, developed plaintext recovery attacks, and applied a single-byte bias attack to TLS. Their attack successfully recovered the first 256 bytes of the keystream with likelihood roughly 1 from 2^24 ciphertexts encrypted with different random keys. This work presents a newly designed fast algorithm for mounting a single-byte bias attack on RC4 that retrieves the first 32 bytes of any plaintext used, illustrated in Algorithm 5. The idea of this algorithm is based on the work of [12] and [25]. The algorithm exploits the biases in the first 32 bytes of the RC4 keystream by finding the keystream value with the highest bias (Ki) in each position (i). Encrypting the same plaintext (Pi) with various random and independent keys generates many ciphertexts (Ci), which are used to detect the most frequent byte in each position; that byte is taken as the biased value from which the plaintext byte is derived. The most frequently appearing keystream byte in each position is XOR-ed with the most frequently appearing ciphertext byte to retrieve the plaintext.

Algorithm 5. Single Byte Bias Attack
Input: Key [k1, k2, ..., k16], Plaintext x.
Output: Plaintext*, Frequency of Plaintext*.
1. For (X = 1 to N), where N = 2^18, 2^21, or 2^24
   1.1 For (n = 1 to 234) Do
       1.1.1 x = 0, y = 0
       1.1.2 Call Algorithm 1: KSA
       1.1.3 Call Algorithm 2: PRGA
       1.1.4 Derive a new 16-byte key from each generated key to be the new secret key.
   1.2 For (col = 0 to key length)
       1.2.1 For (row = 0 to 234) Set key[row][col] as string
       1.2.2 For (x = 1 to values.Count) If (values[x] = value)
             i. Increment count by 1
             ii. Key position = col
             iii. Key value = value
             iv. Number of frequents = (count / (234 * 16))
2. Calculate Max-Frequent [Key sequence i] of each position.
3. Ciphertext i = Encryption of Plaintext i with Key sequence i
4. For (X = 1 to N)
   4.1 For (n = 1 to 234) Do
       4.1.1 x = 0, y = 0
       4.1.2 Call Algorithm 1: KSA
       4.1.3 Call Algorithm 2: PRGA
       4.1.4 Derive a new 16-byte key from each generated key to be the new secret key.
   4.2 For (col = 0 to key length)
       4.2.1 For (row = 0 to 234) Set key[row][col] as string
       4.2.2 For (x = 1 to values.Count) If (values[x] = value)
             i. Increment count by 1
             ii. Key position = col
             iii. Key value = value
             iv. Number of frequents = (count / (234 * 16))
5. Calculate Max-Frequent [Ciphertext i] of each position.
6. Plaintext*[X] = Encryption (XOR) of Max-Frequent [Key sequence i] with Max-Frequent [Ciphertext i]
7. If Plaintext*[X] = Plaintext[X], Counter = Counter + 1

8. Frequency of Plaintext* = (Counter * 100 / N)
9. Output: Plaintext*, Frequency of Plaintext*.

The execution time of the single-byte bias attack algorithm is short: it requires about 1 second for RC4 when using 2^18 keys, and about 3 seconds for RC4-Fact, so the proposed algorithm is more difficult for the cryptanalyst. The attack successfully retrieved the first 32 bytes of the RC4 plaintext, while RC4-Fact is robust against it and none of the plaintext bytes could be retrieved. The recovery rates for RC4 and RC4-Fact are shown in Figures 3, 4, and 5.
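The broadcast recovery idea can be demonstrated at small scale with the second-byte bias alone (illustrative Python, far fewer ciphertexts than Algorithm 5 uses). The same plaintext is encrypted under many random keys; because the most frequent keystream byte at position 2 is 0, the most frequent ciphertext byte at that position equals the plaintext byte:

```python
import random
from collections import Counter

def rc4_keystream(key, n):
    """Plain RC4: KSA followed by n PRGA output bytes."""
    s = list(range(256))
    y = 0
    for x in range(256):
        y = (y + s[x] + key[x % len(key)]) % 256
        s[x], s[y] = s[y], s[x]
    x = y = 0
    out = []
    for _ in range(n):
        x = (x + 1) % 256
        y = (y + s[x]) % 256
        s[x], s[y] = s[y], s[x]
        out.append(s[(s[x] + s[y]) % 256])
    return out

rng = random.Random(7)
plaintext = b"Attack at dawn!!"
counts = Counter()
for _ in range(30000):                    # broadcast: same plaintext, fresh key
    key = [rng.randrange(256) for _ in range(16)]
    ks = rc4_keystream(key, 2)
    counts[plaintext[1] ^ ks[1]] += 1     # observed ciphertext byte, position 2

# Most frequent keystream byte at position 2 is 0 (the Z2 bias), so the most
# frequent ciphertext byte is plaintext[1] XOR 0 = plaintext[1].
recovered = counts.most_common(1)[0][0]
print(recovered == plaintext[1])
```

Recovering all 32 positions, as Algorithm 5 does, uses the same counting argument with the full per-position keystream distributions instead of the single Z2 = 0 bias.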

Figure. 3 The recovery rate for the first 32 positions with 2^18 keys for RC4 and RC4-Fact.

Figure. 4 The recovery rate for the first 32 positions with 2^21 keys for RC4 and RC4-Fact.

Figure. 5 The recovery rate for the first 32 positions with 2^24 keys for RC4 and RC4-Fact.

9. CONCLUSION

The RC4 stream cipher is a significant encryption algorithm and one of the most widely used cryptosystems on the Internet for keeping information private. It is simple and fast in implementation, but it has weaknesses in its keystream bytes: these bytes are biased toward some values of the private key. RC4 biases are now exploited to mount practical attacks on TLS. In this work, a new algorithm is proposed that uses the factorials of the state table contents and additional operations in the KSA and PRGA to increase the randomness of the generated key, while the key generation time of the suggested algorithm is faster than that of RC4. A new single-byte bias attack algorithm was also designed, which attacks RC4 through its single-byte biases and retrieves all of the first 32 bytes of the RC4 plaintext with a likelihood of 100%, while the proposed algorithm resists this attack. The generated keystream of the proposed RC4 passed the NIST suite of statistical tests. Thus, it can be implemented in software or hardware.

ACKNOWLEDGEMENT
The authors give many thanks to the College of Computer Sciences and Information Technology, University of Anbar, and to Assist. Prof. Dr. Maitham M. Hammood for providing support to carry out this research. Moreover, special thanks to the project supervisor Prof. Dr. Ali M. Sagheer for supporting us in this research and project.

REFERENCES
1. Mantin, I. and A. Shamir. A practical attack on broadcast RC4. in International Workshop on Fast Software Encryption. 2001: Springer.
2. Hammood, M.M., K. Yoshigoe, and A.M. Sagheer, RC4-2S: RC4 stream cipher with two state tables, in Information Technology Convergence. 2013, Springer. p. 13-20.
3. Sepehrdad, P., S. Vaudenay, and M. Vuagnoux. Discovery and exploitation of new biases in RC4. in International Workshop on Selected Areas in Cryptography. 2010: Springer.
4. McKague, M., Design and analysis of RC4-like stream ciphers. 2005.
5. Prasithsangaree, P. and P. Krishnamurthy. Analysis of energy consumption of RC4 and AES algorithms in wireless LANs. in Global Telecommunications Conference (GLOBECOM '03). 2003: IEEE.
6. Rahma, A.-M.S. and L.D.A.M. Sagheer, Development of RC4 stream ciphers using Boolean functions.
7. Stošić, L. and M. Bogdanović, RC4 stream cipher and possible attacks on WEP. Editorial Preface, 2012. 3(3).
8. Maitra, S. and G. Paul. New form of permutation bias and secret key leakage in keystream bytes of RC4. in International Workshop on Fast Software Encryption. 2008: Springer.
9. Maitra, S. and G. Paul. Analysis of RC4 and proposal of additional layers for better security margin. in International Conference on Cryptology in India. 2008: Springer.
10. Garman, C., K.G. Paterson, and T. Van der Merwe. Attacks only get better: Password recovery attacks against RC4 in TLS. in 24th USENIX Security Symposium (USENIX Security 15). 2015.
11. Paul, S. and B. Preneel. Analysis of non-fortuitous predictive states of the RC4 keystream generator. in International Conference on Cryptology in India. 2003: Springer.
12. AlFardan, N., et al. On the security of RC4 in TLS. in 22nd USENIX Security Symposium (USENIX Security 13). 2013.
13. Hammood, M.M., K. Yoshigoe, and A.M. Sagheer, Enhancing security and speed of RC4. Int. J. Com. Net. Tech., 2015. 3(2).
14. Khine, L.L., A new variant of RC4 stream cipher. Assessment, 2009. 88: p. 3138.
15. Bokhari, M., S. Alam, and F.S. Masoodi, Cryptanalysis techniques for stream cipher: a survey. International Journal of Computer Applications, 2012. 60(9).
16. Wong, K.K.-H., G. Carter, and E. Dawson. An analysis of the RC4 family of stream ciphers against algebraic attacks. in Proceedings of the Eighth Australasian Conference on Information Security, Volume 105. 2010: Australian Computer Society.
17. Orumiehchiha, M.A., et al. Cryptanalysis of RC4(n, m) stream cipher. in Proceedings of the 6th International Conference on Security of Information and Networks. 2013: ACM.
18. Mister, S. and S.E. Tavares. Cryptanalysis of RC4-like ciphers. in International Workshop on Selected Areas in Cryptography. 1998: Springer.
19. Roos, A., A class of weak keys in the RC4 stream cipher. September 1995.
20. Jindal, P. and B. Singh. Performance analysis of modified RC4 encryption algorithm. in Recent Advances and Innovations in Engineering (ICRAIE). 2014: IEEE.
21. Billet, M.R.O., New stream cipher designs. 2008: Springer.
22. Pardeep and K. Pushpendra, PC1-RC4 and PC2-RC4 algorithms: Pragmatic enrichment algorithms to enhance RC4 stream cipher algorithm. International Journal of Computer Science and Network (IJCSN), 2012. 1(3): p. 98-108.
23. Fluhrer, S.R. and D.A. McGrew. Statistical analysis of the alleged RC4 keystream generator. in International Workshop on Fast Software Encryption. 2000: Springer.
24. Isobe, T., et al. Full plaintext recovery attack on broadcast RC4. in International Workshop on Fast Software Encryption. 2013: Springer.
25. Hammood, M.M. and K. Yoshigoe. Previously overlooked bias signatures for RC4. in 2016 4th International Symposium on Digital Forensic and Security (ISDFS). 2016: IEEE.

AUTHOR PROFILE
Ali M. Sagheer is a Professor in the Computer College at Al-Anbar University. He received his B.Sc. in Information Systems (2001), M.Sc. in Data Security (2004), and Ph.D. in Computer Science (2007) from the University of Technology, Baghdad, Iraq. He is interested in the following fields: cryptology, information security, number theory, multimedia compression, image processing, coding systems, and artificial intelligence. He has published many papers in different conferences and scientific journals.

Sura M. Searan received her B.Sc. in Computer Science (2013) from the University of Anbar, Iraq. She has been a master's student since 2014 in the Computer Science Department, College of Computer Sciences and Information Technology at Al-Anbar University. She is interested in the following fields: cryptology, information security, and coding systems.

Rawan A. Alsharida is an assistant lecturer in the Department of Computer Science at the University of Tikrit. She received her Master of Information Science (Information Quality) in 2015 from the University of Arkansas at Little Rock, USA, and her Bachelor of Computer Science in 2005 from the University of Tikrit, Tikrit, Iraq.


Skin based real time air writing using web camera

1 Snober Naseer, 2 Tayba Farooqui, 3 Muhammed Jehanzeb, 4 Sundus Razzaq
1, 2, 4 Department of Computer Sciences, Fatima Jinnah Women University, Rawalpindi, Pakistan
3 Department of Computer Sciences, Army Public College of Management & Sciences, Rawalpindi, Pakistan
Email: [email protected], {taybafarooqui, sundusrazaq}@gmail.com, [email protected]

ABSTRACT
Skin detection is one of the most essential and primary stages in image processing applications such as face detection, hand detection, and hand tracking. In the proposed system, hand segmentation using color models is introduced: the user's hand is detected with a skin-based segmentation technique, and alphabets are written according to the movement of the hand. The proposed system lets the user write alphabets using skin-based hand detection and works efficiently in fast, robust, accurate, real-time applications.
Keywords: Hand Segmentation; Color Spaces; Hand Detection; Morphological Operations; Contour; Convex Hull

1. INTRODUCTION

In today’s high-tech era, numerous technologies are progressing day by day. One such encouraging concept is Human- Machine Interface. One of the most essential considerations to make system efficient, accurate and reliable is Human Computer Interaction (HCI). There are several systems dealing with simple techniques for HCI, these techniques ensure the use of mouse; keyboard etc. for input to the systems. Most commonly used techniques are physically in contact with the system recently new techniques have been developed to make Human Computer Communication is more efficient. One of the supreme gifts of nature to the mankind is the capability to express him by reacting to the actions occurring in his surroundings. Motions are considered as the extreme natural expressive way for interaction between human and computers in virtual system. The task of hand movement recognition is one of the important problems in computer vision. With recent development in information technology, human interactions systems are build which involve tasks related to hand processing like hand detection, hand recognition and hand tracking. We wish to make a Window-based application for live skin based hand detection and recognition using webcam input in C#. This project is a combination of live hand movement detection and face identification. This application uses the webcam to detect the hand and face made by the user and perform mapping of alphabet according to hand motion. The user has to perform a particular movement of hand in order to form a single alphabet. The webcam captures this and identifies the alphabet, recognizes it (against a set of known alphabets) and performs the action corresponding to it. An alphabet could be written even while sitting afar from the computer screen. This paper presents an interactive technique for Human Machine Interaction using the hand detection based on skin. The organization of the paper is as follows. In section 2, Literature review has presented. 
Section 3 describes the technologies used in the proposed system. Section 4 describes the proposed system. Section 5 covers the implementation phase, in which the techniques used for hand segmentation and hand detection are introduced and discussed. Section 6 discusses the experimental results and conclusion. Section 7 discusses future work.

2. LITERATURE REVIEW

A great deal of work has been done on hand movement recognition using different techniques. This section presents a quick review of some of the previous work. Many researchers have devised techniques to deal with various hand movements and features in order to achieve higher accuracy and recognition rates. The Kalman filter is used for hand tracking to obtain the motion of the hand region; a Kalman-filter-based hand gesture recognition system was offered to recognize real-time hand movements in unconstrained environments [1]. Meenakshi Panwar [2] proposed an approach that uses pre-processing steps to remove background noise. Mohan Pradhana [3] devised a very simple and efficient approach for recognizing hand movements that represent the numbers from zero to nine. The design of that system is divided into two phases, namely a preprocessing phase and a classification phase, but its drawback is that it can classify only static images. In reference [7], Amardip et al. used a web camera, a hand glove and an image subtraction algorithm to map mouse inputs to hand movements. The fingers of the hand glove have different colors, which makes the system faster and easier to use. Gowrishankar R. [8] used a web camera and techniques such as image recognition, a color recognition process and Sixth Sense technology (a set of wearable devices that acts as a gestural interface) to control mouse movement with a finger. Our project is inspired by the work of Lianwen Jin [9], who used the FVCR system to write characters with hand movements without any additional device; a robust fingertip detection algorithm, image segmentation techniques and a fingertip recognition algorithm were used to write the characters. Akash S. [10] used Sixth Sense technology to issue commands to a computer using hand gestures; he also used a pendant-like wearable device to virtually paint and write without any pen or physical equipment, and a web camera to detect and capture hand movements. Kamran Niyazi et al. [11] used a web camera and colored tapes: the camera detects the colored tapes for cursor movement, and the distance between two colored tapes on the fingers is calculated to perform clicking actions. Our work is inspired by these technologies: a child can interact with the system to write alphabets without any glove or sensor, since our system works with a normal web camera in real time. Whereas previous researchers in this area used gloves and sensor-based cameras to perform these activities, our system is independent of sensors and gloves.

3. TECHNOLOGIES USED

a) Emgu CV

Emgu CV is a cross-platform .NET wrapper for the OpenCV image processing library. It allows OpenCV functions to be called from .NET-compatible languages such as C#. The wrapper can be compiled in Visual Studio and runs on Windows, Linux, Mac OS X, iOS, Android and Windows Phone.

b) C#

C# (pronounced "see sharp") is a general-purpose, object-oriented programming language designed to be interoperable, scalable and easy to use. We have used the C# interface of Emgu CV for the implementation. C# permits programmers to quickly and easily build solutions for the Microsoft .NET platform. It also supports data encapsulation and interfaces, and encourages the programmer to build robust applications.

4. PROPOSED SYSTEM DESCRIPTION

The proposed system has three phases. In the first phase, input images are captured in real time using a web camera. In the second phase, the hand region is detected based on skin color using color space models, and morphological operations are performed. In the feature extraction phase, contours and the convex hull method are used to detect the border of the segmented binary hand image. Finally, the system writes alphabets by tracking the (x, y) coordinates of the hand movement. The experimental setup for writing alphabets using skin-based hand detection requires only a low-cost webcam and prefers a plain, dark background to obtain accurate results. It does not require any special equipment such as colored data gloves. The steps in our approach are as follows:

i. Capture image frames from the web camera.
ii. Segment the hand portion from the image frames.
iii. Apply dilation and erosion operations to eliminate noise from the output of hand segmentation.
iv. Find contours.
v. Detect the convex hull of a contour and calculate the centroid of the hand.
vi. Track the pointer using the coordinates of the centroid, performing writing actions according to the hand movement to write the alphabets.
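Steps v and vi above amount to averaging the contour points and appending the result to the stroke being written. The following is a minimal pure-Python sketch (the paper's implementation is in C#/EmguCV; all names here are illustrative):

```python
# Sketch of steps v-vi: compute the hand centroid from contour points
# and append it to the stroke path used for writing alphabets.

def centroid(points):
    """Arithmetic mean of a list of (x, y) contour points."""
    n = len(points)
    return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)

def track(stroke, contour_points):
    """Append the current centroid to the stroke being drawn."""
    stroke.append(centroid(contour_points))
    return stroke

stroke = []
track(stroke, [(0, 0), (4, 0), (4, 4), (0, 4)])  # centroid is (2.0, 2.0)
```

Repeating this on every frame traces the path of the hand, which is then matched against the alphabet templates.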

Figure 1. Block diagram of the system


5. IMPLEMENTATION

a. Webcam

A webcam is a video camera that streams its images in real time to a computer. The sensitivity of the virtual mouse is related to the resolution of the camera: an enhanced user experience is guaranteed only when the camera resolution is good. The webcam serves the purpose of taking real-time images once the system starts. The system takes input from the webcam and converts it into a form that can be processed easily, then decides the respective action on the basis of the hand movement. After that, the system maps the movement of the hand onto templates of alphabets from the webcam input.

b. Hand Segmentation

Hand segmentation separates the user's hand from the background in the image. Segmentation can be done using several methods. An important step in hand segmentation is thresholding, which is used to separate the hand from the background; various methods for selecting an appropriate threshold value to obtain better results are explained in [4]. On the input image, thresholding is done according to a threshold value: pixels with intensity below the threshold are set to 0, and pixels with intensity above it are set to 1. The output of thresholding is thus a binary image in which 0-pixels represent the background and 1-pixels represent the hand; pixels with value 1 therefore belong to the user's hand. Some constraints are placed on the background when extracting the hand in order to avoid excessive noise.
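The thresholding rule described above can be sketched in a few lines of pure Python (standing in for the paper's C#/EmguCV call), with the image represented as a 2-D list of gray levels:

```python
# Binary thresholding: pixels below the threshold become 0 (background),
# the rest become 1 (hand).

def threshold(image, thresh):
    return [[1 if px >= thresh else 0 for px in row] for row in image]

gray = [[10, 200, 210],
        [15, 220, 205],
        [12,  14,  11]]
binary = threshold(gray, 128)   # [[0, 1, 1], [0, 1, 1], [0, 0, 0]]
```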

Figure 2. Blob of user's hand

The detected white blob could also be some other object instead of the user's hand, so whether the detected object is a hand or not is decided by the hand detection part explained further below. The user's hand is segmented according to the intensity of its pixels; the intensity of the hand is usually much higher, so the hand is easily segmented by keeping the background dark. Incremental thresholding is one segmentation method that can select the threshold value automatically. It uses the same function as static thresholding, but the threshold value keeps incrementing: the variable thresh is incremented by 1 over the range from 20 to 160 until only one contour is detected in the input image. This also ensures that the whole hand is detected as a single blob without any internal fragmentation. An important factor to consider for skin color is the choice of the right color space. This paper proposes a hand segmentation method based on the RGB, HSV and YCbCr color spaces.
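The incremental thresholding loop described above can be sketched as follows. A 4-connected component count stands in for EmguCV's contour count, and all names are illustrative, not from the paper's C# implementation:

```python
# Raise the threshold from 20 to 160 until the binary image contains
# exactly one connected white blob.

def threshold(image, t):
    return [[1 if px >= t else 0 for px in row] for row in image]

def count_blobs(binary):
    """Number of 4-connected components of 1-pixels (flood fill)."""
    h, w = len(binary), len(binary[0])
    seen, blobs = set(), 0
    for i in range(h):
        for j in range(w):
            if binary[i][j] == 1 and (i, j) not in seen:
                blobs += 1
                stack = [(i, j)]
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and binary[y][x] == 1 and (y, x) not in seen:
                        seen.add((y, x))
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return blobs

def incremental_threshold(image, lo=20, hi=160):
    for t in range(lo, hi + 1):
        if count_blobs(threshold(image, t)) == 1:
            return t
    return None

frame = [[30,  0,  0],
         [ 0, 90, 95],
         [ 0, 92, 96]]
t = incremental_threshold(frame)   # the isolated dim pixel drops out at t = 31
```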

c. Color Spaces

There are different color spaces corresponding to different ways of expressing color [5, 6], and hand skin color occupies different ranges in different color spaces. Selecting an appropriate color space is a key stage of the hand segmentation process.

d. RGB Color Space Model

RGB color space is a kind of mixed color space. It describes colors using the primary colors:

 Red
 Green
 Blue

It can represent many colors, but RGB color space is not used in most experiments because it mixes hue, luminance and saturation together, which makes the color information difficult to separate.

Figure 3. RGB color model

e. HSV Color Space Model

HSV is often called HSB (B for brightness). The HSV color space model describes colors in terms of the following:

 Hue
 Saturation
 Value (brightness)

HSV is one of the most common cylindrical-coordinate representations of points in an RGB color model. The HSV model is used in computer vision and image analysis for segmentation. The diagram below shows the HSV model.
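For reference, converting an RGB pixel to HSV can be done with Python's standard colorsys module (a stand-in for the EmguCV conversion the paper uses); colorsys expects channel values in [0, 1]:

```python
# RGB -> HSV conversion via the standard-library colorsys module;
# hue is rescaled here to degrees.

import colorsys

def rgb_to_hsv(r, g, b):
    """r, g, b in 0..255 -> (hue in degrees, saturation, value)."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return h * 360.0, s, v

rgb_to_hsv(255, 0, 0)   # pure red -> (0.0, 1.0, 1.0)
```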

Figure 4. HSV model

f. YCbCr Color Space Model

Information about luminance is stored in a single component (Y), and information about chrominance is stored in two color-difference components (Cb and Cr), where Cb represents the difference between the blue component and a reference value, and Cr represents the difference between the red component and a reference value. In short, Y′ is the luma component, and Cb and Cr are the blue-difference and red-difference chroma components.
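The conversion can be sketched as follows. The paper does not give the exact coefficients, so the ITU-R BT.601 full-range form is assumed here purely for illustration:

```python
# RGB -> YCbCr conversion sketch (assumed BT.601 full-range form).

def rgb_to_ycbcr(r, g, b):
    y  =       0.299    * r + 0.587    * g + 0.114    * b   # luma
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5      * b   # blue-difference
    cr = 128 + 0.5      * r - 0.418688 * g - 0.081312 * b   # red-difference
    return y, cb, cr

rgb_to_ycbcr(128, 128, 128)   # mid-gray -> approximately (128, 128, 128)
```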


Figure 5. RGB color cube in the YCbCr space

6. HAND SKIN DETECTION

a. Morphological Operations

The binary image formed in the previous section may contain white pixels in the background (non-skin region). This may happen when the background color resembles the skin color of the hand region; bad lighting conditions or pixels similar to skin pixels in those regions can cause this noise. Morphological operations are therefore required to fill in the black pixels on the segmented hand and remove the white pixels in the background so that the hand can be clearly detected. Two morphological operations are employed, namely dilation and erosion. First, dilation is performed, which adds pixels to fill in missing pixels in the hand region. Second, erosion is performed to remove white pixels that do not belong to the hand region. These morphological operations improve the result obtained from hand segmentation.

b. Contours

A contour is a curve along which a function of two variables has a constant value; it joins points of equal value. A contour map illustrates contours using contour lines, which show the steepness of slopes and hills. Contours can be straight lines or curves describing the intersection of a real or hypothetical surface with one or more horizontal planes. The contour is drawn around the hand that is formed by thresholding the input image: after pre-processing of the image frame, the white blob is found and a contour is drawn around it.
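The two morphological operations described above can be sketched in pure Python with a 3x3 structuring element (standing in for the EmguCV operations the paper uses):

```python
# Binary dilation and erosion with a 3x3 structuring element.

def _window(img, i, j):
    """All pixel values in the (clipped) 3x3 neighborhood of (i, j)."""
    h, w = len(img), len(img[0])
    return [img[y][x]
            for y in range(max(0, i - 1), min(h, i + 2))
            for x in range(max(0, j - 1), min(w, j + 2))]

def dilate(img):
    """A pixel becomes 1 if any pixel in its neighborhood is 1."""
    return [[1 if any(_window(img, i, j)) else 0
             for j in range(len(img[0]))] for i in range(len(img))]

def erode(img):
    """A pixel stays 1 only if every pixel in its neighborhood is 1."""
    return [[1 if all(_window(img, i, j)) else 0
             for j in range(len(img[0]))] for i in range(len(img))]

hole  = [[1, 1, 1], [1, 0, 1], [1, 1, 1]]
speck = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]
filled  = dilate(hole)    # the missing pixel inside the hand is filled
cleaned = erode(speck)    # the stray white pixel is removed
```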

Figure 6. Contouring of hand

The following function is used for finding contours in Emgu CV using C#:

public FindContours(CHAIN_APPROX_METHOD method, RETR_TYPE type, MemStorage stor)

 method: the contour approximation method, which can compress horizontal, vertical and diagonal segments and store only their end points.
 type: the contour retrieval mode.
 stor: the memory storage used by the sequences.

c. Convex Hull

The convex hull of a set of points in Euclidean space (a space whose points satisfy certain relationships expressible in terms of distance and angle) is the smallest convex set that contains all the given points. For example, when the set of points is a bounded subset of the plane, the convex hull can be visualized as the shape made by a rubber band stretched around the points. The convex hull is drawn around the contour of the hand, so that all contour points lie within it; this forms a wrapper around the hand contour.

Figure 7. Convex hull of hand

The following function is used for detecting the convex hull of a contour:

GetConvexHull(Emgu.CV.CvEnum.ORIENTATION.CV_CLOCKWISE);

Contours and convex hulls are collections of points that need to be connected with straight lines.
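GetConvexHull is the library call the paper uses; to illustrate the underlying idea, here is a pure-Python sketch of convex hull computation using Andrew's monotone chain algorithm:

```python
# Convex hull of 2-D points via Andrew's monotone chain algorithm.

def cross(o, a, b):
    """z-component of the cross product OA x OB (> 0 means left turn)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def half(seq):
        chain = []
        for p in seq:
            while len(chain) >= 2 and cross(chain[-2], chain[-1], p) <= 0:
                chain.pop()
            chain.append(p)
        return chain
    lower, upper = half(pts), half(reversed(pts))
    return lower[:-1] + upper[:-1]   # counter-clockwise hull

hull = convex_hull([(0, 0), (2, 0), (2, 2), (0, 2), (1, 1)])
# the interior point (1, 1) is discarded; the hull keeps the four corners
```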

Figure 8. Templates of alphabets

7. EXPERIMENTAL RESULTS

The proposed work is implemented in Microsoft Visual Studio 2015 with the Emgu CV libraries, and the skin-based hand detection is tested with different people having different hand sizes under background constraints. Each person performs the hand movements for an alphabet in front of the camera. Experiments show that the response time for writing an alphabet through hand movements is comparable to the time for selecting a key on a keyboard to type an alphabet, since the system does not require a database of training images.


Figure 9. Output of the test result: contouring of hand

Figure 10. Output of the convex hull showing the middle point of the hand

Figure 11. The stroke directions in the template

8. FUTURE WORK

The proposed system does not have a good background subtraction method, which puts constraints on the user for successful working of the system. In future work, we wish to include methods that reduce these constraints so that the system is usable in any situation and produces better, more precise and more accurate results.

9. CONCLUSION

This project completely eradicates the need for a mouse. It could also lead us to a new era of Human-Computer Interaction (HCI) in which no physical contact with the machine is required. The proposed technology can be used to develop novel educational learning systems that help children learn handwriting in an engaging way.

ACKNOWLEDGEMENT

Special thanks to Fatima Jinnah Women University, Rawalpindi, Pakistan and Army Public College of Management & Sciences for providing the resources to complete this research.


REFERENCES

1. Pawan Singh Mehra, "Hand Gesture Recognition for Human Computer Interaction", IEEE International Conference on Image Information Processing (ICIIP 2011), Waknaghat, India, Nov 2011.
2. Meenakshi Panwar, "Hand Gesture based Interface for Aiding Visually Impaired", IEEE International Conference on Image Information Processing (ICIIP 2012), Noida, India.
3. Sonal Singhai, "Hand Segmentation for Hand Gesture Recognition", International Journal of Innovative Research in Information Security (IJIRIS), ISSN 2349-7017(O), Volume 1, Issue 2, August 2014.
4. Mohan Pradhana, "A Hand Gesture Recognition using Feature Extraction", International Journal of Current Engineering and Technology, ISSN 2277-4106, Nov 2012.
5. Thresholding: http://www.cse.unr.edu/~bebis/CS791E/Notes/Thresholding.pdf
6. H. Duan, "A Method of Gesture Segmentation Based on Skin Color and Background Difference Method", Proceedings of the 2nd International Conference on Computer Science and Electronics Engineering, March 22-23, 2013, Hangzhou, China.
7. Alper Aksaç, "Real-time Multi-Objective Hand Posture/Gesture Recognition by Using Distance Classifiers and Finite State Machine for Virtual Mouse Operations", Scientific and Technical Research Council of Turkey, 2010.
8. Bousaaid Mourad, "Hand gesture detection and recognition in cyber presence interactive system for E-learning", IEEE, 2014.
9. Nabil Belhaj Khalifa, "Real-Time Human-Computer Interaction Based on Face and Hand Gesture Recognition", International Journal in Foundations of Computer Science & Technology (IJFCST), Vol. 4, July 2014.
10. Nilkanth B. Chopade, "A Review on Hand Gesture Recognition System", International Conference on Computing Communication Control and Automation, 2015.
11. Ms. Anagha Dhote, "Hand Tracking and Gesture Recognition", International Conference on Pervasive Computing (ICPC), 2015.

AUTHOR PROFILES

Dr. Muhammed Jehanzeb received his B.Sc. degree in Computer Science from Arid Agriculture University, Pakistan in 2005, and his M.S. in Computer Science from Iqra University, Pakistan, in 2007. He earned his Ph.D. degree in Computer Science from Universiti Teknologi Malaysia (UTM), Malaysia in 2014. Presently, he is working as an Assistant Professor at the Department of Computer Science in Army Public College of Management & Sciences (APCOMS), Pakistan. His research interests include data analysis, document analysis and recognition, pattern recognition, image analysis and classification.


Knowledge mapping and multi-software agents based system for risk mitigation in IT organizations

1Bokolo Anthony Jnr, 2Noraini Che Pa, 3Rozi Nor Haizan Nor, 4Yusmadi Yah Josoh

1,2,3,4Department of Software Engineering and Information Systems, Faculty of Computer Science and Information Technology, Universiti Putra Malaysia, 43400 UPM, Serdang, Selangor, Malaysia
Email: [email protected]

ABSTRACT

As the reliance on information technology (IT) in running organizational services increases, so does the exposure to the risks associated with IT use. In IT organizations, rules, policies and procedures are put in place by management to ensure that IT infrastructures (hardware, software and network communication devices) are used effectively to achieve the objectives of the organization. In order to achieve the objectives set by management, risks need to be mitigated effectively and adequately. Risk mitigation involves risk identification, risk decision, risk treatment, risk monitoring and risk reporting. However, available risk mitigation tools and systems present many weaknesses and, above all, are limited. The objective of this paper is to present a risk mitigation system (RMS) tool for risk decisions in mitigating risks that occur in IT organizations. The RMS is developed based on the risk decision process for mitigating risk that has been developed in this research. The tool is developed using multi-software agents and knowledge mapping. The findings of this paper show how the tool can support the mitigation of risk, thus providing decision support for IT managers and IT practitioners in their organizations.

Keywords: Risk; risk decisions; risk mitigation; software agents; knowledge mapping

1. INTRODUCTION

Risk is considered to be something that might go wrong in an established process; it is also a combination of the likelihood of an event and its effects. In IT organizations, management must learn to balance the possible negative effects of a risk against the possible gains of its related opportunity [1]. Risk mitigation emphasizes taking action early in the IT organization to prevent the occurrence of undesired events or to reduce the consequences of their occurrence. These mitigation actions should be appropriately planned, and such plans should include estimating and planning the schedule, resources and funding for mitigation. Once a risk is identified, mitigation actions should be identified as well; but since most risks can be mitigated in several different ways, each of which may require different resources at different times, selecting the best mitigation action is not an easy task for practitioners [2]. Risk mitigation involves risk identification, risk decision, risk treatment and risk monitoring. It is an essential component of the IT organization and plays a significant role in its success. The IT organization supports business operations, adding value through its IT components and risk mitigation. The purpose of the IT organization is to direct IT endeavors to ensure that IT performance meets the objectives set out in its strategy. With effective IT organization, the return on IT investment can be optimized to support business strategies and goals [1]. Risks can be mitigated with the correct organizational decision-making structures and the assignment of roles and responsibilities. Decision making is a process by which a person, group or organization identifies a choice or judgment to be made and gathers and evaluates information about alternatives. This definition implies that decision making involves risk in selecting one of several courses of action, which is usually compounded by time and information constraints.
In IT organizations, decision making relating to risk mitigation is perceived by top management as a technical problem [3]. To mitigate risk, management must have effective means of mitigating operational, technical and strategic risks. IT organizations face operational, technical and strategic risks that prevent practitioners from attaining their planned schedule, time and quality when using IT infrastructures (hardware, software and network communication devices). As a result, the risk mitigation procedures put in place in organizations tend to be inadequate. Furthermore, the traditional way of mitigating risks by transferring them to insurance companies is not working effectively, as it is difficult to estimate the consequences of IT-related risks. Existing tools for risk mitigation are mainly designed for risk analysis and evaluation [4], and several risk tools lack the capability to support IT managers in mitigating the risks that occur with the use of IT infrastructures. Currently there are only risk mitigation standards and guidelines, and inadequate risk mitigation tools to support the mitigation of the operational, technical and strategic risks that occur in IT organizations. The existing tools are also lacking in the capture and reuse of lessons learnt from previous events, case studies and best practices, which would utilize and disseminate previous as well as existing knowledge and experience among practitioners and decision makers in the IT organization, and in assisting IT managers, practitioners and decision makers in identifying the causes of risks experienced when utilizing IT infrastructures [5]. Therefore, there is a need for a tool that can assist in solving operational, technical and strategic risks, thereby creating a common understanding of these risks among technical people and management within the IT organization. With a common understanding, it would be possible to realize a coordinated approach towards risk mitigation in the IT organization [6]. A risk mitigation tool is also needed to assist practitioners in making risk mitigation decisions. The implemented Risk Mitigation System (RMS) automatically identifies operational and technical risk data directly from the knowledge base, establishes risk descriptions according to risk measurement schemes, and generates and displays risk reports for decision making. RMS uses an agent-based approach that supports risk mitigation as an iterative and continuous process across large-scale collaborative teams in the IT organization. RMS assists in identifying risks, making decisions on risks, providing solutions for risk treatment and helping IT practitioners monitor identified risks. RMS also measures identified risks, using agents to quantify risk probability and risk impact. The measuring agent in RMS measures a risk by assigning a color to its probability and impact based on the risk's severity and impact on the IT organization.
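As an illustration of such a measurement scheme, a measuring agent could map a probability-impact score onto traffic-light colors. The scales, thresholds and colors below are assumptions for illustration only; the paper does not specify the exact scheme:

```python
# Illustrative risk-measurement agent logic: map risk probability and
# impact (assumed 1..5 scales) onto a traffic-light color. Thresholds
# and colors are assumptions, not taken from the paper.

def risk_color(probability, impact):
    score = probability * impact
    if score >= 15:
        return "red"      # severe: mitigate immediately
    if score >= 6:
        return "yellow"   # moderate: plan treatment and monitor
    return "green"        # low: accept and keep under review

risk_color(5, 4)   # "red"
risk_color(2, 2)   # "green"
```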
The RMS tool aids the knowledge retrieval, storage and sharing process by mapping knowledge in the knowledge base. The rest of this paper is organized as follows. Section 2 describes the literature review. Section 3 describes the methodology. Section 4 describes the risk mitigation practice. Section 5 shows the RMS implementation. Section 6 presents the discussion and conclusions.

2. LITERATURE REVIEW

This section briefly explores the concept of risk, the importance of risk mitigation in IT organizations and the need for decision making in mitigating risks in IT organizations.

a) Overview of risk in IT organizations

Risk is normally an unseen occurrence; it is a combination of the likelihood of an incident and its effects [7]. Risk in itself is not bad: risk is crucial to development, and failure is normally a key part of knowledge, but management must learn to weigh the possible negative effects of a risk against the possible gains of its related opportunity [8]. Risk occurs when IT infrastructures are used to accomplish the aims and objectives of the IT organization. In IT organizations, the lack of open communication, of a forward-looking attitude, of team involvement among practitioners and management, and of knowledge of typical problems exposes IT infrastructures (hardware, software, network devices and other peripherals) to operational and technical risks. The risk process in an IT organization starts with the identification of existing operational and technical risks. Once the risks are identified, they are evaluated and measured, and then appropriate mitigation actions are planned and executed. To reach an acceptable level of assurance of success, the introduced actions must be controlled and the risks under mitigation must be continuously monitored for their status [9].

b) Importance of risk mitigation in IT organizations

In risk mitigation, the practitioner and management perspective is included in the treatment of IT risks by identifying, evaluating, minimizing and controlling potential IT risks in the IT organization's processes [10]. Risk mitigation is good practice and can assist with meeting a range of compliance, statutory, organizational and governance requirements. By effectively mitigating the risks the IT organization faces, management can guard against poor decision making.
According to [11], mitigation of risk provides a mechanism for managers to handle risk effectively by providing step-wise execution of the risk handling methodology, presenting easy-to-understand flowcharts that express the working of each mitigation strategy against the risk factors in the IT organization. A research study [12] states that the mitigation of risks aids managers in understanding the mutual relationships among the enablers of risk mitigation and provides a suitable metric to quantify these risks. Managers are thus given an opportunity to understand the focal areas that need attention to minimize the risks to the real-time and free flow of information. This helps decision makers estimate the impacts of various risks and consequently develop suitable strategies to counter them. The research study [13] noted that risk decisions are made in order to mitigate identified risks efficiently, meaning that risk mitigation aids management in making decisions in the IT organization. Risk mitigation facilitates the development of a comprehensive IT organization plan by focusing on the unseen risks and opportunities accompanying risk mitigation decisions, which are commonly ignored in IT organization processes; risk mitigation is thus important for making effective decisions regarding identified risks. The research study [14] affirms that risk mitigation provides a disciplined environment for proactive decision making: continuously assessing what could go wrong, determining which risks are important to deal with, and implementing strategies to deal with those risks. The authors maintain that risk mitigation provides a structured method for conducting risk investigation and ensures that there is less redundancy in the IT organization. The research study [15] pointed out that risk mitigation emphasizes taking action early in a project to prevent the occurrence of undesired events or to reduce the consequences of their occurrence. These mitigation actions help project managers plan appropriately; such mitigation plans include estimating and planning the schedule, resources and funding for mitigation, optimizing project cost and schedule. The research study [16] reports that risk mitigation stabilizes the requirements, designs and implementations in software development, and that an integrated risk mitigation plan aids the risk manager in carrying out his core responsibility as well as his main concerns in the IT organization. The research study [17] stated that risk mitigation is important because it focuses on identifying strategic and tactical approaches to minimize the negative impacts of the identified risks on the system's overall functionality, reliability, performance and maintainability. Lastly, the research study [18] added that risk mitigation is important to the IT organization because it provides managers with an effective tool to make risk control decisions and implement process optimization at the project planning stage.

c) Related works

IT practitioners need to ensure the delivery of IT requirements when using IT infrastructures, which usually involves mitigating the operational and technical risks that occur. Research related to risk mitigation has attracted a steady stream of interest in the academic literature, with scholars and practitioners recognizing the value of risk mitigation models in IT projects; however, insufficient attention has been paid by researchers to selecting a suitable risk mitigation model or tool. This section attempts to address this limitation and the gap in the current literature; a model and tool for mitigating risk are provided in subsequent sections. The research in [19] defines a method consisting of risk evaluation and risk mitigation, and a tool for carrying out the risk mitigation activity by assessing the global impact of a set of risks and choosing the best set of countermeasures to cope with them. The tool provides a more precise risk mitigation activity, but is criticized for not having a knowledge base in which to save risk mitigation activities and results. [20] presented a model and prototype developed to identify, estimate, document, assess, prioritize, monitor, control and display statistics of risks in a less complex, easy-to-use, efficient and less time-consuming risk tool; however, the tool can only be used as a desktop application and cannot be accessed on the web. [6] designed a model and the EMITL tool, consisting of risk state, risk analysis, risk findings and risk countermeasures. The tool assists in mitigating IT risks based on risk exposure and setting countermeasures for the risks, but it only helps in quantifying IT-related risks for the management and technical departments. The research in [5] developed an intelligent risk mapping and assessment system model and tool (IRMAS) involving risk identification, risk impact and weight analysis, event driver identification, risk probability and magnitude analysis, risk mitigation and risk management planning.
The tool is designed to identify, prioritize, analyze and assist project managers to mitigate perceived sources of risks. [21] presented a model and tool Analyzer for Reducing Module Operational Risk (AMOR) consisting of risk metrics, risk Modelling, evaluation, risk assessment, risk prediction, display risk result for decision making, however, the complete functionality of ARMOR is not yet implemented. [3] developed a web-based system based on their proposed model. The model consists of risk identification, analysis, evaluation, response and monitoring, which enabled users almost anywhere to perform continuous risk mitigation for each stage of decision processes. The researchers quantified


JOURNAL OF SOFTWARE ENGINEERING & INTELLIGENT SYSTEMS, ISSN 2518-8739, 31st December 2016, Volume 1, Issue 2, JSEIS, CAOMEI, Copyright © 2016, www.jseis.org

the risk in a systematic manner so that the causal relationships between risks can be understood by decision makers. [4] developed a model and an agent-based risk tool. The model process involves context establishment, risk identification, risk impact and probability analysis, and risk mitigation. The researchers developed a web-based system that supports risk mitigation as an iterative and continuous process across collaborative teams. [22] proposed the RIAP (Risk Identification Architecture Pattern) model to manage risks in web projects; it has been implemented in a tool called WPRiMA (Web Project Risk Management Assessment), which supports the IT process of estimating and mitigating risk. [23] proposed a web-based application implemented on a functional risk classification and assessment model. The tool is useful for making decisions to mitigate IT risks because it incorporates a different scale, estimates risk probabilities from historical data and provides more flexibility to decision makers. [24] proposed a risk assessment model and a hospital information system (HIS) tool for mitigating risk when sufficient data are unavailable. The tool integrates possible risk factors into the decision-making process of risk assessment; in the model, a reality-design gap analysis is used to determine risk likelihood instead of direct risk evaluation and is integrated into the decision-making process for risk mitigation. [9] developed a collaboration model that helps to identify, analyze, plan, track, control and communicate risk mitigation, showing the roles and communication paths among teams to support risk mitigation activities as well as the collaboration among risk teams.
The researchers then present a tool called Risk-Guide that supports risk mitigation using checklists together with qualitative risk evaluation. d) Knowledge mapping and multi-software agents This section explores knowledge mapping and agent technology. Knowledge is information in action: data are extracted, transformed into information and loaded into databases. Having acquired information from data and stored it in a data warehouse, the information needs to be transformed into knowledge [25]. As practitioners try to mitigate risk in IT organizations, innovation is needed to produce the knowledge-intensive services desired by decision makers. Knowledge mapping in the risk mitigation context is in its infancy and has the potential to address both the operational and the technical risks faced in IT organizations. A knowledge map is a picture of what exists in an organization or a network; it can therefore be used as a tool to mitigate risk in IT organizations. Knowledge mapping is the field within knowledge management that aims to optimize the efficient and effective use of the organization's knowledge. Developing a knowledge map involves locating important knowledge within the organization and then publishing some sort of list or picture that shows where to find it [26]. Knowledge maps typically point to people as well as to documents and databases [27]. Knowledge mapping allows knowledge to be represented graphically: nodes represent the main ideas and links represent the relationships between those ideas. A knowledge map is the best tool to represent knowledge in an organization [28]. The elements mapped onto such a shared context range from experts, practitioners, or communities of practice to more explicit and codified forms of knowledge, such as white papers or articles, patents, lessons learned, or databases [26].
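The node-and-link structure described above can be illustrated with a small Python sketch. This is purely illustrative: the paper does not specify a schema, so the node names, fields and the helper `knowledge_for` are our assumptions.

```python
# Minimal sketch of a knowledge map: nodes are risks, and links record
# which mitigation knowledge and which experts relate to each risk.
# All names and fields here are illustrative assumptions.

knowledge_map = {
    "malware attack": {
        "mitigations": ["update antivirus signatures", "isolate infected hosts"],
        "experts": ["security officer"],
    },
    "software failure": {
        "mitigations": ["restore from backup", "roll back the release"],
        "experts": ["system administrator"],
    },
}

def knowledge_for(risk: str) -> list:
    """Follow the map's links to the mitigation knowledge for a risk."""
    return knowledge_map.get(risk, {}).get("mitigations", [])

print(knowledge_for("malware attack"))
```

A map like this points to people as well as to documents, which is why the `experts` field sits alongside the codified mitigations.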
In information technology, an agent is software that acts for a user or brings about a certain result; it is one who is empowered to act for another [29]. An agent can be defined as a software entity that is autonomous in accomplishing its design objectives, considered as part of an overall objective, through the axioms of communication and coordination with other agents [30]. According to [31], in a software system an agent is a software component that can interact autonomously, as a substitute for its user, with its environment and with other agents to achieve a predefined goal. [30] stated that agent-based technology is acknowledged as one of the most promising technologies for effective mitigation of risk. Moreover, agents can be integrated into a risk mitigation model in an IT organization to forecast risky situations. Agents can work together to create models that change over time and adapt to the changing conditions of the environment, thus making it possible to detect risky circumstances in an IT organization and to provide recommendations and approvals that help to mitigate possible unwanted risk [32]. Through their learning capability, agents can efficiently demonstrate proactive and autonomous behaviour in mitigating risks in real time [32, 33]. [30] added that the interactions among these agents were subsequently developed by analyzing several risk identification and mitigation processes, and stated that multi-agent modelling can be an alternative decision-making tool for risk mitigation and collaboration. Knowledge mapping and software agents are therefore used as techniques in the model to assist risk mitigation decision making in IT organizations. Agents assist managers and other decision makers to identify, sufficiently early, the risks associated with IT organization, development, integration, and deployment so that appropriate mitigation strategies


can be implemented on a timely basis [29]. Software agents create and store explicit knowledge as declarative memory by mapping a process through which explicit knowledge is created from information and stored in repositories for repetitive and routine querying [25]. 3.

METHODOLOGY

This section discusses the methodology followed in implementing the RMS tool. Four phases were covered in developing the tool, as follows:

Phase 1: Literature Review
Phase 2: RMS Pseudocode and Agent Algorithms
Phase 3: Adopt a System Development Life Cycle
Phase 4: Code the RMS Tool

Figure. 1 Methodology for RMS implementation Figure 1 shows the phases followed to implement the risk mitigation system. 3.1 Methodological process The phases and activities carried out to implement the RMS tool are described below.

Phase 1: Literature Review Phase 1 encompasses the review of journals, articles and books on existing risk tools aimed at mitigating risk in IT processes. This phase helps in identifying the strengths and weaknesses of existing tools.

Phase 2: RMS Pseudocode and Agent Algorithms This phase involves the development of the risk pseudocode and the agent algorithms. The various agents that assist in making risk decisions for mitigating risk are presented in this phase.

Phase 3: Adopt SDLC This phase involves the adoption of the system development life cycle (SDLC) in the implementation and coding of the risk mitigation system. Phase 4: Code the RMS Tool This phase involves the programming of the system using PHP for the agents and functions, MySQL for the risk knowledge base via the knowledge mapping technique, JavaScript for system behaviour and CSS for a responsive and usable system. 3.2 Risk decision process

This section explores the risk decision process, which helps practitioners make decisions when mitigating risk in IT organizations. The risk decision process is shown in Figure 2.



Figure. 2 Risk decision process Figure 2 shows the risk decision process, which provides support to practitioners in making decisions on risk mitigation. 3.3 Risk impact and probability measurement Risk impact and probability measurement is the systematic process of understanding the nature of a risk (by finding, recognizing and describing it) and of deducing the level of risk (by assigning values to its impact and probability). Risk impact and probability measurement provides the basis for risk decisions about risk treatment, as seen in Figure 3, Table 1 and Table 2 in this paper. During the risk decision phase, practitioners have to consider each identified risk and make a judgment about its probability and impact. Practitioners normally rely on their judgment and on experience of previous projects and the problems that arose in them, since it is not possible to make a precise, numeric assessment of the probability and impact of each risk. Once the risks have been measured and ranked, practitioners can decide which of them are most significant. Risk decisions depend on a combination of the probability of the risk arising and the impact of that risk; in general, the most serious risks should always be considered [34].

Figure. 3 Consequences and probability matrix [35]


Figure 3 shows the consequence and probability matrix, which is used by the measurement agent to measure the magnitude of an identified risk based on its impact and probability.

Table. 1 Probability scoring guideline for risk mitigation

Value | Probability Level
9-10  | Very likely to occur
7-8   | Will probably occur
5-6   | Equal chance of occurring or not
3-4   | Will probably not occur
1-2   | Very unlikely

Table. 2 Impact scoring guideline for risk mitigation

Value | Impact Level
9-10  | Very significant
7-8   | Significant
5-6   | Significant and insignificant equally likely
3-4   | Insignificant
1-2   | Very insignificant

Table 1 and Table 2 show the probability and impact scoring guidelines used to measure the risks to be mitigated. The measurements are used by the agents to measure each risk; their application can be seen in Figure 6 and Figure 7 in this research paper. 3.4 Best practice suggestion Best practices are based on previous risk mitigation cases acquired by capturing information and identifying critical success and failure factors in risk decisions. A knowledge base of identified operational and technical risks was populated with a summary of both internally and externally sourced case studies. A description of the risks, including risk event drivers, mitigation strategies implemented, and risk impact and probability, constitutes the database of case studies. Practitioners will therefore be able to locate previous risk mitigation activities via a collaborative environment or a risk mitigation system. 3.5 Risk data storage The knowledge base is a collation of information captured from practitioners' know-how, lessons learnt (in-depth internal expertise), case studies (internal and external case-based knowledge), best practices (external benchmarking) and risk mitigation standards. Access to such knowledge means that the model is capable of using past successes and failures to mitigate risks in IT organizations. The risk data to be stored are essentially elicited knowledge; lessons learnt extend the case studies by capturing in-house past experience in more detail. The success of a risk decision can be enhanced by considering the successes and failures of previously completed risk mitigations. In other words, a success factor can be derived from historical lessons learnt; otherwise, previous mistakes can be repeated, leading to failures. Furthermore, the lessons learnt also help identify critical risk items, which are identified based on success factors from lessons learnt.
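The scoring guidelines in Tables 1 and 2, and one way an agent could combine them into a risk magnitude, can be sketched in Python. This is an illustrative stand-in for the paper's PHP implementation: the function names are ours, and the product-of-scores banding (a common consequence/probability matrix rule) is an assumption, since the paper defines its exact rule only in pseudocode and figures.

```python
# Illustrative sketch (not the paper's code) of the Table 1 scoring
# guideline plus a product-based magnitude rule. The banding thresholds
# (>= 50 High, >= 20 Medium) are assumptions for illustration.

PROBABILITY_LEVELS = [
    ((9, 10), "Very likely to occur"),
    ((7, 8), "Will probably occur"),
    ((5, 6), "Equal chance of occurring or not"),
    ((3, 4), "Will probably not occur"),
    ((1, 2), "Very unlikely"),
]

def probability_level(score: int) -> str:
    """Map a 1-10 probability score to its Table 1 level."""
    for (low, high), level in PROBABILITY_LEVELS:
        if low <= score <= high:
            return level
    raise ValueError("score must be between 1 and 10")

def risk_magnitude(probability: int, impact: int) -> str:
    """Band the probability-impact product into Low / Medium / High."""
    product = probability * impact
    if product >= 50:
        return "High"
    if product >= 20:
        return "Medium"
    return "Low"

print(probability_level(9))   # Very likely to occur
print(risk_magnitude(9, 8))   # High
print(risk_magnitude(2, 3))   # Low
```

The impact levels of Table 2 would be handled by an analogous lookup over its own level names.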
3.6 Reuse of risk data The reuse of risk data in risk decisions not only provides an avenue for transferring excellence from several sources into the risk mitigation process, but also serves to populate the database with identified operational and technical risks and mitigation strategies. Additionally, data can be reused from different aspects of the identified operational and technical risks depending on each practitioner's specific role in the team, background, experience and personality. The reuse of risk data is designed to generate lessons learnt and to build on the knowledge gained on completion of each risk decision. 4.

RISK MITIGATION SYSTEM IMPLEMENTATION This section explores in detail how the risk mitigation system (RMS) is developed and implemented.


4.1 System development life cycle methodology This phase concerns the development methodology used for the risk mitigation system (RMS). The system development life cycle (SDLC) methodology defines a detailed process to specify, implement and test/debug agent-oriented software systems. It offers a set of detailed guidelines, including examples and heuristics, that help in understanding what is required at each step of development, and it is used here to develop the prototype system. The SDLC comprises five phases: requirement analysis, system design, implementation, testing and evolution, as seen in Figure 4.

Figure. 4 SDLC methodology The methodology shown in Figure 4 aids the development of a risk mitigation system, based on the risk decision process shown in Figure 2, that supports practitioners in making decisions from a risk knowledge base. a.

Requirement analysis

This phase determines who is going to use the system; in this research the users are experts, managers and staff. It also determines how users will use the system, either via a web-based system or an application-based platform. Requirement analysis defines what data should be input into the system (the operational and technical data, risk documents and files) and what data should be output by it, which includes the risk reports and risk documents/files to be used by decision makers for risk mitigation in IT organizations. b. System design This phase determines the system architecture to be used in deploying the risk mitigation system: a 3-tier client/server, multi-platform architecture was used to run the system. This phase also determines the system interface design; the interface is simple and easy to use for making risk mitigation decisions, as seen in Figure 6 and Figure 7 in this paper. c.

Implementation

This phase involves the coding of the risk mitigation system (RMS) using HTML5, CSS, PHP, MySQL and JavaScript to support practitioners in mitigating operational and technical risks in IT organizations. d. Testing This phase involves running the developed RMS in the XAMPP environment to test the implemented system against the requirements and to detect any errors and bugs in the tool.


e.

Evaluation

This phase considers the future functions that can be integrated into the system and what improvements can be made; it covers the evolution of the RMS in response to changing practitioner needs. 4.2 Risk mitigation system pseudocode The RMS tool uses software agents to collect practitioners' inputs, risk history and procedures, which can assist in mitigating each risk in the operational and technical risk table separately. The risk table is a list of risks encountered by the agent in previous projects; when a particular risk is run for the first time, it has no entries in the risk table. The table is formulated by learning the environment: the agent observes changing trends in IT organizations and suggests suitable modifications to the procedures. From this experience, the learning element can formulate a rule as to which risk mitigating methods should be retained, which removed, which are outdated, etc. [33]. Each risk has its own risk mitigation advice and monitoring process. The multi-software agents also possess learning ability, as seen in the pseudocode below:

function risk-based-learning-agent(percept) returns an action
  static: percepts, a sequence, initially empty
          history: generic operational & technical risks, a sequence of possible risks in any project
  append percept and the generic operational & technical risks to the end of percepts
  return action

Step 1: Pre: Knowledgebase is a metadata structure referring to a valid knowledge base
        Post: knowledge base risk status
Step 2: Return: Boolean; false if the risk knowledge base is empty, true if it contains data
Step 3: If (risk details are not new)
          Result = false
        Else
          Result = true
        Return result of risk knowledge base query
End

b) Measurement agent The measurement agent assists in measuring each risk based on its impact and probability, and assigns a colour to the risk based on that measurement.
The measurement agent saves the result of the measured risk to assist practitioners in risk decisions when mitigating both operational and technical risks in an IT organization.

Algorithm for Measurement Agent
Agent RiskMeasurement(K, P, I)
  {K: knowledge base; P: risk probability; I: risk impact;
   RISK: information to be measured by the measuring agent}
Step 1: {Measure the risk probability}
  If P = 0 then
    Print ('This operational or technical risk cannot be measured')
    Return
Step 2: {Measure the impact of the risk}
  RISK = K[I]
Step 3: {Compare probability and impact}
  If P = I then R = Low
  Else if P >= I then R = Medium
  Else if P > 1 and I > 1 then R = High
  {Otherwise the measuring agent reverses the comparison}
  Else
    If I = P then R = Low
    Else if I >= P then R = Medium
    Else if I > P and P > 1 then R = High
Return (RISK)

4.4.2 a)

Best practice suggestion module

Evaluation agent

The evaluation agent retrieves the risk impact and risk probability results from the risk measurement agent, together with the best practices for mitigating the identified risk.

Algorithm for Evaluation Agent
Evaluation Agent (E)
  {K: knowledge base; R: risk evaluation result}
Step 1: The evaluation agent stores the risk evaluation details


{Check the knowledge base to see whether the risk already exists}
If R = K then
  Print ('Risk already exists in the knowledge base')
  Print ('Do you wish to add a risk evaluation comment?')
  Return
Step 2: {Increment the risk pointer details}
  R = R + 1 [K]
Step 3: {Check the risk values}
If R > SIZE then
  Print ('Knowledgebase is full')
  Return
Step 2: {Increment the rear risk}
  k = R - F + 1
Step 3: {Add the new risk information at the rear end of the knowledge base risk table}
  K = R + 1
Step 4: {If the knowledge base is initially empty, the agent adjusts the front, r = Risk}
If F = 0 then
  Return (Risk)
  Print ('Knowledgebase is empty; operational or technical risk data can be added, please proceed')

4.4.4 a)

Reuse of risk data module

Knowledge conversion agent

The knowledge conversion agent mainly converts the data from the MySQL knowledge base to HTML format for practitioners making risk decisions to mitigate risk in an IT organization. Looping over the risks is used to arrange them in the knowledge base: once a new risk is added by the expert, the knowledge base is restructured and the new risk is placed at the front of the risk list. When a risk is searched for, the knowledge conversion agent converts and displays the newest risks first, based on which entry was added last.
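The front-insertion, newest-first ordering described above can be sketched in Python (an illustrative stand-in for the PHP/MySQL implementation; the class name `RiskList` is ours):

```python
# Sketch of newest-first ordering: each new risk is inserted at the front
# of the list, so a search displays the most recently added entry first.

from collections import deque

class RiskList:
    def __init__(self):
        self._risks = deque()

    def add(self, risk: str) -> None:
        self._risks.appendleft(risk)   # newest risk goes to the front

    def display(self) -> list:
        return list(self._risks)       # newest-first view for the HTML page

risks = RiskList()
risks.add("software failure")
risks.add("malware attack")
print(risks.display())  # ['malware attack', 'software failure']
```

A `deque` makes the front insertion O(1); in the actual system the same effect would come from the ordering of rows returned by the MySQL query.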


Algorithm for Knowledge Conversion Agent
Knowledge Conversion agent queue (ref risk)
Step 1: Pre: risk is a metadata structure referring to a valid risk
  {Check that the risk knowledge base is not empty}
  If the knowledge base is not empty
Step 2: Proceed to retrieve
  (risk.front not null)
  risk.front = risk.front->next
  Convert (risk format)
Step 3: End conversion
  If risk.front = 1 then
    Print ('Risk Report Successfully Generated')
  Else
    Print ('Knowledgebase is empty')
  Return
End risk report conversion

4.4.5 a)

Risk decision module

Risk monitoring agent

The monitoring agent updates the practitioners involved in mitigating a risk by automatically sending an e-mail notification about that risk whenever any practitioner adds new information to it. The monitoring agent makes sure that new risk data are added to the risk report, to aid risk decisions in mitigating risk in an IT organization.

Algorithm for Risk Monitoring Agent
Monitoring agent (val Email) system users
Step 1: Risk mitigation data is in the knowledge base
  If the risk is being treated:
    System sends the initial risk report
    System selects the expert user
    Print ('Initial Risk Treatment Status Successfully Sent')
    Return available risk report
Step 2: If (risk treatment successfully implemented)
    Email Result = true
    Print ('Risk Treatment Update Successfully Sent')
Step 3: Else if the expert user adds a new risk comment
    Send the e-mail risk comment result
    Print ('Risk Comment has been added')
Step 4: Else, if there is no new risk update
    Return the existing report

b) Risk report agent The report agent takes the HTML report that the practitioner views on his or her computer or device and generates a PDF version of the report, using JavaScript, for risk decisions in mitigating risk. The risk mitigation system has been developed using software agents and knowledge mapping; seven agents are incorporated to assist decision making in mitigating operational and technical risks in an IT organization. Figure 5 shows the risk mitigation system architecture for making risk decisions to mitigate risk. The modules and risk agents in the system architecture are described below; the software agents communicate and work together, providing support to practitioners in risk decisions.
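The monitoring agent's branching can be sketched in Python. This is illustrative only: `send_email` is a stub standing in for a real mail call, and the state names are our assumptions rather than the paper's.

```python
# Sketch of the monitoring agent's notification logic. `send_email` is a
# stub for an SMTP call; the state names are illustrative assumptions.

def send_email(recipient: str, subject: str) -> str:
    # A real implementation would use an SMTP client here.
    return f"email to {recipient}: {subject}"

def monitor_risk(state: str, expert: str) -> str:
    if state == "treatment_started":
        return send_email(expert, "Initial Risk Treatment Status Successfully Sent")
    if state == "treatment_implemented":
        return send_email(expert, "Risk Treatment Update Successfully Sent")
    if state == "comment_added":
        return send_email(expert, "Risk Comment has been added")
    return "no new risk update: existing report returned"

print(monitor_risk("comment_added", "expert@example.org"))
```

Each branch mirrors one step of the algorithm above: initial report, treatment update, new comment, or fall-through to the existing report.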



Algorithm for Risk Report Agent
Agent GenerateRiskReport (R)
  {R: agent}
Step 1: {Check the risk mitigation information values}
  If Risk Data > 0
Step 2: {There is existing risk information; display the risk information values}
  For R: Generate report value F-Report to R-Report
  Print (R)
  Return existing report
  R = R + 1

Other components of the system architecture include the following. 4.4.6

Knowledge base

The knowledge base provides support for the reuse of lessons learnt and best practice knowledge from previous projects, providing assistance and expertise in an effective way to mitigate risk. This knowledge can be useful to team members who are not familiar with the current risks; the knowledge base is quantified using the opinions of experts, and less experienced users can benefit from access to this expertise. The knowledge base contains a knowledge map, which shows which knowledge is used, which risk that knowledge is useful for, and what its relationship is with other knowledge. The knowledge base is also updated and edited by removing outdated knowledge and collecting decision makers' feedback about the application of knowledge in their decisions [36]. 4.4.7

Experts/operational/technical data

This is the risk data (tacit knowledge) that the experts add into the risk mitigation system. This knowledge is stored in the knowledge base and used for future risk mitigations in IT organizations [36]. 4.4.8

Decision makers

These are the practitioners, i.e. decision makers, staff, managers or stakeholders, who use the system to search for a risk by sending a request. A decision maker sends the request from a browser such as Internet Explorer through the query interface, clicking the search button to confirm the information entered in the query form; the report agent then sends the request to the knowledge base and translates the result into PDF documents (explicit knowledge) [36]. 4.4.9

Mapping process

The mapping technique involves storing data into the knowledge base and is responsible for its maintenance. Knowledge mapping is used in this process to store risk mitigation strategies in the knowledge base; it may update existing knowledge that is outdated and no longer in use, and can also remove knowledge so identified by the experts. With the knowledge map, agents can retrieve relevant knowledge for decision makers more effectively, because the map shows the relationships between pieces of knowledge and their usage. A knowledge map provides indexes to the real knowledge, whether it is an actual map or a cleverly constructed database; it is a guide, like the Yellow Pages, that shows where to find resources and knowledge. The knowledge map transforms tacit knowledge into explicit knowledge, and such transformation can clearly display tacit knowledge through texts, categories and graphics. Because knowledge mapping is both a method of knowledge expression and a mechanism of knowledge storage, mind mapping is applied to build a knowledge map that can be easily understood by staff and management in the organization [36]. 4.4.10

RMS interface

This phase in the software development life cycle (SDLC) involves coding, deploying and running the risk mitigation system (RMS) on a local XAMPP server to ensure that the system is operational. The phase confirms that the system can assist practitioners in risk decisions for mitigating risk in IT organizations; this section therefore


shows how risk decisions are carried out for mitigating risk in IT organizations. The identified risk decision process is: measure risk impact and probability, suggest best practices, store risk data and reuse risk data. The implemented risk mitigation system (RMS) is based on this process which, as shown in Figure 2 of this paper, supports practitioners in deciding what action to take in mitigating an identified risk. 4.4.11

Measure risk impact and probability

To mitigate operational and technical risks, the risk experts add the operational and technical risks that occur in the organization, such as a computer crash, software failure or error, malware attack, communication infiltration, social engineering attack, technical failure, theft, misuse of system resources, etc. These risks are added into the enterprise knowledge base by the knowledge mapping process. The expert also selects the risk impact and probability scales, as shown in Figure 6 and Figure 7; the measurement and analysis agent then measures the impact and probability of the risk and assigns a colour to each based on the rating entered by the experts.
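The colour assignment can be sketched as a simple traffic-light mapping (an illustrative Python stand-in; the exact colours and cut-offs are assumptions, since the paper shows them only in Figures 6 to 8):

```python
# Illustrative traffic-light colouring of a 1-10 impact/probability rating.
# The colours and thresholds are assumptions, not taken from the paper.

def rating_colour(score: int) -> str:
    if not 1 <= score <= 10:
        raise ValueError("rating must be on a 1-10 scale")
    if score >= 7:
        return "red"    # high
    if score >= 4:
        return "amber"  # moderate
    return "green"      # low

print(rating_colour(9))  # red
print(rating_colour(2))  # green
```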

Figure. 6 Risk impact selection

Figure. 7 Risk probability selection



Figure. 8 Measurement and analysis agent measures risk The best practices from previous risk mitigations carried out by risk experts are mapped and stored in the knowledge base. Current practitioners can thus search for information on how to mitigate a risk and apply this knowledge in mitigating present operational and technical risks, as seen in Figure 9.

Figure. 9 Best practice suggestion 4.4.12

Best practice suggestion

The best practices are stored in the knowledge base. The knowledge collection agent assists in mapping practitioners' searches for strategies on how to mitigate and reduce operational and technical risks, utilizing the risk mitigation system (RMS), as shown in Figure 10.



Figure. 10 Risk comment for best practice suggestion 4.4.13

Reuse of risk data

The reuse of risk data allows practitioners to download an e-report containing information on how to mitigate and reduce operational and technical risks. The report comprises the risk impact, risk probability and other risk mitigation information used for making decisions on how to mitigate each risk.

Figure. 11 Reuse of risk data for decision making 4.4.14

Risk report for decision making

Figure 12 shows the interface of the risk report. This module shows the information on how treatment of the identified risk is to be carried out, applying the reuse concept via the reuse of risk data. The data from the knowledge base are converted to HTML format for practitioners making risk decisions to mitigate risk.
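The knowledge-base-to-HTML conversion behind the report can be sketched in Python (an illustrative stand-in for the PHP conversion; the record fields and the table markup are assumptions):

```python
# Sketch of converting knowledge base records into an HTML risk report.
# Field names and markup are illustrative, not the paper's schema.

def records_to_html(records: list) -> str:
    if not records:
        return "<p>Knowledgebase is empty</p>"
    rows = "".join(
        f"<tr><td>{r['risk']}</td><td>{r['impact']}</td>"
        f"<td>{r['probability']}</td></tr>"
        for r in records
    )
    return ("<table><tr><th>Risk</th><th>Impact</th>"
            "<th>Probability</th></tr>" + rows + "</table>")

html = records_to_html([{"risk": "theft", "impact": 7, "probability": 3}])
print(html)
```

In the actual system the rows would come from a MySQL query, and a JavaScript step would then render the same report to PDF.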



Figure. 12 Risk report for risk decisions for mitigating risk 5.

DISCUSSION AND CONCLUSION

The risk mitigation system architecture developed in Figure 5 was used to implement the risk mitigation system (RMS). The SDLC methodology was followed in developing the agent-based risk system; its process involves requirement analysis, system design, implementation, testing and evolution. The RMS uses a probability and impact matrix to assist the agents in measuring risks, in order to support practitioners in making risk decisions relating to risk mitigation in IT organizations. The risk mitigation tool was programmed using PHP and MySQL for the agents and the knowledge base, and JavaScript for a usable and robust system; MySQL over the TCP/IP network protocol connects the risk knowledge base to PHP using the knowledge mapping process. The risk measurement component is called to access the necessary information from the knowledge base, such as the probability and impact of each risk description, to show the magnitude of the risk. The prototype allows experts to store and manage data on the system, analyzing each potential risk and its description and showing a potential risk mitigation result for each risk. The RMS tool has thus been developed to tackle the limitations of the existing risk mitigation tools: it produces risk reports for risk communication and risk tracking in a user-friendly manner, has inbuilt flexibility for reconfiguration, is platform independent and gives users control over system inputs. RMS caters mainly for mitigating risk, since existing tools do not establish mechanisms for mapping and storing the risk treatment processes related to IT processes so that the gathered knowledge can be continuously updated. The tool produced results during the risk decision phase by providing support to decision makers.
Therefore, RMS is developed to identify, measure and treat risks, and to provide support to practitioners in mitigating perceived operational and technical risks that occur in IT organizations. The core of the research is the reasoning methodology, which not only supports the decision-making process of the user but also aids the risk knowledge retrieval, storage, sharing and updating processes of the IT organization. Therefore,

JOURNAL OF SOFTWARE ENGINEERING & INTELLIGENT SYSTEMS, ISSN 2518-8739, 31st December 2016, Volume 1, Issue 2, JSEIS, CAOMEI, Copyright © 2016, www.jseis.org

RMS aims to provide a systematic approach to mitigating potential risk in IT organizations, thus preventing system failures. It is designed to be used as a decision support tool for decision makers such as practitioners, IT experts and management. The RMS tool possesses the capability to support practitioners in mitigating risks; it can capture and reuse the lessons learnt from previous experience and best practices, utilizing and sharing previous as well as existing knowledge and experience to assist decision making via the mapping of knowledge. Future work involves extending the research to other organizations and mitigating further risks, such as strategic and environmental risks.

ACKNOWLEDGEMENT

We are grateful to the Ministry of Education Malaysia for the financial support of this project.

REFERENCES

1. Noraini, C. P., Bokolo, A. J., Rozi, N. H. N. and Masrah, A. A. M. 2015. A Review on Risk Mitigation of IT Governance. Information Technology Journal. 14(1). 1-9.
2. Junchao, X., Osterweil, L. J., Chen, J., Wang, Q. and Mingshu, L. 2013. Search Based Risk Mitigation Planning in Project Portfolio Management. ICSSP 2013, May 18-19. 146-155.
3. Seung, H. H., Kima, D. Y., Kim, H. and Jang, W. S. 2008. A web-based integrated system for international project risk management. Journal of Automation in Construction. 9(26). 342-356.
4. Khoo, Y. B., Zhou, M. and Kayis, B. 2005. An agent-based risk management tool for concurrent engineering projects. Complexity International Journal. 12(1). 1-11.
5. Kayis, I. B., Zhou, M., Savci, S., Khoo, Y. B., Ahmed, A., Kusumo, R. and Rispler, A. 2007. IRMAS - development of a risk management tool for collaborative multi-site, multi-partner new product development projects. Journal of Manufacturing Technology Management. 18(4). 387-414.
6. Jabiri, K. B., Magnussen, C., Tarimo, C. N. and Yngström, L. 2008. The Mitigation of ICT Risks Using Emitl Tool: An Empirical Study. IFIP TC-11 WG11.1 & WG11.5 Joint Working Conference, 157-173.
7. Saint, G. R. 2005. Information Security Management Best Practice Based on ISO/IEC 17799. Information Management Journal. 39(1). 60-66.
8. ITGI. 2004. COBIT 3rd edition. Executive Summary, USA.
9. Jakub, M. and Janusz, G. 2001. Software support for collaborative risk management. Proceedings of the 8th International Conference on Advanced Computer Systems, October 17-19, 2001, Mielno, Poland, 1-9.
10. Lientz, B. P. and Larssen, L. 2006. Risk Management for IT Projects: How to Deal with Over 150 Issues and Risks. Risk Management Practices. 1(1). 1-15.
11. Basit, S., Al, O. Y. and Abdullah, A. 2011. Trivial model for mitigation of risks in software development life cycle. International Journal of the Physical Sciences. 7(1). 2072-2082.
12. Mohd, N. F., Banwet, D. K. and Shankar, R. 2007. Information risks management in supply chains: an assessment and mitigation framework. Journal of Enterprise Information Management. 20(1). 677-699.
13. Ahdieh, S. K. and Ow, S. H. 2012. An Innovative Model for Optimizing Software Risk Mitigation Plan: A Case Study. Sixth Asia Modelling Symposium, IEEE Computer Society, 220-224.
14. Pankaj, R. S., Whiteman, L. E. and Malzahn, D. 2004. Methodology to mitigate supplier risk in an aerospace supply chain. Supply Chain Management: An International Journal. 9(1). 154-168.
15. Junchao, X., Leon, J. O., Jie, C., Qing, W. and Mingshu, L. 2013. Search Based Risk Mitigation Planning in Project Portfolio Management. International Conference on Small Science (ICSS), 146-155.
16. Ahdieh, K., Hashemitaba, N. and Ow, S. H. 2014. A Novel Model for Software Risk Mitigation Plan to Improve the Fault Tolerance Process. IJITCM. 1(1). 38-42.
17. Vu, T. and Liu, D. B. 2007. A Risk-Mitigating Model for the Development of Reliable and Maintainable Large-Scale Commercial-Off-The-Shelf Integrated Software Systems. IEEE Proceedings Annual Reliability and Maintainability Symposium, 361-367.
18. Xu, R., Pei, Y. N. S., Ying, L. H. Q. and Yun, T. L. 2005. Optimizing Software Process Based on Risk Assessment and Control. Proceedings of the Fifth International Conference on Computer and Information Technology, 1-5.
19. Emmanuele, Z., Bolzoni, D., Etalle, S. and Salvato, M. 2009. Model-Based Mitigation of Availability Risks. IEEE Conference, 75-83.
20. Ayad, A. K. and Hashim, K. 2000. A model and prototype tool to manage software risks. IEEE International Conference, 297-305.

21. Michael, R. L., Jinsong, S. Y., Keramidas, E. and Dalal, S. R. 1995. ARMOR: Analyser for Reducing Module Operational Risk. IEEE Conference, 137-142.
22. Thamer, A., Sulaiman, S. and Salam, R. A. 2009. Project Management Using Risk Identification Architecture Pattern (RIAP) Model: A Case Study on a Web-based Application. 16th Asia-Pacific Software Engineering Conference, 449-456.
23. Rajes, S. H. and Alexander, S. M. 2009. Application of Web Based Supplier Risk Assessment for Supplier Selection. Proceedings of the 2009 Industrial Engineering Research Conference, 2259-2264.
24. Gulcin, Y., Cebi, S., Hoegec, B. and Ozok, A. F. 2011. A fuzzy risk assessment model for hospital information system implementation. Expert Systems with Applications. 39(1). 1211-1218.
25. Pratim, D. and Acar, W. 2010. Software and human agents in Knowledge Codification. Knowledge Management Research and Practice. 8(1). 45-60.
26. Suresh, R. H. and Egbu, C. O. 2004. Knowledge mapping: concepts and benefits for a sustainable urban environment. 20th Annual ARCOM Conference, 905-916.
27. Gangcheol, Y., Dohyoung, S., Hansoo, K. and Sangyoub, L. 2011. Knowledge-mapping model for construction project organizations. Journal of Knowledge Management. 15(1). 528-548.
28. Ali, S. S. B., Masoumeh, Z. and Mohd, Z. A. R. 2014. A Comprehensive Review of Knowledge Mapping Techniques. Journal of Information Systems Research and Innovation. 1(1). 71-76.
29. John, D., Isaac, N. and Admire, K. 2009. Intelligent Risk Management Tools for Software Development. SACLA, ACM, 33-40.
30. Mihalis, G. and Michalis, L. 2011. A multi-agent based framework for supply chain risk management. Journal of Purchasing and Supply Management. 3(1). 23-3.
31. Fu, R., Yue, X., Song, M. and Xin, Z. 2008. An architecture of knowledge management system based on agent and ontology. The Journal of China Universities of Posts and Telecommunications. 15(1). 126-130.
32. Javier, B., María, L. B., Juan, P., Juan, M. C. and María, A. P. 2012. A multi-agent system for web-based risk management in small and medium business. Expert Systems with Applications. 39(1). 6921-6931.
33. Shikha, R. and Selvarani, R. 2012. An Efficient Method of Risk Assessment using Intelligent Agents. Second International Conference on Advanced Computing and Communication Technologies, 123-126.
34. Noraini, C. P., Bokolo, A. J., Rozi, N. H. N. and Yusmadi, Y. J. 2015. Proposing a Model on Risk Mitigation in IT Governance. Proceedings of the 5th International Conference on Computing and Informatics (ICOCI 2015), 11-13 August 2015, Istanbul, Turkey, 737-742.
35. Bokolo, A. J. and Noraini, N. C. 2015. A Review on Tools of Risk Mitigation for Information Technology Management. Journal of Theoretical and Applied Information Technology. 11(1). 92-101.
36. Bokolo, A. J., Noraini, C. P., Teh, M. A., Rozi, N. H. N. and Yusmadi, Y. J. 2015. Autonomic Computing Systems Utilizing Agents for Risk Mitigation of IT Governance. Jurnal Teknologi. 77(18). 49-60.

SPHandler: Multi-criteria based intelligent stakeholder quantification system for value-based software

1Muhammad Imran Babar, 2Masitah Ghazali, 3Dayang N.A. Jawawi, 4Falak Sher

1Army Public College of Management and Sciences, Rawalpindi, Pakistan.
1,2,3,4Faculty of Computing, UTM, 81310 Johor, Malaysia.
Email: [email protected], [email protected], [email protected], [email protected]

ABSTRACT

Value-based is a popular term for software systems which deal with financial matters. For the development of value-based software (VBS) systems, the software requirements engineering (RE) process must be robust. The RE process is concerned with the exploration, verification and validation of key software requirements for a given VBS system. Highly valuable requirements are required in order to design a VBS product, and such a set of requirements can only be obtained from key stakeholders. Stakeholders are the entities considered vital to the success or failure of a software system. Researchers have presented different approaches in the domain of stakeholder analysis. The existing stakeholder analysis approaches address the problem in a diverse range of dimensions; their processes vary from one another, and their use of stakeholder attributes is not uniform. The approaches are highly obscure in terms of guidelines for the value quantification of stakeholders. In this research, a neuro-fuzzy inspired intelligent decision support system, SPHandler, is proposed for the identification and quantification of VBS stakeholders using stakeholder metrics, a neural network and fuzzy c-means, in order to cope with the issue of stakeholder quantification. The proposed system is efficient in terms of performance and reduces the bias induced by experts during stakeholder evaluation.

Keywords: software requirements engineering; value-based software; stakeholders; decision support system; artificial intelligence; neural networks; fuzzy c-means; clustering

1. INTRODUCTION

The term value-based is associated with value-based software engineering (VBSE). Value-based software (VBS) systems are part of VBSE and are developed based on economic theory. Boehm describes VBSE as "the explicit concern with value concerns in the application of science and mathematics by which properties of computer software are made useful to the people" [1]. The value of a VBS system is measured in terms of its economic worth or market leverage; it is also assessed on the basis of its human services [2]. The economic aspect of VBS systems distinguishes them from other traditional software applications. In VBS development, value-based approaches are applied in order to understand the different economic aspects of the VBS system. "The value-based approach to software development integrates value considerations into current and emerging software engineering principles and practices, while developing an overall framework in which these techniques compatibly reinforce each other" [3]. In VBS systems, an innovative idea is introduced for realization. Requirements engineering (RE) is a set of core software engineering (SE) practices related to software requirements and is a sub-domain of SE. In VBS development, value-based RE (VBRE) approaches are adopted. Brooks has said, "the hardest single part of building a software system is deciding precisely what to build... Therefore, the most important function that the software builder performs for the client is the iterative extraction and refinement of the product requirements" [4]. Different researchers have defined RE differently. As stated by Zave, "requirements engineering is the branch of software engineering concerned with the real-world goals for, functions of, and constraints on software systems.
It is also concerned with the relationship of these factors to precise specifications of software behavior, and to their evolution over time and across software families" [5]. A requirement is written in textual form, is elicited from the stakeholders and developers, and the interrelationships among requirements are managed manually [6]. Keeping in view current business scenarios, stakeholders are seeking innovative business solutions for different software applications. Innovation is a cause of complexity in the design and development of innovative, value-added systems; this complexity results from unclear user requirements and objectives. The complexity of the requirements can be determined through functional and non-functional aspects, which are termed functional requirements (FRs) and non-FRs (NFRs) or quality features [7]. Due to this trend of innovation, different innovative RE techniques are considered critical tools for removing ambiguity in the requirements [8]. The requirements elicitation phase (REP) is considered highly significant in RE [4, 9]. RE is a challenging discipline in software development for making a software product successful [10] and is associated with stakeholder identification, requirements elicitation and implementation of the elicited requirements [11].

The success of a VBS system is directly associated with a set of key FRs and NFRs. If a software application or system meets the basic requirements criteria, then the software project is successful, and vice versa. The key set of valuable requirements can only be obtained from key stakeholders after applying core VBRE practices during the different VBRE processes. Well-managed requirements have an immense effect on the quality of the final software product [12, 13], and they can only be obtained after segregating the critical stakeholders from the whole universe in the domain of a given software project. Tom Gilb defines a stakeholder as "any person or organizational group with an interest in, or ability to affect, the system or its environment" [14]. Stakeholders are key players in the RE process and exert an effective influence on the success of the software project. In order to make the project successful, the functional features of the software must be aligned with the critical needs or requirements of the users. The key functional aspects in VBS development can only be achieved by involving critical stakeholders in the RE process. Stakeholders usually belong to different cultures and domains; hence, it is necessary to analyse and prioritize the stakeholders for professional software development [11, 15]. The users or stakeholders can only be satisfied if the system meets all their key needs [16, 17]. A software system of good quality can only be achieved if the requirements are of high value. Babar et al. state that "the software quality is measured based on the performance of the system, the services provided by the system and the acceptance of a system by the intended community" [18].
Different stakeholders interpret needs differently, which makes the decision process complicated. Hence, it is necessary to find key stakeholders and add them to the software development life cycle, where they may play a vital role in adding value to the software project. The rest of the paper is organized as follows: Section 2 analyses the current literature related to this research, Section 3 formulates the stakeholder problem and outlines a solution for it, Section 4 gives a detailed description of fuzzy c-means and the proposed system, and the final sections describe the results and conclude the study.

2. RELATED WORKS

Researchers have presented different stakeholder identification and quantification (SIQ) approaches for the identification of critical stakeholders. The existing SIQ approaches use different stakeholder analysis processes and attributes for identification and quantification. One of the most prominent theories in stakeholder analysis is Mitchell's theory. Mitchell divides stakeholders into eight major categories based on the stakeholder attributes of power, legitimacy and urgency: dominant stakeholders, discretionary stakeholders, dormant stakeholders, dependent stakeholders, demanding stakeholders, non-stakeholders, definitive stakeholders and dangerous stakeholders [19]. According to Mitchell's stakeholder theory, a stakeholder may possess any one, any two, or all three of the attributes of power, legitimacy and urgency [19]. "A group has power to the extent it has access to coercive, utilitarian or normative means for imposing its will in the relationship. Legitimacy is a social good and more over-arching than individual self-perception and is shared amongst groups, communities or cultures" [20]. Figure 1 depicts Mitchell's stakeholder salience theory.

Figure 1. Mitchell's stakeholder salience theory [19]
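The eight categories in Figure 1 follow directly from which subset of the three attributes a stakeholder possesses. A minimal Python sketch of that mapping is shown below; the attribute-to-category pairing reflects the standard reading of Mitchell's typology and is included for illustration only.

```python
# Illustrative encoding of Mitchell's salience typology: each of the eight
# categories corresponds to one subset of {power, legitimacy, urgency}.

MITCHELL_CATEGORIES = {
    frozenset(): "non-stakeholder",
    frozenset({"power"}): "dormant",
    frozenset({"legitimacy"}): "discretionary",
    frozenset({"urgency"}): "demanding",
    frozenset({"power", "legitimacy"}): "dominant",
    frozenset({"power", "urgency"}): "dangerous",
    frozenset({"legitimacy", "urgency"}): "dependent",
    frozenset({"power", "legitimacy", "urgency"}): "definitive",
}

def classify(attributes) -> str:
    """Return the salience category for the attributes a stakeholder possesses."""
    return MITCHELL_CATEGORIES[frozenset(attributes)]

print(classify({"power", "legitimacy", "urgency"}))  # -> definitive
print(classify({"power", "urgency"}))                # -> dangerous
```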

Pacheco and Garcia state that "there is still no Stakeholder Identification Process (SIP) framework or uniform description" [21-23]. Among the current SIQ approaches, it is very difficult to decide which approach is best for a given scenario. Of all the stakeholder analysis approaches, only a few, perhaps one or two, focus on the prioritization of stakeholders, while the rest focus on their identification. Using current approaches, the identification and selection of valuable stakeholders is very difficult. The stakeholder analysis approaches identify stakeholders based on their relationships, roles and influence [15, 19, 24, 25] without giving details at the process or activity level; they provide a very high-level picture of the business entities. There are some approaches that do not take the aspects of relationships, roles and influence into account at all [26, 27]. The methods used in [28] for the identification of stakeholders in an agile environment are Freeman's method [29, 30] and Mitchell's model [19]. Freeman's method divides stakeholders into two groups at a higher level of abstraction: primary stakeholders and secondary stakeholders. In [31, 32] an approach based on roles and types is presented for the identification and selection of stakeholders at the inter-organizational level. The stakeholder features considered in this approach are function, knowledge abilities, geographical position and level in the hierarchy. Though the approach is powerful, the issue is how one will measure the level of competency among different stakeholders using these four features. A problem detected after implementation of the approach is associated with the profiles of different stakeholders.
The four features are used to build the profiles of different stakeholders, but different stakeholders may have the same profile with different ambitions or FRs. The approach is designed to find all possible stakeholders for a given software system, due to which a lot of time is consumed; the approach is therefore not cost efficient. Moreover, the selection of stakeholders is not based on their prioritization or quantification. The PisoSIA® (Stakeholder Identification and Analysis) approach is an extension of the existing approach called PISO® (Process Improvement for Strategic Objectives) [33]. The only suggestion given in the PISO® framework is associated with the importance of stakeholders and their analysis. The extension of the PISO® framework only supports stakeholder identification and analysis and the impact of the identified stakeholders on the quality of the system. PisoSIA® is used to identify stakeholders after incorporating a change in an information system; the impact of that change on current stakeholders is then analysed or measured. The incorporation of change helps in the identification of new stakeholders or business entities who may add value to the system. For stakeholder identification, Mitchell's model is integrated with PISO®, which helps in the adoption of stakeholder attributes for their selection. The focus of both PISO® and PisoSIA® is stakeholder engagement, without any consideration of stakeholder importance. The effectiveness of the PisoSIA® approach depends on its early findings: if the early findings are correct then the approach will be effective, and vice versa. In the case of PisoSIA® the early findings are incorrect, due to which the actual efficiency of the technique is difficult to calculate. A study called "ERP-implementation project from a stakeholder perspective" is conducted in [20].
In that research, Mitchell's model is used to classify stakeholders at a higher level of abstraction. As with PisoSIA®, a change is incorporated in order to identify new stakeholders and to measure the impact of the change on existing stakeholders: a change is incorporated in an ERP system and its impact on the stakeholders is analysed. There is no novel contribution regarding the stakeholder identification and quantification process (SIQP) that may help in the selection of valuable stakeholders for VBS systems, and SIQP is not the direct focus of that research. "Stakeholder identification precedes any other RE activity: we must first determine who they are and how important they are" [34]. Glinz and Wieringa present different features for the identification of stakeholders, including interest in the system; the need to manage, introduce, operate, or maintain the system; involvement in the development; business responsibility; financial interest; constraining the system as regulators; and being negatively affected by the system. The stakeholders are quantified into three major categories: critical, major and minor. For stakeholder identification and classification, process-level details are not given; thus, it is difficult to adopt the model when a project or product requires an agile environment for its execution, implementation or development. A framework for stakeholder identification and selection comprising three stages, identification, filtering and prioritization, is given in [35]. The framework is a conceptual overview of the aspects that must be considered essential during SIQP to make the RE process effective. It provides a very high-level picture, and "the framework may not be conclusive as it needs to be confirmed and refined further". The framework is generic and has not yet been implemented in real scenarios.
There are some other approaches that do not focus on the SIQP itself; rather, they discuss stakeholders in a casual way without due consideration. An overview of these approaches is given here.

In HyDRA [36], too much stress is placed on the viewpoints of the users in order to gain information about multiple requirements resources and requirements traceability links; there is no special consideration of stakeholders as such. In QSARA (Qualitative Systematic Approach to Requirements Analysis), the major stress is on cases where "stakeholders have little knowledge or understanding of the domain or the initial document is poorly structured" [37]. The approach is helpful in the identification of resource allocation and interdependencies, and it helps analysts construct a complete description of system features, but it lacks a solid SIQP and focuses on stakeholders who lack domain knowledge. The Activity Theory for Requirements Elicitation (ATRE) focuses on the contextualized activities of the stakeholders and defines general activities that are strongly based on the knowledge and skills of the stakeholders. ATRE is based on the concept of social property, which brings knowledge from the social sciences that can be useful in gaining new insights into the human context of a given software system. However, the elaboration of social activities is very difficult because "they require the collaboration of heterogeneous teams on wide topics" [38]. The approach directly involves the stakeholders in negotiating the requirements without providing any mechanism for SIQ. The same is the case with the research performed by Kasirun and Salim, which only focuses on the involvement of stakeholders during requirements elicitation based on aspects like activity, environment and tool support [39]; SIQP is not the focus of that research. The research study conducted by Woolridge et al. is also based on major stakeholder categories, and its SIQP is based on these high-level categories.
The categories are financial supporters, customers, internal stakeholders, external stakeholders, special interest stakeholders, and influencer stakeholders [40]. In that research, the major stress is on the risk imposed by the stakeholders, and the impact of stakeholder risk is calculated. Williams et al. have presented a study giving a list of enterprise stakeholders who may be involved in commercial development. The framework is proposed for enterprise software development coordination, in which the stakeholders are identified based on "their role and their name (e.g. Bob Smith: director of marketing)". This framework helps in the enhancement of "collaboration across enterprises engaged in software development projects" [41]. For existing work practices, it is not a hard and fast rule. A comprehensive literature review on the said stakeholder problem for VBS development is conducted in [18]. Babar et al. have stressed that there is a dire need for a new framework based on stakeholder metrics for VBS development, due to the distinguished nature of VBS systems. There are two types of SIQP problems: process problems and technical problems. The four major process problems of the SIQP are that it is highly complex, non-uniform, inconsistent, and time-consuming. The technical problems of the SIQP are the non-existence of metrics, the lack of low-level details and the use of limited stakeholder attributes. In this research, a fuzzy logic based intelligent system is proposed in order to identify and quantify the stakeholders of VBS systems. This research is a continuation of the research presented in [42].

3. PROBLEM FORMULATION

In this section, the stakeholder problem is formulated in order to solve the stakeholder quantification issue. Datasets about stakeholders do not currently exist in any database; hence, metrics are proposed for generating stakeholder analysis data, and these stakeholder metrics are used to quantify the stakeholders. The stakeholder value Sv is calculated from nine key stakeholder metrics, called stakeholder factors: the risk factor, communication factor, skill factor, instability factor, interest factor, personality factor, hierarchy factor, legitimacy factor and environmental factor. Finally, Sv is calculated by subtracting the value of the γ factors from the value of the β factors. However, in this research we do not subtract the γ factors from the β factors; instead, the values of the γ and β factors are used as outputs during training of the BPNN. Later, the predicted values of the γ and β factors are given to the fuzzy c-means algorithm in order to obtain the most critical cluster of stakeholders, which may add economic leverage to the development of a VBS system. The values of the stakeholder metrics or factors are based on stakeholder aspects, which are described here using the notation FactorF(AspectT), where F denotes a stakeholder factor or metric and T represents a stakeholder attribute or aspect. The stakeholder aspects are evaluated on a rating scale of 0 to 5. The stakeholder attributes considered for the two metrics were included after long discussions with industry professionals. Hence, there are two types of datasets: the stakeholders' skill factor dataset and the interest factor dataset.

Stakeholders' Risk Factor (FSR)

JOURNAL OF SOFTWARE ENGINEERING & INTELLIGENT SYSTEMS ISSN 2518-8739 31st December 2016, Volume 1, Issue 2, JSEIS, CAOMEI Copyright © 2016 www.jseis.org The stakeholders’ risk factor is comprised of the aspects like communication (TCM), interpretation (TIT), decision making (TDM), cognitive load (TCL), complexity (TCP), language barriers (TLB), time and geographic differences (TTG). 𝐹𝑆𝑅 = 0.2(𝑇𝐶𝑀 + 𝑇𝐼𝑇 + 𝑇𝐷𝑀 + 𝑇𝐶𝐿 + 𝑇𝐶𝑃 + 𝑇𝐿𝐵 + 𝑇𝑇𝐺 ) + 0.2 𝑛

𝐹𝑆𝑅 = 0.2 (∑ 𝐹𝑆𝑅𝑖 ) + 0.2

(1)

𝑖=1

In the equation 1, i is an element or aspect of the FSR and the total number of aspects is represented by n. The FSS is calculated using values of aspects that are in the range of 0 to 5 and FSR is in the range of 0.2 to 5.2. Stakeholders’ Instability Factor (FSI) The stakeholders’ instability factor is comprised of the aspects like immune to challenges (TCH), cognitive load or workload (TCL), and fatigue management (TFM). 𝐹𝑆𝐼 = 0.2 (𝑇𝐶𝐻 + 𝑇𝐶𝐿 + 𝑇𝐹𝑀 ) + 0.2 𝑛

𝐹𝑆𝐼 = 0.2 (∑ 𝐹𝑆𝐼𝑖 ) + 0.2

(2)

𝑖=1

In the equation 2, i is an element or aspect of the FSI and the total number of aspects is represented by n. The FSS is calculated using values of aspects that are in the range of 0 to 5 and FSI is in the range of 0.2 to 5.2. Stakeholders’ Communication Factor (FSC) The stakeholders’ communication factor is comprised of the aspects like clarity (TCR), objectivity (TOB), and self-confidence (TSC). 𝐹𝑆𝐶 = 0.2(𝑇𝐶𝑅 + 𝑇𝑂𝐵 + 𝑇𝑆𝐶 ) + 0.2 𝑛

𝐹𝑆𝐶 = 0.2 (∑ 𝐹𝑆𝐶𝑖 ) + 0.2

(3)

𝑖=1

In the equation 3, i is an element or aspect of the FSC and the total number of aspects is represented by n. The FSS is calculated using values of aspects that are in the range of 0 to 5 and FSC is in the range of 0.2 to 5.2. Stakeholders’ Skill Factor (FSS) The stakeholders’ skill factor is comprised of the aspects like experience (TEX), managerial abilities (TMA), domain knowledge (TDK), domain training (TDT), and self-esteem (TSE). 𝐹𝑆𝑆 = 0.2 (𝑇𝐸𝑋 + 𝑇𝑀𝐴 + 𝑇𝐷𝐾 + 𝑇𝐷𝑇 + 𝑇𝑆𝐸 ) + 0.2 𝑛

𝐹𝑆𝑆 = 0.2 (∑ 𝐹𝑆𝑆𝑖 ) + 0.2

(4)

𝑖=1

In the equation 4, i is an element or aspect of the FSS and the total number of aspects is represented by n. The FSS is calculated using values of aspects that are in the range of 0 to 5 and FSS is in the range of 0.2 to 5.2. Stakeholders’ Interest Factor (FSIT) The stakeholders’ interest factor is comprised of the aspects like domain scope knowledge (T DSK), business knowledge (TBK), and objectivity (TOB). 𝐹𝑆𝐼𝑇 = 0.2 (𝑇𝐷𝑆𝐾 + 𝑇𝐵𝐾 + 𝑇𝑂𝐵 ) + 0.2

85

JOURNAL OF SOFTWARE ENGINEERING & INTELLIGENT SYSTEMS ISSN 2518-8739 31st December 2016, Volume 1, Issue 2, JSEIS, CAOMEI Copyright © 2016 www.jseis.org 𝑛

𝐹𝑆𝐼𝑇 = 0.2 (∑ 𝐹𝑆𝐼𝑇𝑖 ) + 0.2

(5)

𝑖=1

In the equation 5, i is an element or aspect of the FSIT and the total number of aspects is represented by n. The FSS is calculated using values of aspects that are in the range of 0 to 5 and FSIT is in the range of 0.2 to 5.2. Stakeholders’ Personality Factor (FSP) The stakeholders’ personality factor is comprised of the aspects like cooperative (TCO), visionary (TVI), inspirer (TIN), performer (TPR), knowledge sharer (TKS), role model (TRM), and influence (TIN). 𝐹𝑆𝑃 = 0.2 (𝑇𝐶𝑂 + 𝑇𝑉𝐼 + 𝑇𝐼𝑁 + 𝑇𝑃𝑅 + 𝑇𝐾𝑆 + 𝑇𝑅𝑀 + 𝑇𝐼𝑁 ) + 0.2 𝑛

𝐹𝑆𝑃 = 0.2 (∑ 𝐹𝑆𝑃𝑖 ) + 0.2

(6)

𝑖=1

In Equation 6, i is an element or aspect of the FSP and n is the total number of aspects. The FSP is calculated using aspect values in the range of 0 to 5, so FSP lies in the range of 0.2 to 7.2.

Stakeholders' Hierarchy Factor (FSH)

The stakeholders' hierarchy factor is comprised of the aspects executive position ($T_{EP}$), mid-career ($T_{MC}$), and entry-career ($T_{EC}$). The value of FSH is assigned based on the current position of the stakeholder:

$T_{EP}$ = High = 5
$T_{MC}$ = Average = 3
$T_{EC}$ = Low = 2

$F_{SH} = T_{Val}$, where $T_{Val}$ is the value of the hierarchy aspect.

Stakeholders' Legitimacy Factor (FSLG)

This factor measures the intensity of the legitimacy of a stakeholder in terms of needs and is described as high, average, or low:

High = 5, Average = 3, Low = 2

$F_{SLG} = T_{Val}$, where $T_{Val}$ is the intensity of legitimacy.

Stakeholders' Environment Factor (FSE)

The stakeholders' environment factor is comprised of the aspects cognitive load ($T_{CL}$), fatigue management ($T_{FM}$), inspirer ($T_{IN}$), and knowledge sharer ($T_{KS}$):

$$F_{SE} = 0.2\,(T_{CL} + T_{FM} + T_{IN} + T_{KS}) + 0.2$$

$$F_{SE} = 0.2\left(\sum_{i=1}^{n} F_{SE_i}\right) + 0.2 \qquad (7)$$

In Equation 7, i is an element or aspect of the FSE and n is the total number of aspects. The FSE is calculated using aspect values in the range of 0 to 5, so FSE lies in the range of 0.2 to 4.2.


$$SV = (F_{SC} + F_{SS} + F_{SIT} + F_{SP} + F_{SH} + F_{SLG} + F_{SE}) - (F_{SR} + F_{SI})$$

$$SV = \sum_{i=1}^{n} \beta_i - \sum_{j=1}^{m} \gamma_j \qquad (8)$$

$$\beta = \{F_{SC}, F_{SS}, F_{SIT}, F_{SP}, F_{SH}, F_{SLG}, F_{SE}\} \qquad (9)$$

$$\gamma = \{F_{SR}, F_{SI}\} \qquad (10)$$

In Equation 8, β and γ are the factor values that represent the positive and negative aspects of a stakeholder, respectively. Two exceptions are defined on the values of the β factors and γ factors: if the value of the β factors is low, the value of the γ factors is expected to be high, and vice versa. However, the experimental data shows that in most cases these two exceptions do not hold, which makes Equation 8 complicate the decision process. Hence, in this research Equations 9 and 10 are used for stakeholder quantification. Initially, a back-propagation neural network is used to predict the values of the β and γ factors, and these values are given as input to the fuzzy c-means algorithm in order to identify the stakeholders.

4. PROPOSED INTELLIGENT SYSTEM: SPHANDLER

The proposed intelligent system SPHandler is comprised of the stakeholder metrics, the BPNN, and FCM. Figure 2 describes the proposed intelligent system for SIQ. The interface is the means of input to the system. The stakeholders' parameters are given to the system in order to compute the values of FSC, FSS, FSIT, FSP, FSH, FSLG, FSE, FSR and FSI. The computed values of the β factors and γ factors are stored in a database or in a text file and are given as data input to the BPNN, which predicts the β and γ values for stakeholders' classification. The predicted β and γ values in turn serve as input to fuzzy c-means, which forms the stakeholders' clusters for priority definition. For FCM initialization the required parameters are the number of clusters N, the exponent or fuzzification parameter E, the total number of iterations I, and the initial centroids. After initialization the centroids are computed and the membership values Uij are modified. If the Euclidean norm ||…|| of the change is less than ε the algorithm stops; otherwise the centroids are updated again using the centroid update equation (Equation 14) and the loop continues until the norm falls below ε. Here, ε is the predefined accuracy or convergence criterion. Once the threshold ε is achieved the system shows the results in the form of different clusters of the data; otherwise the algorithmic loop continues in order to achieve optimal results for the problem.
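The factor equations above (Equations 4 to 10) can be sketched in code as follows; the function names and aspect rankings are illustrative, not taken from the paper's implementation:

```python
# Sketch of the stakeholder metric computation (Equations 4-10),
# assuming each aspect is ranked on the 0-5 cardinal scale.

def factor(aspects):
    """Factor value per the paper's pattern: 0.2 * (sum of aspects) + 0.2."""
    assert all(0 <= a <= 5 for a in aspects)
    return 0.2 * sum(aspects) + 0.2

def beta_gamma(f_sc, f_ss, f_sit, f_sp, f_sh, f_slg, f_se, f_sr, f_si):
    """β aggregates the positive factors (Eq. 9); γ the negative ones (Eq. 10)."""
    beta = f_sc + f_ss + f_sit + f_sp + f_sh + f_slg + f_se
    gamma = f_sr + f_si
    return beta, gamma

# Example with illustrative aspect rankings for FSIT
# (domain scope knowledge, business knowledge, objectivity):
f_sit = factor([3, 4, 2])
beta, gamma = beta_gamma(0.2, 0.6, 1.4, 0.8, 3, 3, 1.6, 0.2, 0.2)
print(round(f_sit, 1))               # 2.0
print(round(beta, 1), round(gamma, 1))  # 10.6 0.4
```

The factor values used in the `beta_gamma` call correspond to the first row of the partial data sample in Table 1.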


Figure 2. Proposed intelligent system

5. EXPERIMENTAL SETUP

5.1 Back-propagation neural network

BPNN is a multilayer perceptron (MLP) normally comprised of one hidden layer [43]. The input layer serves as a backbone for the neurons, while the neurons in the hidden and output layers perform the computation. The signals in the NN propagate in a feed-forward mechanism. The computational heuristics of the hidden and output layers cannot be observed directly. Neural networks are information processing systems that resemble biological nervous systems [44, 45]. Neural networks (NN) are used for prediction and for solving highly complex and non-linear problems [46, 47]. In the case of SIQP, a high level of uncertainty exists due to the application of different opinions about stakeholders' attribute values, which makes the SIQP highly complex and non-linear. Current research lacks the application of heuristic approaches in the domain of SIQP. In the StakeMeter SIQ framework the stakeholder value Sv is calculated manually using the different stakeholder factors. The manual application of the StakeMeter framework parameters reduces the overall performance of the system and induces the bias of experts. Moreover, it is non-scalable for projects with a large number of stakeholders, which is a strong reason to propose an expert SIQ system. Figure 3 represents a 3-layered feed-forward BPNN. The bias b is added for learning purposes. The colourless circles on the far left represent the weighted input to the input layer. The multiplied weights w are initialized randomly, and the network is trained to reduce the mean squared error (MSE) in order to optimize the results. Over-training reduces the performance of the BPNN, and its occurrence mainly depends on a large number of training cycles [48]. The BPNN training consists of 4 major steps: initialization, activation, weight adjustment and iteration. Training in BPNN is an iterative process which reduces the MSE using the back-propagation algorithm in order to acquire optimized results.

Figure 3. Back-propagation neural network (BPNN)

For the calculation of the hidden-layer outputs from the given inputs, the sigmoid activation function is used, represented by the following equation:

$$y_j(p) = \text{sigmoid}\left[\sum_{i=1}^{n} x_i(p)\, w_{ij}(p) - \theta_j\right] \qquad (11)$$

In Equation 11, n is the total number of inputs for neuron j, and sigmoid denotes the sigmoid activation function. The output layer calculates the final output using the following equation:

$$y_k(p) = \text{sigmoid}\left[\sum_{j=1}^{m} x_{jk}(p)\, w_{jk}(p) - \theta_k\right] \qquad (12)$$

In Equation 12, m is the total number of inputs to neuron k from the hidden layer.

5.2 Data collection

The supervised learning approach for the BPNN is adopted in this research. Historical data is highly desirable in the case of a supervised machine learning approach; it is comprised of cases, examples and all class instances. In supervised learning the data is used to predict an object's class [49]. As data for stakeholders' quantification is not available, the StakeMeter framework's factors are used in this research. The training dataset is obtained from the proposed 9 factors, which are FSC, FSS, FSIT, FSP, FSH, FSLG, FSE, FSR and FSI. Each factor is comprised of different stakeholder aspects or attributes. A cardinal scale is used to rank these stakeholder aspects in the range of 0 to 5. Each stakeholder factor value is calculated using the values of its respective stakeholder aspects. The stakeholder values serve as input to the NN and are represented as: pi = {p1, p2, p3, …, pn}
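A minimal sketch of the forward pass in Equations 11 and 12, taking a factor vector like pi as input; the weights, thresholds, and input values are illustrative, not those of the trained network:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def neuron_output(inputs, weights, theta):
    """Eq. 11/12: sigmoid of the weighted input sum minus the threshold θ."""
    net = sum(x * w for x, w in zip(inputs, weights)) - theta
    return sigmoid(net)

# Hidden layer (Eq. 11): two neurons fed by the factor-value inputs.
x = [0.5, 1.0]                        # illustrative p_i inputs
hidden = [
    neuron_output(x, [0.4, -0.2], 0.1),
    neuron_output(x, [0.3, 0.6], -0.2),
]
# Output layer (Eq. 12): one neuron fed by the hidden-layer outputs.
y = neuron_output(hidden, [1.0, -1.0], 0.0)
print(round(y, 4))
```

Because every layer applies the same weighted-sum-minus-threshold pattern, one helper suffices for both equations.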


The stakeholder aspect values are selected by the experts as per the defined ranking criteria of the StakeMeter framework. The aspect values are used to calculate the stakeholder factor values, which in turn are used to find the β factors value and the γ factors value that serve as input to the BPNN. The β and γ values depict the positive and negative impacts of a stakeholder for a given VBS project. In StakeMeter the Sv is calculated by subtracting the γ value from the β value. However, such an approach makes the decision process complex and induces fuzziness. In the proposed intelligent system, instead of computing the Sv, only the values of β and γ are predicted using the BPNN and later used as input for fuzzy c-means in order to obtain a cluster of highly prioritized stakeholders. Table 1 describes a partial data sample for β and γ values computation.

Table 1. Partial data sample

FSC  FSS  FSIT  FSP  FSH  FSL  FSE  |  FSR  FSI  |  β Value  γ Value
0.2  0.6  1.4   0.8  3    3    1.6  |  0.2  0.2  |  10.6     0.4
2.2  1.4  2.4   5.0  4    4    2.8  |  0.4  3.0  |  21.8     3.4
0.8  2.4  1.6   3.2  3    3    1.4  |  1.4  1.0  |  15.4     2.4
1.2  2.8  0.8   3.4  3    3    2.6  |  1.8  0.2  |  16.8     2.0
2.6  2.0  1.2   1.8  3    3    1.4  |  4.4  3.2  |  15.0     7.6
2.4  3.4  2.0   5.2  2    2    1.8  |  1.0  0.8  |  18.8     1.8
2.4  3.2  1.8   4.0  3    3    1.8  |  1.2  0.8  |  19.2     2.0
1.2  2.2  0.6   1.6  2    2    1.4  |  4.8  2.4  |  11.0     7.2

5.3 Neural network training

The two most common learning approaches are supervised and unsupervised learning. In BPNN the supervised learning approach is adopted for training purposes. Different training methods exist to train the BPNN, such as traingdx, trainlm, trainoss, trainbr and traincgf. However, trainlm is the default, efficient and commonly used training function, and it is the function used in this research to train the BPNN for the proposed expert system. The data is normalized in the range of -1 to 1 by applying mean and standard deviation functions. The normalization helps in pre-processing the training data and in distributing it evenly. A partial sample covering all possible classes of the target data is purposely chosen and is shown in Table 2. The training of the BPNN is carried out using the default training function trainlm. The β factor values and γ factor values are given as the input, while the β and γ values are given as the target output. The NN is trained with different architectures, and the architecture 9-16-2 (number of inputs – number of hidden nodes – number of outputs) is selected as an optimized solution for the said problem. Such an optimization process helps in resolving the stakeholder quantification problem.

Table 2. Partial sample of results of NN training using trainlm

FSC  FSS  FSIT  FSP  FSH  FSL  FSE  |  FSR  FSI  |  β Value  γ Value  |  NN β Value  NN γ Value
0.2  0.2  0.2   0.2  2    2    0.2  |  0.2  0.2  |  5.0      0.4      |  5.0073      0.4029
0.2  0.4  0.2   0.2  2    2    0.2  |  0.2  0.4  |  5.2      0.6      |  5.1972      0.5995
0.2  0.6  0.2   0.2  2    2    0.2  |  0.2  0.6  |  5.4      0.8      |  5.3990      0.7999
0.2  0.8  0.2   0.2  2    2    0.2  |  0.2  0.8  |  5.6      1.0      |  5.6042      0.9994
0.2  1.0  0.2   0.2  2    2    0.2  |  0.2  1.0  |  5.8      1.2      |  5.8067      1.1977
0.2  1.2  0.2   0.2  2    2    0.2  |  0.2  1.2  |  6.0      1.4      |  6.0053      1.3987
0.2  1.4  0.2   0.2  2    2    0.2  |  0.2  1.4  |  6.2      1.6      |  6.2020      1.5993
0.2  1.6  0.2   0.2  2    2    0.2  |  0.2  1.6  |  6.4      1.8      |  6.3986      1.7976
0.2  1.8  0.2   0.2  2    2    0.2  |  0.2  1.8  |  6.6      2.0      |  6.5964      1.9956
…
3.2  5.2  3.2   7.2  4    4    2.6  |  5.6  3.2  |  29.4     8.8      |  29.4001     8.8112
3.2  5.2  3.2   7.2  4    4    2.8  |  5.8  3.2  |  29.6     9.0      |  29.5949     8.9781
3.2  5.2  3.2   7.2  4    4    3.0  |  6.0  3.2  |  29.8     9.2      |  29.7946     9.2224
3.2  5.2  3.2   7.2  4    4    3.2  |  6.2  3.2  |  30.0     9.4      |  29.9978     9.3863
3.2  5.2  3.2   7.2  4    4    3.4  |  6.4  3.2  |  30.2     9.6      |  30.2036     9.5940
3.2  5.2  3.2   7.2  4    4    3.6  |  6.6  3.2  |  30.4     9.8      |  30.4071     9.8226
3.2  5.2  3.2   7.2  4    4    3.8  |  6.8  3.2  |  30.6     10.0     |  30.6039     9.9736
3.2  5.2  3.2   7.2  4    4    4.0  |  7.0  3.2  |  30.8     10.2     |  30.7939     10.2145
3.2  5.2  3.2   7.2  4    4    4.2  |  7.2  3.2  |  31.0     10.4     |  30.9923     10.3945

Table 3. Error difference

β Value  NN β Value  Error (β)  |  γ Value  NN γ Value  Error (γ)
5.0      5.0073      -0.0073    |  0.4      0.4029      -0.0029
5.2      5.1972      0.0028     |  0.6      0.5995      0.0005
5.4      5.3990      0.0010     |  0.8      0.7999      0.0001
5.6      5.6042      -0.0042    |  1.0      0.9994      0.0006
5.8      5.8067      -0.0067    |  1.2      1.1977      0.0023
6.0      6.0053      -0.0053    |  1.4      1.3987      0.0013
6.2      6.2020      -0.0020    |  1.6      1.5993      0.0007
6.4      6.3986      0.0014     |  1.8      1.7976      0.0024
6.6      6.5964      0.0036     |  2.0      1.9956      0.0044
…
29.4     29.4001     -0.0001    |  8.8      8.8012      -0.0012
29.6     29.5949     0.0051     |  9.0      8.9881      0.0019
29.8     29.7946     0.0054     |  9.2      9.2024      -0.0024
30.0     29.9978     0.0022     |  9.4      9.3963      0.0037
30.2     30.2036     -0.0036    |  9.6      9.5940      0.0060
30.4     30.4071     -0.0071    |  9.8      9.8026      -0.0026
30.6     30.6039     -0.0039    |  10.0     9.9736      0.0264
30.8     30.7939     0.0061     |  10.2     10.2045     -0.0045
31.0     30.9923     0.0077     |  10.4     10.3945     0.0055

The output predicted by the trainlm function, along with the error differences between the predicted and actual data values of β and γ, is shown in Table 3. The error difference is acceptable, and the trainlm function provides an optimized solution for the problem. The results are taken after applying different architectures, and the most optimized results are found with the NN architecture 9-16-2. The application of too many hidden nodes results in an over-fitting problem, making it difficult to generalize the results [50]. In this research we have chosen 4 different projects: online car showroom (OCSR), hospital management system (HMS), restaurant management system (RMS) and university web portal (UWP). There are 23 stakeholders of OCSR, 61 stakeholders of HMS, 121 stakeholders of RMS and 273 stakeholders of UWP. The stakeholder domain in the case of UWP is comprised of a set of students and faculty members at the university level. A total of 438 stakeholders are evaluated based on the guidelines of StakeMeter. We formed 5 teams in order to evaluate the stakeholders, each comprised of 3 members. The values of all factors are computed, and finally the β and γ values are predicted using the NN, with the stakeholder factors serving as its input. Table 4 shows the results of the validation data using the selected NN architecture.
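trainlm (Levenberg-Marquardt) is a MATLAB Neural Network Toolbox function and is not reproduced here. As a rough, illustrative stand-in for the four training steps (initialization, activation, weight adjustment, iteration), the sketch below trains a small 9-16-2 network with plain gradient-descent back-propagation on synthetic factor rows; the data, learning rate and epoch count are assumptions, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic rows: 9 factor values per stakeholder (7 β factors, 2 γ factors).
# Targets are the β and γ sums, scaled into (0, 1) for the sigmoid output layer.
X = rng.uniform(0.2, 5.2, size=(200, 9))
T = np.c_[X[:, :7].sum(axis=1), X[:, 7:].sum(axis=1)]
T = T / T.max(axis=0)
Xn = (X - X.mean(axis=0)) / X.std(axis=0)   # mean/std normalization, as in the paper

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Step 1: initialization -- random weights for a 9-16-2 architecture.
W1 = rng.normal(0, 0.5, size=(9, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, size=(16, 2)); b2 = np.zeros(2)

def forward(A):
    H = sigmoid(A @ W1 + b1)                # Step 2: activation (Eq. 11)
    Y = sigmoid(H @ W2 + b2)                # output layer (Eq. 12)
    return H, Y

_, Y0 = forward(Xn)
mse_before = np.mean((T - Y0) ** 2)

lr = 0.5
for epoch in range(2000):                   # Step 4: iteration
    H, Y = forward(Xn)
    # Step 3: weight adjustment via back-propagated error gradients.
    dY = (Y - T) * Y * (1 - Y)
    dH = (dY @ W2.T) * H * (1 - H)
    W2 -= lr * H.T @ dY / len(Xn); b2 -= lr * dY.mean(axis=0)
    W1 -= lr * Xn.T @ dH / len(Xn); b1 -= lr * dH.mean(axis=0)

_, Y1 = forward(Xn)
mse_after = np.mean((T - Y1) ** 2)
print(mse_before, mse_after)                # the MSE should drop as training proceeds
```

The iterative reduction of the MSE is the essential behaviour shared with trainlm; Levenberg-Marquardt simply reaches a low MSE in far fewer iterations than plain gradient descent.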

Table 4. Partial validation data results using selected architecture

FSC  FSS  FSIT  FSP  FSH  FSL  FSE  |  FSR  FSI  |  β Value  γ Value  |  NN β Value  NN γ Value
2.4  2.8  1.2   4.0  3    3    2.6  |  0.4  0.4  |  19.0     0.8      |  18.9959     0.8003
2.8  3.0  2.2   3.4  3    3    2.2  |  0.8  1.2  |  19.6     2.0      |  19.6015     2.0008
2.0  2.6  2.6   3.0  2    2    1.6  |  1.2  0.8  |  15.8     2.0      |  15.7972     2.0008
1.8  3.2  1.8   2.4  3    3    1.8  |  1.0  1.4  |  17.0     2.4      |  16.9988     2.4002
2.2  1.6  0.8   2.6  2    2    1.2  |  3.8  0.8  |  12.4     4.6      |  12.3931     4.5943
1.0  2.0  1.6   1.8  2    2    1.6  |  3.6  1.8  |  12.0     5.4      |  11.9919     5.4089
1.4  2.2  2.0   2.4  3    3    2.0  |  2.2  1.2  |  16.0     3.4      |  15.9926     3.4028
2.4  1.8  1.8   2.6  3    3    2.2  |  1.0  2.0  |  16.8     3.0      |  16.8032     3.0018
1.2  1.4  1.0   2.2  2    2    1.4  |  3.6  1.6  |  11.2     5.2      |  11.1909     5.1938
1.4  2.0  2.6   3.4  3    3    2.4  |  1.8  2.2  |  17.8     4.0      |  17.8012     4.0005

5.4 Application of fuzzy c-means

In order to make the SIQP efficient, the values of the stakeholders' metrics predicted by the BPNN are used as attributes in a fuzzy clustering technique. There are several clustering approaches; however, the data in the current research belongs to a fuzzy problem rather than a hard problem. Hence, clustering can be divided into fuzzy clustering and hard clustering. In clustering, the data possess two properties, i.e. homogeneity and heterogeneity: with homogeneity the objects of one class or cluster are similar to each other, while with heterogeneity the objects in one class are dissimilar to the objects of other classes [51, 52]. The data points in fuzzy clusters possess a degree of fuzzification which shows their links with different clusters instead of showing their association with a single cluster.

a. Fuzzy c-means method

In order to get the prioritized clusters of the stakeholders we have applied the fuzzy c-means (FCM) algorithm [53]. For software requirements prioritization FCM is used in the VIRP approach [54], and an improved FCM algorithm is used for pattern recognition [55]. In FCM, the data objects are classified into different key clusters based on their attributes: the position of each object is computed and the object is placed in the relevant cluster. The improved FCM uses a fuzzification parameter m in the range [1, n], which measures the degree of fuzzification of a cluster; normally, the value of m is 2. Initially, the centroids are based on a guess for FCM initiation, where a centroid is the distance mean of a cluster. For the distance matrix the Euclidean distance formula is used, as shown in Equation 13:

$$d(x, y) = \sqrt{\sum_{i=1}^{d} |x_i - y_i|^2} \qquad (13)$$

Each object in a cluster is given a membership rank. The update of the centroids and membership ranks is iterative, and the position of a centroid changes with respect to the dataset. The adjustment of the centroid depends on the objective function: the objective function measures the distance of an object from its respective centroid and weights it by the membership rank. A centroid is updated by Equation 14, and Equation 15 is used to compute the membership rank:

$$C_j = \frac{\sum_i [\mu_j(x_i)]^m \, x_i}{\sum_i [\mu_j(x_i)]^m} \qquad (14)$$

$$U_{ij} = \frac{1}{\sum_{k=1}^{c} \left( \dfrac{\|x_i - c_j\|}{\|x_i - c_k\|} \right)^{\frac{2}{m-1}}} \qquad (15)$$
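As a small worked sketch of the membership rank in Equation 15, with m = 2; the point and centroids are illustrative values, not taken from the experimental dataset:

```python
# Membership ranks per Eq. 15 for one point against two centroids (m = 2).
# Assumes x does not coincide exactly with a centroid (distances non-zero).
def memberships(x, centroids, m=2.0):
    def dist(a, b):  # Euclidean distance, Eq. 13
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5
    d = [dist(x, c) for c in centroids]
    exp = 2.0 / (m - 1.0)
    return [1.0 / sum((d[j] / d[k]) ** exp for k in range(len(centroids)))
            for j in range(len(centroids))]

# A high-β, low-γ point against a near and a far centroid:
u = memberships((19.0, 0.8), [(18.0, 1.5), (12.0, 5.0)])
print(u)  # higher membership in the nearer centroid's cluster; the ranks sum to 1
```

Unlike hard clustering, the point is not assigned to a single cluster: both ranks are non-zero, which is the degree of fuzzification described above.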

The objective function in FCM is represented by Equation 16:

$$J = \sum_{i=1}^{N} \sum_{j=1}^{C} U_{ij}^m \, \|x_j - V_i\|^2 \qquad (16)$$
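Equations 13 to 16 combine into the following FCM sketch; this is a plain NumPy illustration on synthetic 2-D β/γ-like points, not the implementation used in the paper:

```python
import numpy as np

def fcm(X, k=2, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """Fuzzy c-means: returns membership matrix U (N x k) and centroids C (k x d)."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), k))
    U /= U.sum(axis=1, keepdims=True)                 # memberships sum to 1 per object
    for _ in range(n_iter):
        Um = U ** m
        C = (Um.T @ X) / Um.sum(axis=0)[:, None]      # centroid update (Eq. 14)
        d = np.linalg.norm(X[:, None, :] - C[None, :, :], axis=2)  # distances (Eq. 13)
        d = np.maximum(d, 1e-12)
        U_new = 1.0 / ((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1))).sum(axis=2)  # Eq. 15
        if np.linalg.norm(U_new - U) < tol:           # convergence criterion ε
            U = U_new
            break
        U = U_new
    return U, C

# Two synthetic groups resembling high-β/low-γ and low-β/high-γ stakeholders.
rng = np.random.default_rng(1)
a = rng.normal([19.0, 1.5], 0.5, size=(20, 2))
b = rng.normal([12.0, 5.0], 0.5, size=(20, 2))
U, C = fcm(np.vstack([a, b]), k=2, m=2.0)
labels = U.argmax(axis=1)
print(labels)
```

With well-separated groups like these, the hardened labels (argmax of the membership ranks) recover the two clusters; the continuous memberships in U are what the prioritization step actually uses.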

The fuzzification parameter m, also called the exponent, plays a vital role in the experimental results. The values of β and γ predicted by the NN are passed as input to the fuzzy c-means. The data is highly fuzzy due to unclear scenarios. It is described in SFM that a stakeholder may have a higher β value with a lower γ value and vice versa. However, it is also observed that some stakeholders have both higher β and higher γ values, and in such cases decision making is very difficult. Based on the Sv, β and γ values alone it is very difficult to quantify the stakeholders due to the fuzziness induced in the data values. Hence, in this approach fuzzy c-means is used to quantify the stakeholders based on the β and γ values. Table 5 shows a partial dataset comprised of the β and γ values for the said objects or stakeholders. After reducing the number of inputs to 2, the problem has become a 2-D problem.

Table 5. Partial dataset sample for fuzzy c-means

Objects           β        γ
…
Stakeholder 104   18.9959  0.8003
Stakeholder 105   19.6015  2.0008
Stakeholder 106   15.7972  2.0008
Stakeholder 107   16.9988  2.4002
Stakeholder 108   12.3931  4.5943
Stakeholder 109   11.9919  5.4089
Stakeholder 110   15.9926  3.4028
…

The values of the different parameters used in fuzzy c-means are described in Table 6. N represents the stakeholder sample, d is the dimensionality, m represents the degree of fuzzification, n represents the iterations, k is the total number of clusters, and the convergence criterion or performance is represented by min_improvement. Various combinations of the parameters m and min_improvement are applied in order to evaluate the performance. Different values of the fuzzification parameter m are applied in descending order in order to observe their effect on the overall clustering results.

Table 6. Fuzzy c-means parameters

N    d  m    n    k  min_improvement
438  2  2.5  100  2  10e-5
438  2  2.5  100  2  10e-05
438  2  2.5  100  2  10e-005
438  2  2.0  100  2  10e-5
438  2  2.0  100  2  10e-05
438  2  2.0  100  2  10e-005
438  2  1.5  100  2  10e-5
438  2  1.5  100  2  10e-05
438  2  1.5  100  2  10e-005

By applying different FCM parameters it is observed that the difference in the clustering results is not prominent. Hence, the parameter values chosen in this research are m = 2.0, k = 2 and min_improvement = 10e-5. The iteration count is affected by the application of different parameters. Thus, for stakeholder quantification we have chosen the default parameters of FCM, which produce the most optimized results and are shown in bold in Table 6.

6. EXPERIMENTS AND RESULTS

Figure 4 shows the quantification of the stakeholders of the 3 projects into 2 clusters. The cluster centroids are represented by * and o. The trends of the clusters delineate a high variation in the β and γ values of the stakeholders. It is obvious that most of the stakeholders possess higher β values, which shows that the entry of these stakeholders into the requirements elicitation phase of VBS development is highly vital; on the other hand, they also possess higher γ values, which makes them risky for the VBS project. However, from the clusters it is easy to define an inclusion and exclusion criterion for the stakeholders, and it becomes easier to separate the critical set of stakeholders from the non-critical stakeholders. Based on the values of β and γ we can decide that a few stakeholders in the 'o' cluster may not be vital at all due to their higher γ values. Hence, they may be excluded easily, or we can place them in the second priority in view of their β values. Figure 5 shows the total number of iterations needed to achieve the objective function.

Figure 4. Data with 2 clusters

Figure 5. Objective function for 2 clusters

In order to refine the stakeholder selection criterion there is a need to divide the stakeholders into more than 2 clusters, since from a bunch of 2 clusters it is difficult to place the stakeholders in sub-priorities. Another issue associated with 2 clusters is the selection of a stakeholder based on some expert bias and induced fuzzification. Hence, more clusters may help in eliminating expert bias and fuzzification when defining the inclusion and exclusion criterion. In Figure 6 the data is divided into 3 clusters; the only change in the FCM parameters is k = 3, and the rest of the parameters are the same. The cluster centroids in Figure 7 are represented by o, + and *. The cluster 'o' possesses the highest priority among all 3 clusters. The '+' cluster also possesses key stakeholders, hence the selection analysis is described in Figure 8. However, the stakeholders in the cluster '*' are discarded straightforwardly due to the imposed higher risk, or greater γ values, even though they have higher β values.


Figure 6. Data with 3 clusters

Figure 7. Data analysis with 3 clusters

In Figure 7, the stakeholders in cluster 'o' are the key stakeholders that must be part of the RE process. However, there are some stakeholders which have higher γ values, i.e. possess higher risk. Initially, if we define the base value of γ as 2.0 and of β as 15.0, then we get a more refined set of stakeholders. This set of stakeholders can be considered in the initial phase of requirements elicitation for VBS development. If we move higher on the y-axis and select γ as 3 with β the same, then a few stakeholders from cluster 'o' will become part of the critical stakeholders and more will be added from cluster '*'. The industry professional may also choose intermediate values between 2 and 3. All these stakeholders are considered vital, while the stakeholders in the cluster '+' are not considered vital entities for the requirements elicitation process. The stakeholders in green are the critical stakeholders for VBS development, whereas the stakeholders in the blue clusters are kept in the second priority.
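One reading of the inclusion criterion just described (base values β = 15.0 and γ = 2.0, with γ optionally relaxed towards 3) can be sketched as a simple filter; the stakeholder rows below are taken from the validation-data sample (Table 4), and the helper name is illustrative:

```python
# Refined stakeholder selection using the base thresholds discussed above:
# include a stakeholder when its β value is high enough and its γ (risk) low enough.
def critical_stakeholders(stakeholders, beta_min=15.0, gamma_max=2.0):
    return [name for name, beta, gamma in stakeholders
            if beta >= beta_min and gamma <= gamma_max]

data = [
    ("Stakeholder 104", 18.9959, 0.8003),
    ("Stakeholder 105", 19.6015, 2.0008),
    ("Stakeholder 108", 12.3931, 4.5943),
    ("Stakeholder 110", 15.9926, 3.4028),
]
print(critical_stakeholders(data))                  # base criterion: β >= 15.0, γ <= 2.0
print(critical_stakeholders(data, gamma_max=3.0))   # relaxed γ admits more stakeholders
```

Raising `gamma_max` from 2.0 to 3.0 mirrors moving higher on the y-axis: Stakeholder 105 (γ = 2.0008) enters the critical set while the riskier stakeholders stay excluded.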

7. COMPARATIVE ANALYSIS AND DISCUSSION

Critical requirements play a vital role in the VBS development. Industry professionals apply different techniques in order to find out a key set of requirements from stakeholders. Stakeholders are the key entities in requirements elicitation step of RE. Different researchers have presented different approaches in order to identify and quantify the stakeholders based on different methods and parameters under different scenarios. The application of diverse methods makes the stakeholder identification process quite difficult and non-uniform. Based on the proposed metrics of StakeMeter an intelligent system is proposed in this research. Using said approach or system it becomes easy to assess the worth of a stakeholder for a VBS under development. The said approach reduces the effort of the analysts in order to predict the worth of a stakeholder based on the StakeMeter framework. A large number of attributes are considered in the StakeMeter for evaluation of the stakeholders which makes the SIQ process highly complex. The application of AI approaches like NN and FCM reduces the complexity and effort in terms of time. The application of NN helps in predicting the approximations of β and γ values though near to original values with less error. The application of NN also reduces the extent of bias as induced by the experts during manual evaluation of stakeholder β and γ values. From the StakeMeter inclusion and exclusion criteria it is difficult to define a base line for the Sv or stakeholder value. Hence, in this approach the Sv equation is not considered in order to overcome the issue of inclusion and exclusion. We have considered only the β and γ values in order to predict the priority of a stakeholder using FCM. The FCM approach is helpful in making clusters of the stakeholders based on β and γ values. Using these values an expert can easily define the priority of a given stakeholder for the VBS development. However, there are some limitations of this approach. 
Firstly, the large number of attributes reduces the overall efficiency of the NN during prediction of β and γ values of a stakeholder. Secondly, this approach is not applied in terms of global SE scenarios where the stakeholders are distributed across the globe. 8. CONCLUSION RE is a key process in software development life cycle. During RE stakeholders play a vital role in elicitation of the requirements. The existing SIQ approaches focus the issue in diverse dimensions and do not provide a systematic way to quantify the stakeholders. The SIQ approaches are not uniform and it is difficult to adopt them as a standard or framework. These approaches are also not efficient in terms of time. VBS systems are associated with financial streams and an innovative idea is launched for market leverage. The quick implementation of the idea is highly desirable for VBS development. Hence, the suitability of current approaches is questionable for VBS development. In this research, a hybrid system is proposed in order to quantify the large number of stakeholders for VBS development. The stakeholder metrics are used to quantify the stakeholders and the stakeholder metrics values are used as an input to BPNN. BPNN predicts the value of a stakeholder and the predicted stakeholder values are given as an input to FCM. Later FCM classify the stakeholders into different clusters. The clustering approach helps in selection of highly critical stakeholders for VBS development. ACKNOWLEDGEMENT The authors would like to thank profoundly to the University Teknologi Malaysia (UTM) and Army Public College of Management & Sciences in order to complete this research. REFERENCES 1. 2.

3. 4. 5. 6.

Boehm, B., Value-based software engineering: reinventing. ACM SIGSOFT Software Engineering Notes, 2003. 28(2): p. 3. Babar, M.I., M. Ghazali, and N.A. Dayang, Jawawi, Software quality enhancement for value based systems through stakeholders quantification. Journal of Theoretical & Applied Information Technology, 2013. 55(3). Boehm, B. and L.G. Huang, Value-based software engineering: A case study. Computer, 2003. 36(3): p. 33-41. Brooks, F.P., No silver bullet: Essence and accidents of software engineering. IEEE computer, 1987. 20(4): p. 10-19. Zave, P., Classification of research efforts in requirements engineering. ACM Computing Surveys (CSUR), 1997. 29(4): p. 315-321. Marschall, F. and M. Schoemnakers. Towards model-based Requirements Engineering for web-enabled B2B Applications. in 10th IEEE International Conference and Workshop on the Engineering of Computer-Based Systems. 2003: IEEE.

96

JOURNAL OF SOFTWARE ENGINEERING & INTELLIGENT SYSTEMS ISSN 2518-8739 31st December 2016, Volume 1, Issue 2, JSEIS, CAOMEI Copyright © 2016 www.jseis.org 7. 8.

9. 10. 11. 12. 13.

14. 15. 16. 17. 18. 19.

20. 21. 22. 23. 24. 25. 26. 27. 28. 29. 30. 31. 32.

33. 34.

Chung, L. and J.C.S. do Prado Leite, On non-functional requirements in software engineering, in Conceptual modeling: Foundations and applications. 2009, Springer. p. 363-379. Grube, P.P. and K. Schmid. Selecting creativity techniques for innovative requirements engineering. in Third International Workshop on Multimedia and Enjoyable Requirements Engineering-Beyond Mere Descriptions and with More Fun and Games, 2008, MERE'08. . 2008: IEEE. Davis, C.J., et al., Communication challenges in requirements elicitation and the use of the repertory grid technique. Journal of Computer Information Systems, 2006. 46(5): p. 78. Van Lamsweerde, A. Requirements engineering in the year 00: A research perspective. in 22nd international conference on Software engineering. 2000: ACM. Nuseibeh, B. and S. Easterbrook. Requirements engineering: a roadmap. in Proceedings of the Conference on the Future of Software Engineering. 2000: ACM. Davis, A., et al. Effectiveness of requirements elicitation techniques: Empirical results derived from a systematic review. 2006: Ieee. Babar, M.I., M. Ramzan, and S. Ghayyur. Challenges and future trends in software requirements prioritization. in International Conference on Computer Networks and Information Technology (ICCNIT). 2011. Abbottabad, Pakistan: IEEE. Gilb, T., Competitive engineering: a handbook for systems engineering, requirements engineering, and software engineering using Planguage. 2005: Butterworth-Heinemann. McManus, J. A stakeholder perspective within software engineering projects. in Engineering Management Conference. 2004: IEEE. Schulmeyer, G. and J. Mcmanus, The Handbook of Software Quality Assurance. 1999, Van Nostrand Reinhold Co., New York. Barrett, J.D., Quality From Customer Needs to Customer Satisfaction. Technometrics, 2004. 46(1): p. 118-118. Babar, M.I., et al., Stakeholder management in value-based software development: systematic review. IET Software, 2014. Mitchell, R.K., B.R. Agle, and D.J. 
JOURNAL OF SOFTWARE ENGINEERING & INTELLIGENT SYSTEMS ISSN 2518-8739 31st December 2016, Volume 1, Issue 2, JSEIS, CAOMEI Copyright © 2016 www.jseis.org

AUTHOR PROFILE
Dr. Muhammad Imran Babar received his MCS from Al-Khair University, AJK, Pakistan, his MS degree in Software Engineering from International Islamic University, Islamabad, Pakistan, and his Ph.D. in Computer Science from Universiti Teknologi Malaysia (UTM) in 2015. Currently, he is serving as an Assistant Professor at Army Public College of Management & Sciences (APCOMS), Rawalpindi, Pakistan, an affiliate institute of the University of Engineering and Technology, Taxila, Pakistan, where he heads the Department of Computer Science. His research interests are in software engineering, decision support systems, artificial intelligence, neural networks, predictive analytics, fuzzy logic and big data.

98

Dr. Masitah Ghazali received her BE degree in Software Engineering from the University of Manchester Institute of Science and Technology (UMIST), UK, her Master's degree in Distributed Interactive Systems from Lancaster University, UK, and her Ph.D. in Computer Science from Lancaster University, UK. She is a Senior Lecturer at the Department of Software Engineering, Faculty of Computing, Universiti Teknologi Malaysia (UTM). Her research interests include human-computer interaction, software engineering, usability, human factors, interaction design, physicality, and user interfaces.
Dr. Dayang N.A. Jawawi received her B.Sc. degree in Software Engineering from Sheffield Hallam University, UK, and her M.Sc. and Ph.D. by research in Software Engineering from Universiti Teknologi Malaysia. She is an Associate Professor at the Department of Software Engineering, Faculty of Computing, Universiti Teknologi Malaysia (UTM). Her research areas are software reuse, component-based software engineering and embedded real-time software. In her Ph.D. work she proposed a practical approach for component-based software engineering of embedded software in resource-constrained embedded systems such as autonomous mobile robots and intelligent instrument systems.
Falak Sher received his MCS from Al-Khair University, AJK, Pakistan, and his MS degree in Software Engineering from International Islamic University, Islamabad, Pakistan. Currently, he is pursuing his Ph.D. in Computer Science at Universiti Teknologi Malaysia. His research interests are in software engineering, decision support systems, artificial intelligence, software requirements engineering, and software quality assurance. He has taught computing at Beaconhouse School System, Islamabad, Pakistan, for more than 15 years.



A proposed model for the usability of web based educational resources management systems

1Hussein Ali Ahmed Ghanim, 2Nisreen Basher Osman

1Department of Information Technology, Faculty of Computer Science, University of Kassala
2Department of Computer Science, Bayan College for Science and Technology, P. O. Box 210, Khartoum, Sudan
Email: [email protected], [email protected]

ABSTRACT
The main purpose of a web based Educational Resources Management System (ERMS) is to deliver knowledge, share information and help learners in their learning activities in an effective and efficient way by involving advanced electronic technologies. However, the usability of these systems, that is, the degree to which they enable their users to use them effectively, efficiently and with satisfaction in a specified context of use, is one of the challenges facing the design of these systems. This study proposes a model for evaluating the usability of ERMS. The model introduces effectiveness, efficiency, satisfaction, learnability, interactivity, consistency, motivation and learner's control as the attributes that determine the usability of such systems. The model was tested and verified using questionnaires and experiments. The results showed that effectiveness, efficiency, satisfaction, learnability, motivation, interactivity, consistency and learner's control affect the usability of ERMS.
Keywords: web based applications; educational resources management; usability; usability evaluation; e-learning
1. INTRODUCTION
Usability of ERMSs is of great significance because their success depends upon basic usability principles. This paper proposes a usability model for the design of an interactive electronic system for managing various educational resources, bringing traditional education closer to electronic education. The focus is on lectures, books, announcements, contact with students, and courses. The proposed usability model for ERMS is then used to evaluate such a system; the paper also reviews the terminology used and earlier studies on the usability and usability evaluation of educational management systems.
The aim of this paper is to provide a usability model and to define practical, fast and low-cost tools for analyzing the usability of educational resources management systems (ERMSs) according to this model. Kashif [8] pointed out that the basic purpose of e-learning applications is to deliver knowledge, share information and help learners in their learning activities in an effective and efficient way by involving advanced electronic technologies. Usability of e-learning applications is of great significance because their success depends upon basic usability principles. The criteria for judging success can be defined by the user's satisfaction level after interacting with the interface of the e-learning system [8]. Appropriate use of usability evaluation methods according to given scenarios is an important aspect [8]. Both end-users and usability experts participated in that study, in which different methods were used for the usability evaluation of a specific e-learning platform, It's Learning [8].
2. RELATED WORK
An educational resources management system is a modern and powerful Management Information System (MIS), designed specifically to meet the challenges of the education sector: the MIS an institution needs to support the dynamic environment it works in and to add value to its business. It manages the complete learner life-cycle, from initial enquiry through to completion [47]. Educational resources management should be flexible and scalable in order to support any kind of learning and be adaptable to changes. Efficient and effective deployment of resources requires that educational resources management concepts and principles be used in all phases of resources management and the learning response [47].
A web-based application refers to any program that is accessed over a network connection using HTTP, rather than existing within a device's memory. A web application is an application utilizing web and web browser technologies to accomplish one or more tasks over a network, typically through a web browser [46]. E-learning is the provision of training and educational programs through a variety of electronic media, including disks and the Internet, in a synchronous or asynchronous manner, adopting the principle of self-learning or teacher-assisted learning [36]. E-learning can also be defined broadly as any use of web and Internet technologies to create learning experiences [31]. One of the most comprehensive definitions of e-learning is: "E-learning is the use of Web and Internet technologies to create experiences that educate our fellow human beings" [31]. To expand on this definition: e-learning is facilitated and supported through the use of information and communications technology, and it can cover a spectrum of activities from supported learning, to blended learning (the combination of traditional and e-learning practices), to learning that is entirely online. Whatever the technology, however, learning is the vital element. Electronic learning (or e-learning) is a kind of technology supported education/learning (TSL) where the medium of instruction is computer technology, particularly involving digital technologies [8]. The objectives of e-learning are to facilitate and assist people by delivering appropriate contents and services to fulfill user needs [8]. E-learning system: The basic purpose of e-learning applications is to deliver knowledge, share information and help learners in their learning activities in an effective and efficient way by involving advanced electronic technologies [35]. An e-learning system is special in its capability for co-operative and collaborative learning activities through asynchronous and synchronous communications to enhance learning effectiveness. It is also about meeting the instructor and peer learners in the virtual community, solving problems together, and expecting feedback and interaction [35]. Web based educational systems (WBESs): WBESs offer interesting delivery mechanisms to teachers and learners [10]. Governance and accountability are key criteria to consider during the deployment of WBESs [10]. Assessment of WBESs also needs to be done to determine their effectiveness [10].
E-learning management electronic system (EMES): An e-learning management system is an integrated computer system that manages and serves the educational process, aiming to facilitate the interaction between students and faculty members [35]. Its features are [35]:
 Course development.
 Ease of use.
 Arabic and other language support.
 Ability to assess students.
 Communication between students and faculty.
 Quality of scientific content design, using the latest technology for educational means.
 Development of self-learning among students.
 Ease of management and support of the educational process.

Usability: Usability is a term which refers to the interaction of users with any software system [15]. ISO 9241-11 [40] defines it as "the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use". ISO/IEC 9126 [42] defines usability as "a set of attributes that bear on the effort needed for use, and on the individual assessment of such use, by a stated or implied set of users", and ISO/IEC 9126-1 [41] as "the capability of the software product to be understood, learned, used and attractive to the user, when used under specified conditions". Nielsen [15] introduced learnability, efficiency, memorability, errors and satisfaction as the attributes that determine usability. According to Gilbert Cockton [37], usability evaluation assesses the extent to which an interactive system is easy and pleasant to use. Usability evaluation: Usability evaluation is concerned with gathering information about the usability or potential usability of a system in order either to improve its interface or to assess it [21]. The aim is to determine the effectiveness or potential effectiveness of an interface in use, or to provide a means of suggesting improvements to it [21]. Dix et al. [3] suggested the main goals of evaluation:
 To assess the extent of the system's functionality;
 To assess the effect of the interface on the user; and
 To identify specific problems with the system.
Usability of educational resources management systems: Usability plays an imperative role in the success of e-learning applications. If an e-learning system is not usable, the learner is forced to spend much more time trying to understand software functionality, rather than understanding the learning content [30].
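The ISO 9241-11 attributes above lend themselves to simple quantitative measures. The sketch below is illustrative only (the data, numbers and function names are hypothetical, not taken from this study): effectiveness as the task-completion rate, and efficiency as completed tasks per unit of time spent.

```python
# Minimal sketch of ISO 9241-11 style measures (hypothetical data).
# Effectiveness: share of attempted tasks completed with accuracy
# and completeness. Efficiency: completed tasks relative to the
# resources (here, time) expended.

def effectiveness(completed_tasks, attempted_tasks):
    """Fraction of attempted tasks the user completed."""
    return completed_tasks / attempted_tasks

def efficiency(completed_tasks, total_minutes):
    """Completed tasks per minute of interaction time."""
    return completed_tasks / total_minutes

# One user attempted 5 tasks, completed 4, in 8 minutes.
print(effectiveness(4, 5))  # 0.8
print(efficiency(4, 8))     # 0.5
```

Satisfaction, by contrast, is subjective and is typically captured through questionnaires, as done later in this study.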
Moreover, if the system interface is rigid, slow and unpleasant, people feel frustrated and are likely to walk away and forget about using it [30]. Andrina et al. [1] pointed out that the usability of pedagogical systems is a key feature in the pedagogy domain; the lack of an appropriately usable, user-centered interface design in computerized educational systems decreases the interface's effectiveness and efficiency [1]. Fitzpatrick and Ssemugabi [20][25] pointed out that in order to evaluate the usability of a system and to determine usability problems, it is important to select appropriate usability evaluation methods; Gray et al. [6] suggest considering efficiency, time, cost-effectiveness, ease of application, and the expertise of the evaluators when choosing among methods. According to Melis et al. [13], designing a more usable e-learning system basically involves two aspects:
 Technical usability, which involves methods for ensuring trouble-free interaction with the system [13].
 Pedagogical usability, which aims at supporting the learning process.
Both aspects of usability are intertwined and tap the user's cognitive resources [13].
3. METHODOLOGY
The proposed model: In order to assess the usability of an educational resources management system, this study proposes a model based on ISO 9241-11 [40] and other usability models. Its attributes are effectiveness, efficiency, satisfaction, learnability, interactivity, consistency, motivation and learner's control, as shown in Figure 1.

Figure 1. Proposed model for the usability of web based educational resources management systems (attributes: efficiency, added value, effectiveness, satisfaction, learnability, interactivity, feedback, consistency, motivation, learner's control).

Efficiency: The system should be efficient to use, so that once the user has learned the system, a high level of productivity is possible [13]. Added value: The added value usually takes the form of creative use of the possibilities that the computer offers, for example voice, image and video files: learners can choose the media that best fit their preferences [19]. Effectiveness: The capability of the software product to enable users to achieve specified tasks with accuracy and completeness; the degree to which specified users can achieve specified goals with accuracy and completeness in a specified context of use [43]. Satisfaction: The system should be pleasant to use, so that users are subjectively satisfied when using it; they like it [13]. Learnability: The system should be easy to learn, so that the user can rapidly start getting some work done with it [13].


Interactivity: Interactivity is supported through easy and user-friendly access to the subject information and task-based activities [27]. Feedback: The system should continuously inform the user about what it is doing and how it is interpreting the user's input [13]. Consistency: Consistency is one of the most basic usability principles. If users know that the same command or the same action will always have the same effect, they will feel more confident in using the system, and they will be encouraged to try out exploratory learning strategies because they will already have part of the knowledge needed to operate new parts of the system [13]. Motivation: The material provided by a web based application should contain intrinsically motivating tasks and examples [19]. Learner's control: Describes the students' ability to control the order in which they perform activities [19].
4. MODEL VALIDATION
4.1 Design of experiments
The main purpose of the experiments is to measure the attributes of the model. The experiment consists of five major tasks. Each task has a different set of actions to be performed by the users, and each action covers a different range of fields spanning the main features of the ERMS. Table 1 shows the tasks selected for the experiments.

Table 1. Tasks of the experiments
  Task 1: Registration in the system
  Task 2: Access the system to see existing resources
  Task 3: Contact services with students
  Task 4: Download a lecture from the system
  Task 5: Search the system

4.2 Measurement of variables
The following variables are used to measure the attributes of the model:
 The number of steps
 The time spent on each task
 The number of errors
 The number of help requests
 The number of unfinished tasks

4.3 Sampling and users' profiles

A total of 20 students at different levels were selected. The table below shows the distribution of the users who participated in the experiments.

Table 2. Distribution of respondents by age
  Age            Frequency   Percent
  from 20 to 25  17          68.0
  from 25 to 30  5           20.0
  from 30 to 35  3           12.0
  Total          25          100.0


Table 3. Number of additional steps and average time on tasks
  Task     Additional steps   Average time
  Task 1   1                  1.95
  Task 2   1                  1.75
  Task 3   1                  1.85
  Task 4   1–2                1.5
  Task 5   0                  1.85
  Total    5                  8.9

Table 4. Total errors, help requests and unfinished tasks
  Task     Total errors   Total helps   Not completed
  Task 1   12             15            0
  Task 2   1              7             1
  Task 3   0              7             0
  Task 4   2              4             1
  Task 5   4              7             0
  Total    19             40            2

The results below show the total number of steps and the total time taken to complete each task.

Table 5. Total steps and time on tasks
  Task     Total steps   Total time
  Task 1   88            39
  Task 2   101           35
  Task 3   108           37
  Task 4   67            30
  Task 5   120           37

The results below show the average number of steps and the average time taken to complete each task.

Table 6. Average steps and time on tasks
  Task     Average steps   Average time
  Task 1   4.4             1.95
  Task 2   5.05            1.75
  Task 3   5.4             1.85
  Task 4   3.35            1.5
  Task 5   6               1.85


The results below show, for each task, the average number of errors, the average number of help requests, and the average number of unfinished tasks.

Table 7. Average errors, help requests and unfinished tasks
  Task     Average errors   Average helps   Not completed
  Task 1   0.6              0.75            0
  Task 2   0.05             0.35            0.05
  Task 3   0                0.35            0
  Task 4   0.1              0.2             0.05
  Task 5   0.2              0.35            0
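The per-task averages in Table 7 follow from the totals in Table 4 divided by the 20 participants (Section 4.3), which can be checked directly:

```python
# Recompute the Table 7 averages from the Table 4 totals, assuming
# 20 participants per task as stated in Section 4.3.
participants = 20
table4 = {  # task: (total errors, total helps, tasks not completed)
    "Task 1": (12, 15, 0),
    "Task 2": (1, 7, 1),
    "Task 3": (0, 7, 0),
    "Task 4": (2, 4, 1),
    "Task 5": (4, 7, 0),
}
for task, (errors, helps, incomplete) in table4.items():
    print(task,
          errors / participants,      # average errors
          helps / participants,       # average help requests
          incomplete / participants)  # average unfinished tasks
```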

5. DATA ANALYSIS AND RESULTS
The hypotheses were tested on the basis of the level of significance. If the level of significance is greater than 5% (0.05), the calculated chi-square value is less than the tabular chi-square value and the result is treated as statistically significant; in that case the null hypothesis is rejected and the alternative hypothesis (the research hypothesis) is accepted.
5.1 The relation between learnability and usability
To test whether there is a relation between learnability and usability, a chi-square test was performed. The results reported in Table 8 show a significant difference between learnability and usability.

Table 8. Chi-square test of the relation between learnability and usability
  Hypothesis                                Chi-square   Sig. (p-value)   df
  Learnability affects usability of WBERs   8.376        0.215            4
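The chi-square statistics reported in Tables 8 through 15 come from cross-tabulations of the questionnaire responses. A generic sketch of a chi-square test of independence in plain Python follows; the observed counts are hypothetical illustration data, not the study's responses.

```python
# Chi-square test of independence for an r x c contingency table,
# of the kind used for Tables 8-15. Hypothetical observed counts.

def chi_square(table):
    """Return (statistic, degrees of freedom) for a contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed_count in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed_count - expected) ** 2 / expected
    df = (len(table) - 1) * (len(table[0]) - 1)
    return stat, df

# Two user groups answering on a 3-point scale (hypothetical counts).
observed = [[10, 15, 5],
            [20, 10, 10]]
stat, df = chi_square(observed)
print(round(stat, 3), df)  # df = (2-1)*(3-1) = 2
# Standard chi-square critical value for df = 2 at alpha = 0.05: 5.991.
print("exceeds critical value at 0.05:", stat > 5.991)
```

Comparing the statistic against the tabular critical value for the table's degrees of freedom gives the decision rule applied in each subsection below.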

5.2 The relation between motivation and usability
To test whether there is a relation between motivation and usability, a chi-square test was performed. The results reported in Table 9 show a significant difference between motivation and usability.

Table 9. Chi-square test of the relation between motivation and usability
  Hypothesis                              Chi-square   Sig. (p-value)   df
  Motivation affects usability of WBERs   9.088        0.236            6

5.3 The relation between interactivity and usability
To test whether there is a relation between interactivity and usability, a chi-square test was performed. The results reported in Table 10 show a significant difference between interactivity and usability.

Table 10. Chi-square test of the relation between interactivity and usability
  Hypothesis                                 Chi-square   Sig. (p-value)   df
  Interactivity affects usability of WBERs   8.441        0.350            6


5.4 The relation between consistency and usability
To test whether there is a relation between consistency and usability, a chi-square test was performed. The results reported in Table 11 show a significant difference between consistency and usability.

Table 11. Chi-square test of the relation between consistency and usability
  Hypothesis                               Chi-square   Sig. (p-value)   df
  Consistency affects usability of WBERs   7.909        0.369            6

5.5 The relation between learner's control and usability
To test whether there is a relation between learner's control and usability, a chi-square test was performed. The results reported in Table 12 show a significant difference between learner's control and usability.

Table 12. Chi-square test of the relation between learner's control and usability
  Hypothesis                                     Chi-square   Sig. (p-value)   df
  Learner's control affects usability of WBERs   10.863       0.220            7

5.6 The relation between effectiveness and usability
To test whether there is a relation between effectiveness and usability, a chi-square test was performed. The results reported in Table 13 show a significant difference between effectiveness and usability.

Table 13. Chi-square test of the relation between effectiveness and usability
  Hypothesis                                 Chi-square   Sig. (p-value)   df
  Effectiveness affects usability of WBERs   9.6          0.292            7

5.7 The relation between efficiency and usability
To test whether there is a relation between efficiency and usability, a chi-square test was performed. The results reported in Table 14 show a significant difference between efficiency and usability.

Table 14. Chi-square test of the relation between efficiency and usability
  Hypothesis                              Chi-square   Sig. (p-value)   df
  Efficiency affects usability of WBERs   6.869        0.401            6

5.8 The relation between satisfaction and usability
To test whether there is a relation between satisfaction and usability, a chi-square test was performed. The results reported in Table 15 show a significant difference between satisfaction and usability.

Table 15. Chi-square test of the relation between satisfaction and usability
  Hypothesis                                Chi-square   Sig. (p-value)   df
  Satisfaction affects usability of WBERs   12.794       0.171            7

6. CONCLUSION
The main objective of this study was to propose a model for evaluating the usability of ERMS. The usability evaluation was carried out empirically, involving users who interact regularly with the system. The model consists of effectiveness, efficiency, satisfaction, learnability, consistency, motivation, interactivity and learner's control as the factors that affect usability. The model was tested and verified using questionnaires and experiments. The results showed that effectiveness, efficiency and satisfaction affect the usability of ERMS, as do learnability, motivation, interactivity, consistency and learner's control. The results also showed that age, gender and level of experience affect the usability of ERMS. The experiment showed that the system was efficient, effective and easy to use.
ACKNOWLEDGEMENT
Special thanks to the Department of Information Technology, Faculty of Computer Science, University of Kassala, for supporting this research.
REFERENCES

1. Granić, A., Glavinić, V. and Stankov, S. 2004. Usability Evaluation Methodology for Web-based Educational Systems. Faculty of Natural Sciences, Mathematics and Education, University of Split.
2. Ardito et al. 2005. An approach to usability evaluation of e-learning applications.
3. Dix, A.J., Finlay, J.E., Abowd, G.D. and Beale, R. 2004. Human-Computer Interaction. 3rd ed. Harlow, Essex: Pearson Education Limited.
4. Dubey, S.K., Sharma, A. and Rana, A. 2011. Usability evaluation in object oriented software systems using fuzzy logic approach. International Journal of Computer Science Issues, in press.
5. Gray and Salzman. 1998. Damaged Merchandise? A Review of Experiments That Compare Usability Evaluation Methods.
6. Preece, J., Benyon, D., Davies, G., Keller, L. and Rogers, Y. 1993. A Guide to Usability: Human Factors in Computing. Reading, MA: Addison-Wesley.
7. Lencastre, J.A. and Chaves, J.H. 2007. A usability evaluation of educational websites.
8. Qureshi, K.M. and Irfan, M. 2009. Usability evaluation of e-learning applications: a case study of It's Learning from a student's perspective. Master thesis.
9. Harmeyer, K. 2005. Usability Glossary: usability metrics, COSC324-05.ppt. UB.
10. Kunjal and Deepankar. 2005. Web Based Educational Systems: An Application of Information & Communication Technology.
11. Rentróia-Bonito, M.A., Guerreiro, T., Martins, A., Fernandes, V. and Jorge, J. 2006. Evaluating Learning Support Systems Usability: An Empirical Approach.
12. Lynch, M.M. and Roecker, J. 2007. Project Managing E-Learning: A Handbook for Successful Design, Delivery and Management.
13. Melis, E., Weber, M. and Andrès, E. 2003. Lessons for (Pedagogic) Usability of eLearning Systems.
14. Miller, M.J. 1993. Usability in E-Learning. California-based LearnSpire.
15. Nielsen, J. 1993. Usability Engineering. San Diego: Academic Press.
16. Nielsen, J. 2000. Designing Web Usability: The Practice of Simplicity. New York: New Riders.
17. Fox, N., Hunn, A. and Mathers, N. 2009. Sampling and Sample Size Calculation.
18. Georgiakakis, P. et al. 2005. Evaluating the usability of web-based learning management systems.
19. Preece, J., Benyon, D., Davies, G., Keller, L. and Rogers, Y. 1993. A Guide to Usability: Human Factors in Computing. Reading, MA: Addison-Wesley.
20. Nokelainen, P. 2004. An empirical assessment of pedagogical usability criteria for digital learning material with elementary school students.
21. Fitzpatrick, R. 1999. Usable software and its attributes: a synthesis of software quality, European Community law and human-computer interaction.
22. Kirk, R.E. 2004. Experimental Design.
23. Stamelos. 2003. Software evaluation problem situation.
24. Srivastava, S.C., Chandra, S. and Lam, H.M. 2006. Usability Evaluation of E-Learning Systems.
25. Ssemugabi. 2006. Usability evaluation of a web-based e-learning application: a study of two evaluation methods.
26. Srivastava, S.C., Chandra, S. and Lam, H.M. 2006. Usability Evaluation of E-Learning Systems. Research paper.
27. Hadjerrouit, S. 2010. Developing Web-Based Learning Resources in School Education: A User-Centered Approach. Interdisciplinary Journal of E-Learning and Learning Objects, Volume 6.


SAS Institute Inc. 2005. Concepts of Experimental Design: Design Institute for Six Sigma. A SAS White Paper. TSE Wai-kin. 2003 Report on Usability Evaluation of Web-based Learning Sites Wong B, Nguyen TT, Chang E, and Jayaratna N 2003. Usability metrics for e-learning. workshop on human computer interface for semantic web and web applications. William Horton and Katherine Horton, 2000, A consumer’s guide for trainers, teachers, educators, and instructional designers, E-learning Tools and Technologies, book. William H. Mitchell, 2003, UML, the Unified Modeling Language, Software Engineering,. Zaharias, P. 2004. Usability and e-Learning: The road towards integration. ACM eLearn Magazine, Vol.2004 (6). Anonymous. Design of Experiments. 2000. Available at https://www.moresteam.com/toolbox/designof-experiments.cfm. Last access 4/3/2013. Anonymous E-Learning Management Electronic System. 2011. Available at http://elearning.kau.edu.sa/content.aspx. Last access 2/3/2013 Anonymous E-learning. 2012. Available at http://elearning.kau.edu.sa/content.aspx. Last access 2/3/2013. Gilbert Cockton. 2011. Usability Evaluation: Comparing Four Competing Models in E-Learning System Acceptance. Available at http://elearning.kau.edu.sa/content.aspx. Last access 2/3/2013. IEEE Std. 610.12. 1990. IEEE Standard Glossary of Software Engineering Terminology IEEE Std. 1061. 1998. Software Quality Metrics Methodology ISO/IEC, 9241-11. 1998. Ergonomic requirements for office work with visual display terminals (VDT)s - Part 11 Guidance on usability. ISO/IEC 9126-1:2001. 2000. Software engineering -- Product quality -- Part 1: Quality model. ISO/IEC 9126. 2003. Software Engineering – Product Quality Int’l Standard. ISO/IEC CD 25010. 2009. Software engineering - software product quality requirements and evaluation (SQuaRE) - quality model and guide. M.C.A. SEM. 2013. SYSTEM ANALYSIS AND DESIGN.. Available at www.mu.ac.in/.../MCA%20study%20material/M.C.A.%20(Sem%20 last access 3/3/2013. 
Will Wai-kit Ma (Hong Kong Shue Yan College, Hong Kong) and Allan H.K Yuen.2005 . Comparing Four Competing Models in E-Learning System Acceptance. Available at http://www.irmainternational.org. Jarel Remick. 2011. Web applications. Available at http://web.appstorm.net. Last access 14/5/2013. Anonymous. 2012. Web based applications. Available at http://www.techopedia.com. Last access 14/5/2013.

AUTHOR PROFILE

108

JOURNAL OF SOFTWARE ENGINEERING & INTELLIGENT SYSTEMS ISSN 2518-8739 31st December 2016, Volume 1, Issue 2, JSEIS, CAOMEI Copyright © 2016 www.jseis.org

The RATE framework development from cognitive bias during requirement engineering processes

1Muhammad Hassnain, 2Imran Ghani

1Army Public College of Management & Sciences, Rawalpindi, Pakistan
2Monash University, Malaysia
1Email: [email protected], [email protected]

ABSTRACT
Developing a framework for a system's reliability, accuracy, tolerance and estimation in the context of human factors is a substantial undertaking. This research examines the impact of human factors on requirement engineering processes. The proposed Reliability, Accuracy, Tolerance and Estimation (RATE) framework comprises four components, each accompanied by relevant sub-components. We present the nominal scores of all sixteen sub-components involved in requirement engineering processes. Executing the method yields significant values for all sub-components, supporting their retention in the framework. The results show that software-industry stakeholders are concerned about cognitive bias throughout the requirement engineering processes. All four components and their sub-components were retained, as study participants responded in favour of keeping them in the refined RATE framework.
Keywords: RATE framework; cognitive bias; software damaging; requirement engineering processes; reliability; estimation

1. INTRODUCTION

Human design errors occur during software design and coding, and through misinterpretation of requirements. The total number of faults introduced over the software development life cycle is unknown to end users, so these errors or faults must be estimated. An error must be observed and detected before it can be fixed. Software errors lead to software failures, which in turn cause system failures. Corrected errors are errors that have been fixed but not yet integrated into the current software version; verified errors are corrected errors that have also been verified against the currently released version of the software [1]. Project management and software development are knowledge-intensive activities, since human factors are involved at every link. Fault forecasting is a significant task that requires practical determination within the software engineering discipline. If a method for forecasting the software damage rate is developed and applied, it can serve as a tool for building reliable software. Measuring the faults that human factors introduce during requirement engineering is an active research area; it requires framing cognitive biases, activities, errors and attributes within the non-functional requirements of a system. Determining the impact values of individual cognitive biases, errors or attributes, and activities during the requirement engineering process is the concern of this research. This paper examines the development of the RATE framework and correlates the sub-components of its reliability, accuracy, tolerance and estimation components. The framework is evaluated through the interpretation of responses from participants in the software development industry. Software success or failure depends on decision making in circumstances that may produce distorted perception and incorrect judgment; if correct decision making is lacking during requirement engineering processes, software failures may result.

Early forecasting of the software damage rate arising from cognitive biases, errors and activities deserves special attention from researchers. Software designers cannot design sound software projects if cognitive biases are left unconsidered.

2. RELATED WORKS

This section reviews the literature related to the research topic and explores the cognitive biases that arise during software requirement engineering processes. It also reviews previous research on software estimation frameworks and models that incorporate human factors.

2.1 Cognitive bias types
Cognitive biases are mental or cognitive behaviours that prejudice the quality of judgments or decisions. The tendency to regard a pre-selected option as superior to unselected ones is known as a default bias. Wilkenson and Klaesin [2] defined cognitive bias as systematic error in human decision-making. Various cognitive biases have been identified, but most studies considered only one or two human factors. Previous research found some cognitive biases to be overlapping; the unrealistic optimism bias, for example, closely resembles the valence effect in meaning. Because of these interconnected and similar biases, complexes of biases ("biasplexes") were worked out; biasplexes had both positive and negative consequences for project development [3]. Software defects are also caused by deviations of human thought from the laws of mathematics and logic, although this assertion has been substantiated by little empirical evidence to date [4].

2.2 Cognitive system and human decision making
Zhang et al. [5] examined two cognitive bias factors, loss aversion and conservatism, in real-world decision-making scenarios. Conservatism means that beliefs are adjusted insufficiently when new information appears; it affects judgment accuracy, overestimates low probabilities and underestimates high probabilities, and so results in inaccurate decision making in real-world scenarios. Loss aversion, the disproportionate preference for avoiding losses over acquiring gains, was also shown to influence decision making. The two cognitive biases were found to co-exist in most situations and to jointly affect people's decisions. The same work explored two estimation techniques, expected return (ER) and bootstrap.
The expected return technique helped study participants show rationality in decision making, while bootstrap made participants more consistent. Such selections sometimes coincide with real-world situations, where cognitive biases make decision making costly. That work was one of the initial steps towards cognitive systems that could improve human decision making.

2.3 Fuzzy set theory
Gonzalez-Carrasco et al. [6] found that fuzzy set theory provides information about the uncertainty inherent in human knowledge. Real-time systems have embedded fuzzy set theory in areas such as decision making, and engineers and researchers have therefore preferred its use; however, the inherent uncertainties of fuzzy logic have raised concerns about its application. Effort estimation is the basis for the accurate estimation of software systems, so project management, in the sense of minimising risk and allocating resources better, is best facilitated by sound effort estimation. Liu et al. noticed that the root cause of module failure is human error. Diversity in human factors plays an effective role in overcoming this issue, although its benefits are difficult to quantify. They developed a human reliability analysis approach based on a fuzzy logic model that maps common causes of failure to a component degradation value [7]. Effort estimates for software development have remained very low, which leads to poor management of project plans and failures. This bias arises because information that is not relevant to the actual effort nevertheless affects software developers. A survey [8] ran three sets of tests with 374 developers from various outsourcing companies as participants. It examined the connection between estimation bias and several developer dimensions, including thinking style, self-construal, skills, education, nationality and experience. Most of the studied dimensions were found to be connected with estimation bias, and the bias increased with higher interdependence.

2.4 Naive Bayes classifier
Software estimation and prediction approaches based on quantitative models are the main means of controlling project risk in the early phases of the software development life cycle. Software projects suffer from uncertainty and instability, but early estimates and predictions built on limited data can be updated as new data is acquired during execution. A Naive Bayes classifier was tested for effectiveness on data collected from 104 projects, and its application was compared to a Poisson regression model across all phases of the project development life cycle. Experimental results showed that, measured by the area under the curve (AUC), the Naive Bayes classifier was more accurate than the Poisson regression model; however, the model was limited when compared with other approaches on different data sets [9].
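To make the classifier comparison above concrete, the following is a minimal hand-rolled Bernoulli Naive Bayes sketch. The feature names, training data and class labels are invented for illustration only; they are not the projects or features used in [9].

```python
import math
from collections import defaultdict

def train_bernoulli_nb(samples, labels, alpha=1.0):
    """Fit class priors and per-class Bernoulli feature probabilities
    with Laplace smoothing (alpha)."""
    classes = sorted(set(labels))
    counts = {c: defaultdict(int) for c in classes}
    totals = {c: 0 for c in classes}
    for x, y in zip(samples, labels):
        totals[y] += 1
        for f, v in x.items():
            counts[y][f] += v
    features = {f for x in samples for f in x}
    priors = {c: totals[c] / len(labels) for c in classes}
    probs = {c: {f: (counts[c][f] + alpha) / (totals[c] + 2 * alpha)
                 for f in features} for c in classes}
    return priors, probs

def predict(x, priors, probs):
    """Return the class with the highest log posterior for binary features x."""
    best, best_lp = None, float("-inf")
    for c in priors:
        lp = math.log(priors[c])
        for f, p in probs[c].items():
            lp += math.log(p if x.get(f, 0) else 1.0 - p)
        if lp > best_lp:
            best, best_lp = c, lp
    return best

# Invented binary early-phase risk indicators -- not the features used in [9].
projects = [
    {"unstable_reqs": 1, "new_team": 1, "tight_deadline": 1},
    {"unstable_reqs": 1, "new_team": 0, "tight_deadline": 1},
    {"unstable_reqs": 0, "new_team": 0, "tight_deadline": 0},
    {"unstable_reqs": 0, "new_team": 1, "tight_deadline": 0},
]
outcomes = ["risky", "risky", "safe", "safe"]
priors, probs = train_bernoulli_nb(projects, outcomes)
print(predict({"unstable_reqs": 1, "new_team": 1, "tight_deadline": 0}, priors, probs))  # risky
```

The appeal of this family of models for early-phase prediction is exactly what [9] exploits: the fitted probabilities can be re-estimated cheaply as new project data arrives.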

110

2.5 Mental model
A mental model was designed around the concept of success. Success can only be achieved through contributions towards it; economic gain, its main objective, cannot be controlled by project leaders and developers, and software developers cannot reduce the impacts of political influence and inappropriate deadlines. A concept of requirement compliance was introduced that measures the degree of correspondence between defined and implemented requirements. Success was measured from the implemented requirements; by contrast, unsolicited features and uncompleted requirements reduced the success of software projects [10].

2.6 Software requirement engineering processes
Mohanani et al. [11] note that "requirement engineering" refers both to a collection of activities and to the academic field in which those activities are studied. Requirement engineering is broadly defined to include activities such as understanding, documenting, specifying, communicating and modifying problems; other aspects include users, goals, non-functional requirements and agents. The same study found that designers' creativity is affected by requirements: because designers are sensitive to framing, even minor changes in requirement engineering processes can have strong effects, and this sensitivity is related to the framing effects of cognitive biases. Emotions have been reported as a key concern for people's behaviour. Software engineering is an intensive, capital activity of humans, so emotion management clearly belongs to the software profession; emotions were found to be key factors to take into account in keeping software requirements stable [12]. Marnewick et al. [13] investigated the factors behind poor-quality requirements and found human factors to be affecting requirement quality. As an alternative view of communication skill, they suggested compensating for poor-quality requirements through a trust relationship between customers and requirement engineers, which enables requirement engineers to elicit high-quality requirements from users. Communication as a human factor cannot be resolved through a requirements solution; it is gained by talking with experienced people, because different people use a variety of communication tools. A requirement engineer who lacks the ability to communicate with customers creates a problem, because the requirement engineer is responsible for initiating the communication.

3. PROPOSED MODEL

The idea behind the Reliability, Accuracy, Tolerance and Estimation (RATE) framework is to include the quality attributes of software measurement in terms of cognitive bias impacts. The framework comprises Reliability, Accuracy, Tolerance and Estimation components and was designed after a thorough literature review on the role of cognitive bias in software requirement engineering. It is intended to forecast the damage rate produced by cognitive bias and, as its name indicates, to investigate software reliability, accuracy and tolerance and to forecast the software damage rate. Its components and sub-components are described in Table 1. The components of the RATE framework form a single sequence, but this does not mean the software is stuck if some cognitive bias, requirement error, quality attribute or performed activity is not measured. Figure 1 shows the four components of the framework with their sub-components. Software reliability is acquired through all-time availability together with self-descriptiveness, expandability and freedom from damage. Software accuracy is acquired if communication errors during requirement elicitation are avoided or handled properly; however, handling communication errors alone cannot guarantee that a system is accurate. Elicited requirements also need to be interpreted accurately to make the system's working more accurate, and requirement interpretation can be affected by anchoring, which in turn affects software accuracy. Together, communication error, misinterpretation of requirements and anchoring signify how precise the software is in its performance. Achieving the reliability and accuracy components also requires software tolerance as a separate system requirement, as shown in the RATE framework diagram. The tolerance component addresses cognitive concerns, human bias and software size perception; these factors affect system tolerance, and in serious circumstances their impacts must be neutralised. In the final component of the RATE framework, these sub-components of the preceding three components are estimated. Forecasting is achieved when socio-technical intervention and judgment are used fairly in decision making. The framework will be introduced in a number of software development organizations with the aim of structuring their requirement engineering processes, and it will serve as a reference for training requirement engineers, software analysts, developers and managers. Measuring the impact of human bias during requirement engineering processes remains the main feature of the proposed RATE framework. As the sequence in Figure 1 shows, the framework involves the requirement engineering processes and encompasses the non-functional requirements; each NFR is related to four sub-components, which assume their importance as the requirement engineering processes proceed. The degrees of human bias, errors produced and activities involved in requirement elicitation, analysis, validation and documentation are all covered. The framework measures the impact value of the 'availability' cognitive bias under the reliability component; cognitive availability directly affects the requirement elicitation processes. To test this proposition, the framework measures how cognitive availability influences core non-functional features such as self-descriptiveness and expandability: if cognitive availability is measured above 0.60, it affects self-descriptiveness and expandability, which damages software reliability. In the accuracy component, the anchoring bias leads towards an appropriate solution but is affected by human errors in communication and misinterpretations of requirements during requirement elicitation, analysis and validation. The framework shows that if the anchoring bias is adversely affected by communication error and misinterpretation of requirements, the precision of the software's results is reduced.
Cognitive concerns and human bias directly affect the software's fault-tolerance capability. The framework measures human bias, cognitive concerns and software size perception; if a value exceeds 0.60, the fault-tolerance NFR of the system is affected. Estimation is a vital component of the framework that forecasts the values of the preceding components and sub-components. Cognitive bias affects judgment in decision making; because requirement engineering processes depend on the correct judgment and decisions of clients as well as requirement engineers, socio-technical interventions are used to overcome the problems produced by cognitive bias.

Table 1. Description of RATE framework components and sub-components

Reliability: Reliability is a fundamental non-functional requirement of a system. Software reliability is derived from its sub-components availability, self-descriptiveness, expandability and damaging. Software is reliable if it is available 24 hours a day; once availability is ensured, it must also show self-descriptiveness and expandability. Combined, availability, self-descriptiveness and expandability make the system free of any damage to its reliability.

Accuracy: The accuracy component is linked to the preceding reliability component, because reliable software is expected to show maximum accuracy. However, software has accuracy concerns due to communication errors between requirement engineers and clients, requirement misinterpretation and the anchoring bias. To remove, say, a 10% accuracy error, communication errors, requirement misinterpretation and anchoring-bias concerns are handled so as to attain 100% software accuracy.

Tolerance: Software tolerance is linked to the preceding accuracy component. Software is tolerant if its accuracy and reliability aspects are attained; however, system tolerance is affected by cognitive concerns, human bias and software size perception. To overcome these impacting factors, input neutralization is applied in extreme situations to increase software tolerance.

Estimation: The software reliability, accuracy and tolerance achieved through the earlier three components of the RATE framework must be quantified and estimated to forecast the damage rate, if any exists. The estimation component depends on the forecasting activity, socio-technical interventions and judgment; applying these factors properly, good decision making can result in RATE-ed requirements.
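The component/sub-component structure of Table 1, together with the paper's 0.60 impact threshold, can be read as a small data model. The sketch below is our own illustration: the class and method names are invented, while the threshold and the reliability sub-component values (the paper's reported nominal scores) come from the text.

```python
from dataclasses import dataclass, field

IMPACT_THRESHOLD = 0.60  # per the paper: values above 0.60 affect the related NFR

@dataclass
class Component:
    name: str
    # sub-component name -> measured impact value in [0, 1]
    sub_components: dict = field(default_factory=dict)

    def impacted_sub_components(self):
        """Sub-components whose measured value exceeds the threshold."""
        return [s for s, v in self.sub_components.items() if v > IMPACT_THRESHOLD]

reliability = Component("Reliability", {
    "availability": 0.63,
    "self-descriptiveness": 0.65,
    "expandability": 0.64,
    "damaging": 0.60,
})
print(reliability.impacted_sub_components())
# ['availability', 'self-descriptiveness', 'expandability']
```

Note that with a strict `>` comparison a value of exactly 0.60 (as for "damaging") does not count as impacting; the paper's wording "exceeds 0.60" suggests this reading, but it is a modelling choice.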

Figure 1. RATE framework

4. RESEARCH METHODOLOGY

The primary aims of this research are to develop the RATE framework from the existing literature and to use it to measure the cognitive biases, activities and errors that occur during requirement engineering processes. A quantitative research method was selected, based on the responses of professionals from the software industry.

4.1 Hypothesis development
The hypothesis is based on the nominal scoring analysis:

H0: Cognitive biases, errors and activities measured from non-functional requirements have no effect on the software development life cycle.
H1: Cognitive biases, errors and activities measured from non-functional requirements have effects on the software development life cycle.

If H1 is accepted, then the estimation of attributes or errors and the measurement of activities during requirement engineering processes inform us about their impact on software quality.

4.2 Data collection
Data was collected from experienced software requirement engineers, system analysts and project managers in the following software development organizations operating in Pakistan:

1. MAKABU Islamabad
2. Quality Web Programmer
3. Discretelogix
4. WADIC
5. Pakistan Revenue Automation Limited

All study participants were sent an invitation explaining the research objectives; when they accepted, a link to the questionnaire was emailed to them.

4.3 Survey technique
An online survey was designed to collect information from the target participants through a questionnaire. The comprehensive questionnaire includes propositions about cognitive bias impacts, errors or attributes, and related activities during requirement engineering processes, as given in Appendix B. Participants' opinions about the causes and impacts of software quality attributes were obtained using a 5-point Likert scale.

Table 2. Participants' designations

Designation            Number
Requirement Engineer   8
System Analyst         5
Project Managers       12
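The paper does not spell out how the 5-point Likert responses were converted into an "average nominal score". A plausible convention, assumed here purely for illustration, maps the five points onto 0, 0.25, 0.5, 0.75 and 1 and averages them per proposition; the response data below is invented.

```python
# Assumed mapping of 5-point Likert responses onto a 0..1 nominal scale;
# the paper does not state its actual scoring rule.
LIKERT_VALUE = {
    "strongly disagree": 0.00,
    "disagree": 0.25,
    "neutral": 0.50,
    "agree": 0.75,
    "strongly agree": 1.00,
}

def average_nominal_score(responses):
    """Average the mapped values of all responses to one proposition."""
    values = [LIKERT_VALUE[r.lower()] for r in responses]
    return round(sum(values) / len(values), 2)

# Invented answers from 25 participants to a single proposition.
answers = (["agree"] * 14 + ["strongly agree"] * 4 +
           ["neutral"] * 5 + ["disagree"] * 2)
print(average_nominal_score(answers))  # 0.7
```

Under this convention a score of 0.60 roughly corresponds to the panel leaning towards agreement on average, which is consistent with how the paper uses its 0.60 retention threshold.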

4.4 Validity
The validity of a research study concerns the trustworthiness of its results; it also refers to the extent to which the results are appropriate and free of the researchers' subjective bias [14]. Internal as well as external validity were maintained throughout the study. For internal validity, causal relations were examined, and threats were controlled by grouping related elements in the questionnaire. External validity was maintained by emphasising the impacts of human bias, errors and activities occurring during requirement engineering processes.

5. RESULTS AND DISCUSSION

Table 3 gives the average nominal score of the propositions relevant to software reliability, accuracy, tolerance and estimation. Availability, a sub-component of reliability, is a cognitive bias. Because we are investigating the impacts of cognitive bias during requirement engineering processes through the RATE framework, we have merged the cognitive biases, the errors that occur during requirement engineering processes, and the attributes and activities performed for forecasting the damage rate. To determine whether these activities, attributes, errors and cognitive biases have serious impacts on software quality, we set a criterion for including or excluding each factor: if its average nominal score is ≥ 0.60, the factor retains its position in the refined RATE framework; otherwise it is rejected.

Table 3. Decision table from average nominal scores

Component    Sub-component                       Average Nominal Score   Retained in final RATE framework
Reliability  Availability                        0.63                    Yes
             Self-descriptiveness                0.65                    Yes
             Expandability                       0.64                    Yes
             Damaging                            0.60                    Yes
Accuracy     Communication Error                 0.69                    Yes
             Misinterpretation of Requirements   0.69                    Yes
             Anchoring                           0.63                    Yes
             Precision                           0.64                    Yes
Tolerance    Cognitive Concerns                  0.68                    Yes
             Human Bias                          0.65                    Yes
             Software Size Perception            0.69                    Yes
             Input Neutralization                0.64                    Yes
Estimation   Forecasting                         0.68                    Yes
             Socio-Technical Interventions       0.67                    Yes
             Judgment                            0.67                    Yes
             Correct Decision Making             0.63                    Yes
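The inclusion criterion behind Table 3 (retain a sub-component if and only if its average nominal score is at least 0.60) can be written down directly. The scores below are the values reported in Table 3; the function name is our own.

```python
# Average nominal scores as reported in Table 3, grouped by RATE component.
SCORES = {
    "Reliability": {"Availability": 0.63, "Self-descriptiveness": 0.65,
                    "Expandability": 0.64, "Damaging": 0.60},
    "Accuracy": {"Communication Error": 0.69, "Misinterpretation of Requirements": 0.69,
                 "Anchoring": 0.63, "Precision": 0.64},
    "Tolerance": {"Cognitive Concerns": 0.68, "Human Bias": 0.65,
                  "Software Size Perception": 0.69, "Input Neutralization": 0.64},
    "Estimation": {"Forecasting": 0.68, "Socio-Technical Interventions": 0.67,
                   "Judgment": 0.67, "Correct Decision Making": 0.63},
}

def retention_decisions(scores, threshold=0.60):
    """Retain a sub-component iff its average nominal score >= threshold."""
    return {comp: {sub: value >= threshold for sub, value in subs.items()}
            for comp, subs in scores.items()}

decisions = retention_decisions(SCORES)
# All sixteen sub-components meet the criterion, matching Table 3.
print(all(kept for subs in decisions.values() for kept in subs.values()))  # True
```

Because every reported score is at least 0.60, every sub-component is retained, which is the result the paper reports for the refined framework.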

5.1 Discussion Reliability as a component of framework has availability, self-descriptiveness, expandability and damaging subcomponents. The focus has been made to investigate the relevancy of availability bias to reliability component. From results in Table 3, we can see that availability that is a cognitive bias also works as software availability. Upon the average nominal score of all sub-components, we can say that the propositions made about each of attribute, activity performed, cognitive bias and errors are true. Our claims of truth about the propositions are supported by the opinions, views and exact answers on a scale from requirement engineers, project managers and system analysts. Software reliability is ensured when requirements are self-descriptive. Expandability as another feature of reliability determines the success of a software when new extensions are made in the existing functions of a system. Software damage rate determination impacts the future expandability of systems’ requirements. From results, it has become clear that all four sub-components of Reliability component remain the essential part of RATE Framework, because four sub-components determine the software reliability. Sequence shown in the Figure 1 determines the interaction between cognitive bias, activities, errors and software quality attributes. For example, damaging is a risk produced from errors during requirement engineering processes. Requirement engineers, system analysts and project managers present their observation in real time projects. From results, it has become clear that a correlation exists between cognitive bias and software damage rate. Software damaging is a big issue that can be only avoided if software reliability is ensured through its non-functional features like selfdescriptiveness and expandability. Accuracy component of RATE Framework encompasses the communication error, misinterpretation of requirements, anchoring and precision. 
Software damaging is produced from impacts of human factors. It requires time to solve the issues emerged from poor communication. If communication error is not handled properly then poor quality requirements become the requirement document. In response to proposition “Does poor communication during requirement engineering processes produce the poor-quality requirements?” the nominal score is found to be 0.72. It confirms that study participants are seriously concerned about communication errors. In another proposition “Does Misinterpretation of requirements lead towards the software failure?” the nominal score is 0.75 that too shows concerns of requirement engineers, system analysts and project managers. However, requirement engineers must be conscious about anchoring cognitive. In response to proposition “Does Anchoring bias impact negatively to success of software?” we have achieved 0.55 that shows low impact of anchoring on software accuracy. On this individual proposition, we cannot exclude the anchoring bias because average nominal score 0.63 is found. The average nominal score has been taken from Appendix A. Precision is attained when preceding two errors and an anchoring bias are eliminated. Misjudge or misconception of requirements results into elicitation of low quality requirements. Sometimes, software elicitation personally from client cannot reveal the exact required functionalities of a system. Clients do not have sufficient knowledge about a system. It results into software failure. Software success is also impacted from the anchoring bias. In addition to anchoring bias, confirmation bias also impacts on the software quality. Anchoring and confirmation bias are taken as input for precision estimation. Fault tolerance is an important non-functional requirement (NFR) of a system. It has been included as a main component of RATE Framework. 
Cognitive concerns, human bias and software size perception are cognitive bias and input neutralization acts as a software quality attribute. Fault tolerance is a non-functional requirement of a system that is derived from requirement engineering processes. Human bias directly impacts on fault tolerance feature of a system. Software size perception produces the barrier for fault tolerance estimation of a system. Software development teams are concerned over the misperception of software size. Sometimes, requirement engineers recall their past experience when they come across the misconception of a software size. However, past

115

JOURNAL OF SOFTWARE ENGINEERING & INTELLIGENT SYSTEMS ISSN 2518-8739 31st December 2016, Volume 1, Issue 2, JSEIS, CAOMEI Copyright © 2016 www.jseis.org experience does not always produce the accurate and complete software estimation as it is tainted by human psychological effects. Now cognitive bias is focused to find the system neutralization from extreme inputs. Impacts of earlier three cognitive biases of Tolerance component of RATE framework are totally eliminated by the execution of input neutralization as a solution to make the system more faults tolerant. Among three cognitive biases, software size misconception is more emphasized because it is commonly used for measuring the software size in software development. However, cognitive concerns and human bias are also found to be impacting factors on software tolerance from average nominal score. Software damage rate estimation is a core of this research study that relates with other components of RATE Framework. Reliability, accuracy and tolerance as Non-Functional requirements are measured at the earlier phase of requirement engineering. Estimation as a component of RATE Framework has four sub-components such as forecasting, socio-technical interventions, judgment and correct decision making. Under the forecasting subcomponent, implementation of software damage rate forecasting is checked whether it avoids or not rework on the software application development. Forecasting becomes the input for sub-component socio-technical interventions. If cognitive concerns add the problems for accurate software damage rate forecasting, then socio-technical interventions may solve the issue of low quality requirements. It is known from results that socio-technical interventions can ameliorate the human factors, which have adverse impacts if not prevented. 
Human bias factors produce errors, and socio-technical interventions propagated during the requirement engineering processes diffuse the impacts of these factors. Judgment, or perception, is a human factor that affects software forecasting: better judgment yields more accurate forecasting of software success. If requirement analysis and validation, as key requirement engineering processes, are handled with correct judgment, the errors produced in the early phase of the software development life cycle can be removed. The proposition "Do requirement engineers make correct decisions to produce better requirements for designing and coding?" received a nominal score of 0.69, which signifies the importance of correct decision making in forecasting the software damage rate.

5.2 Hypothesis acceptance

The results and the analysis above show that hypothesis H1 is accepted while hypothesis H0 is rejected. Acceptance of H1 clearly indicates that the cognitive biases, errors and activities measured during the requirement engineering processes influence the quality of the software, which in turn determines the software damage rate.

6. CONCLUSIONS AND FUTURE WORK

This research aimed to develop the RATE framework for assessing the impacts of cognitive bias during the software requirement engineering processes, and to examine how cognitive biases relate to the reliability, accuracy, tolerance and estimation components of the framework. Several propositions about software estimation affected by cognitive biases were presented. Cognitive biases including confirmation, anchoring and adjustment, solution-first bias and availability were identified from previous studies, along with how these biases connect with cognitive system models. Mental models and the Naïve Bayes classifier were also examined with respect to cognitive bias impacts. The impact of cognitive bias on the early phase of software development has remained a gap in earlier research. The design of the RATE framework is described in detail, together with its components and sub-components. Reliability, accuracy and tolerance, as non-functional requirements, are further composed from the cognitive biases and errors produced during the requirement engineering processes. Cognitive bias impacts and errors are measured to forecast the success or failure of the software. The average nominal score of each of the sixteen sub-components was determined to decide its inclusion in, or exclusion from, the refined RATE framework. In future work, levels of the sub-components of the RATE framework can be worked out. The impact level of cognitive bias is also a future research area: low-, middle- and higher-level impacts on cognitive bias, errors and activities can be examined.

ACKNOWLEDGEMENT

Special thanks to Abasyn University Islamabad Campus, Pakistan and Army Public College of Management & Sciences, Rawalpindi, Pakistan for providing support to complete this research.


AUTHOR PROFILE

Muhammad Hassnain completed his MS degree in Computer Science at Abasyn University Islamabad Campus, Pakistan. He received his MCS from PMAS University of Arid Agriculture, Rawalpindi. Currently, he is working as a Lecturer in Software Engineering at Army Public College of Management and Sciences, Rawalpindi, Pakistan. His research interests are software engineering, requirements engineering, software testing and quality assurance.

Dr. Imran Ghani is a Senior Lecturer at Monash University Malaysia. He received his Master of Information Technology degree from UAAR (Pakistan), his M.Sc. in Computer Science from UTM (Malaysia) and his Ph.D. from Kookmin University (South Korea). His research interests are software architecture and design, software engineering, secure software development, and software application development using agile methods.

Appendix A: Nominal scoring

Questions 1 and 2 of the survey recorded the designations of participants and their experience in number of years. For each of questions 3 to 60, the table lists the number of responses per Likert category (weighted Strongly Disagree = 1 through Strongly Agree = 5), the Total Points (number of respondents x 5), the Total Score (sum of count x weight, as printed in the original), and the Nominal Score (Total Score / Total Points).

Q | SD | D | N | A | SA | Total Points | Total Score | Nominal Score
3 | 1 | 8 | 9 | 3 | 2 | 115 | 66 | 0.57
4 | 0 | 7 | 7 | 9 | 1 | 120 | 76 | 0.63
5 | 2 | 2 | 7 | 9 | 3 | 115 | 78 | 0.68
6 | 2 | 2 | 9 | 5 | 4 | 110 | 73 | 0.66
7 | 1 | 5 | 7 | 6 | 4 | 115 | 77 | 0.67
8 | 1 | 5 | 9 | 6 | 2 | 115 | 72 | 0.63
9 | 0 | 7 | 5 | 11 | 1 | 120 | 78 | 0.65
10 | 3 | 1 | 6 | 9 | 1 | 100 | 64 | 0.64
11 | 1 | 4 | 6 | 7 | 1 | 95 | 60 | 0.63
12 | 0 | 5 | 8 | 8 | 2 | 115 | 66 | 0.57
13 | 1 | 10 | 5 | 2 | 4 | 110 | 64 | 0.58
14 | 1 | 4 | 9 | 7 | 3 | 120 | 79 | 0.66
15 | 1 | 5 | 8 | 4 | 5 | 115 | 76 | 0.66
16 | 1 | 3 | 3 | 9 | 5 | 105 | 77 | 0.73
17 | 0 | 4 | 6 | 7 | 5 | 110 | 79 | 0.72
18 | 1 | 4 | 5 | 8 | 3 | 105 | 71 | 0.68
19 | 1 | 5 | 5 | 7 | 4 | 110 | 74 | 0.67
20 | 2 | 3 | 5 | 12 | 1 | 115 | 76 | 0.66
21 | 1 | 6 | 7 | 2 | 7 | 115 | 77 | 0.70
22 | 0 | 3 | 5 | 9 | 5 | 110 | 82 | 0.75
23 | 2 | 5 | 6 | 5 | 4 | 110 | 70 | 0.64
24 | 3 | 2 | 8 | 5 | 4 | 110 | 71 | 0.65
25 | 1 | 4 | 2 | 12 | 2 | 105 | 61 | 0.58
26 | 2 | 4 | 8 | 7 | 1 | 110 | 60 | 0.55
27 | 1 | 2 | 3 | 12 | 3 | 105 | 77 | 0.73
28 | 2 | 4 | 6 | 9 | 1 | 110 | 69 | 0.63
29 | 1 | 5 | 7 | 5 | 3 | 105 | 67 | 0.64
30 | 0 | 6 | 7 | 5 | 2 | 100 | 63 | 0.63
31 | 2 | 4 | 7 | 6 | 3 | 110 | 70 | 0.64
32 | 1 | 3 | 5 | 9 | 4 | 110 | 78 | 0.71
33 | 0 | 3 | 8 | 5 | 5 | 105 | 75 | 0.72
34 | 2 | 7 | 2 | 7 | 3 | 105 | 65 | 0.62
35 | 0 | 6 | 6 | 8 | 2 | 110 | 72 | 0.65
36 | 1 | 4 | 4 | 11 | 1 | 105 | 64 | 0.61
37 | 2 | 5 | 8 | 4 | 3 | 110 | 67 | 0.61
38 | 2 | 4 | 3 | 9 | 2 | 100 | 65 | 0.65
39 | 0 | 2 | 8 | 8 | 1 | 95 | 65 | 0.68
40 | 2 | 3 | 4 | 9 | 3 | 105 | 71 | 0.68
41 | 1 | 4 | 6 | 8 | 4 | 115 | 79 | 0.69
42 | 2 | 2 | 8 | 7 | 2 | 105 | 68 | 0.65
43 | 1 | 3 | 5 | 9 | 4 | 110 | 78 | 0.71
44 | 0 | 4 | 8 | 8 | 4 | 120 | 84 | 0.70
45 | 1 | 5 | 7 | 8 | 1 | 110 | 69 | 0.63
46 | 0 | 6 | 8 | 5 | 2 | 105 | 66 | 0.63
47 | 1 | 3 | 8 | 8 | 2 | 110 | 73 | 0.66
48 | 0 | 4 | 7 | 7 | 4 | 110 | 77 | 0.70
49 | 2 | 6 | 4 | 7 | 4 | 115 | 74 | 0.64
50 | 0 | 4 | 9 | 8 | 3 | 120 | 82 | 0.68
51 | 2 | 4 | 3 | 9 | 4 | 110 | 75 | 0.68
52 | 0 | 5 | 5 | 8 | 3 | 105 | 72 | 0.69
53 | 3 | 1 | 7 | 9 | 2 | 110 | 72 | 0.65
54 | 2 | 6 | 4 | 9 | 2 | 115 | 72 | 0.66
55 | 2 | 4 | 6 | 7 | 4 | 115 | 76 | 0.66
56 | 2 | 2 | 3 | 10 | 5 | 110 | 80 | 0.72
57 | 2 | 3 | 7 | 5 | 3 | 100 | 64 | 0.64
58 | 1 | 5 | 3 | 11 | 3 | 115 | 79 | 0.69
59 | 1 | 6 | 6 | 9 | 2 | 120 | 67 | 0.56
60 | 2 | 5 | 6 | 6 | 4 | 115 | 74 | 0.64
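The nominal score defined in Appendix A (Nominal Score = Total Score / Total Points, with responses weighted 1 to 5) can be sketched as follows. This is an illustrative reconstruction: the function name and the use of Python are the editor's assumptions, not part of the original study.

```python
def nominal_score(counts):
    """Compute the Appendix A nominal score from Likert response counts.

    counts: five response counts, ordered from Strongly Disagree
    (weight 1) to Strongly Agree (weight 5).
    """
    total_points = sum(counts) * 5  # each respondent could award at most 5 points
    total_score = sum(c * w for c, w in zip(counts, range(1, 6)))
    return round(total_score / total_points, 2)

# Question 3 of the survey: 1 SD, 8 D, 9 N, 3 A, 2 SA responses
print(nominal_score([1, 8, 9, 3, 2]))  # → 0.57
```

For question 3, for example, the total points are 23 x 5 = 115, the total score is 1 + 16 + 27 + 12 + 10 = 66, and 66/115 rounds to 0.57, matching the table.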

Appendix B: Survey Questionnaire — Software Damage Rate Forecasting

Dear Participants,

You are invited to take part in a study on the development of the RATE framework for forecasting the software damage rate during the requirement engineering processes. The study examines the impacts of human factors on the success and quality of software development in the software industry of Pakistan, and investigates which human factors can be analysed to enhance software quality through software damage rate forecasting. It is conducted at Abasyn University Islamabad Campus to complete my degree of MS Computer Science. I therefore request you to complete the questionnaire below; the information provided will be used for research purposes only. Thank you.

Please use the following rating criteria for questions 3 to 60: 1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = strongly agree.

Reliability: a component of the RATE framework
Availability: a sub-component of Reliability
Does the availability heuristic (recalling information from what has just happened) impact the software damage rate?
Does availability bias remain consistent across damage rate estimation for all types of systems?
Does availability exist in the anchoring of human issues?

Self-Descriptiveness: a sub-component of Reliability
Does measuring the effort for self-descriptiveness support the estimation of the software success or failure rate?
Does inadequate information create perceptual issues for a client who lacks understanding of future software expansion?
Does self-descriptiveness always supersede other software reliability features such as consistency, correctness and completeness?

Expandability: a sub-component of Reliability
Does requirements expandability influence software estimation?
Does software damage rate estimation provide the right direction for future requirements expandability?
If requirements expandability is misjudged, does the client suffer more than the judging authority?

Damaging: a sub-component of Reliability
Does a correlation exist between cognitive bias and the software damage rate?
Is it right that the software damage rate can be determined from the software requirement specification?
Is software damage rate estimation from human factors a reliable forecast?

Accuracy: a component of the RATE framework
Communication Error: a sub-component of Accuracy
Do requirement changes result from poor communication between system stakeholders?
Does poor communication between client and requirement engineer impact software estimation?
Does poor communication during the requirement engineering processes produce poor-quality requirements?
Does communication error delay the software requirement specification?
Does communication involve a human process that cannot be solved through an engineering solution?
Does software damage result from communication errors between requirement engineers and clients?

Misinterpretation of Requirements: a sub-component of Accuracy
Does the requirement gathering phase provide accurate software damage rate estimation?
Does misinterpretation of requirements lead towards software failure?
Do good development teams deliver good projects through better communication?

Anchoring: a sub-component of Accuracy
Does anchoring bias produce less accuracy in software damage rate estimation?
Does anchoring, as information processing, have a close relation with mental simulation and story generation?
Does anchoring bias negatively impact the success of software?
Does confirmation bias occurring during the requirement engineering processes impact software quality?

Precision: a sub-component of Accuracy
Do anchoring and adjustment biases prevent precise estimation?
Is precision an important software quality measurement parameter?
Is software accuracy highly derived from precision in software success/failure rate estimation?
Does software accuracy depend upon the precision of software estimation?

Tolerance: a component of the RATE framework
Cognitive Concerns: a sub-component of Tolerance
Is the fault tolerance of software derived from the requirement engineering processes?
Do cognitive concerns perceived early increase the level of system tolerance?
Is it necessary for the requirement specification to be tolerant of incompleteness?
Do cognitive concerns about the software affect human judgment of software damage rate forecasting?

Human Bias: a sub-component of Tolerance
Does human bias produce requirement inconsistencies that reduce the fault tolerance level of software?
Is software fault tolerance that is free of human bias more vital than freedom from technical faults?
Do the requirement engineer and the client face a judgment bias that increases system intolerance?
Do reliable requirements come from understanding stakeholders' emotions?
Do emotions, as human perception and judgment behaviours, affect software quality?

Software Size Perception: a sub-component of Tolerance
Does software size create a barrier to system fault tolerance estimation?
Does misperception of software size create concerns for the development team?
Does self-perception sometimes force recalling past experience?
Is recalling past experience, which can be incomplete and inaccurate, also tainted by psychological effects?

Input Neutralization: a sub-component of Tolerance
Does neutralizing cognitive bias enable the system to tolerate the impacts of extreme inputs?
Does solution-first bias lead the system's stakeholders to frame a solution for human cognitive issues they do not understand?
Does confirmation bias aggravate solution-first bias, leading towards poor software design decisions?

Estimation: a component of the RATE framework
Forecasting: a sub-component of Estimation
Does cognition-based software estimation increase the efficiency of software damage rate estimation?
Do you think decision makers apply informed judgment to ensure better software damage rate forecasting?
Is damage rate forecasting implemented to avoid rework on software applications?
Is software failure/success forecasting more reliable in the early phases of software development?

Socio-Technical Interventions: a sub-component of Estimation
Do socio-technical interventions, which ameliorate the impacts of human factors and the resulting faults, increase software quality?
Can interventions be propagated during the software requirement engineering processes to overcome the errors produced by biasing factors?
Do socio-technical interventions require training to avoid social issues and to build skills?

Judgment: a sub-component of Estimation
Is judgment a natural phenomenon, or is it attained through experience in the software industry?
Does better judgment always reveal an accurate forecast of the software failure rate?
Does human behaviour influence judgment capability in measuring the software damage rate?

Correct Decision Making: a sub-component of Estimation
Do requirement engineers make correct decisions to produce better requirements for designing and coding?
Can correct decision making help to estimate the software damage rate appropriately?
Can requirement analysis and validation through correct judgment remove the errors observed in the early phase of the software development life cycle?


Usability of information systems software in Pakistan from the users' perspective

1Ayesha Shafeeq, 2Saqib Ali
1Department of Information Technology, Faculty of Science & Technology, Government College Women University, Faisalabad, Pakistan.
2Department of Computer Science, Faculty of Sciences, University of Agriculture, Faisalabad, Pakistan.
Email: [email protected], [email protected]

ABSTRACT

The software industry is growing rapidly due to the advancement of new technology. Software projects have to maintain their quality and user-friendly characteristics in order to survive in a competitive environment, so no one can deny the significant role of human-computer interaction (HCI) in the improvement and development of information system software. Human interaction properties such as perception, usability, efficiency and cognition play an important role in delivering customer-oriented information systems. This study examines the usability of information systems from the end users' perspective; usability means that end users can use a system with ease to achieve their goals. Results show that in Pakistan most information system developers do not attend to the usability property of HCI during the software development process, which is the root cause of unfriendly, low-quality information systems. The overall usability of information systems in Pakistan is very low. It is suggested that integrating usability throughout the software development life cycle could produce more customer-oriented, successful, user-friendly and better-quality information systems.

Keywords: software projects; software usability; customer oriented; human computer interaction; HCI; software development; information systems

1. INTRODUCTION

The software industry is like an umbrella that covers almost every part of human life, contributing software products that are used from the health sector to banking, from education institutes to civil works, and for the marketing and production of goods [1]. Developing accurate and usable software products is now mandatory for software development companies. Human-computer interaction (HCI) plays an important role in building the interaction between humans and computers in this era of technology. Two terms are very important in HCI: functionality and usability. The efficiency of a system depends on the proper balance between these two components [2]. To build good software systems, the standards of software engineering (SE) and HCI should be merged and implemented; improvements are made in both SE and HCI to meet the requirements of dynamic customers [3]. SE is mainly concerned with gathering requirements and understanding user needs to develop a process model in the design phase, whereas HCI deals especially with user review and develops a user-centred process model [4]. According to ISO 9241-11, efficiency, effectiveness and satisfaction of a product are the three cornerstones of HCI. Usability is an important term in the space of HCI: its purpose is to make the system easy for people to learn, to ensure that the system's functionality facilitates people's tasks, and to make the system easy to use [5]. The main objectives of usability are utility, efficiency, learnability, attitude, predictability, synthesizability and generalizability. To measure system usability, the System Usability Scale (SUS) is adopted in this research. The SUS was designed by John Brooke as a simple ten-item Likert-scale questionnaire used for subjective assessment of a system. It is a commonly used scale worldwide for measuring the overall usability of systems [6]. Usability emerged to deal with user frustration caused by poorly designed systems. HCI gives weight to human psychology and merges it with computer technology; the aim of usability is to develop a user-oriented system built on a full understanding of the users [7]. Usability plays a vital role in the information system interface: when users confront an unfriendly system, they feel cognitive stress and are unable to use the system effectively. System usability is very important in this era of rapid technology, but in Pakistan it has not been given the attention it deserves. The main objective of this study is to perform usability testing on different information systems to uncover usability-related problems.

2. LITERATURE REVIEW

Usability is the cornerstone of human-computer interaction (HCI). Unusable interfaces cost companies a large amount because users cannot use the system effectively to complete their tasks [8]. HCI enhances system efficiency through user-oriented design; users cannot get enough information from unusable interfaces. E-learning organizations that do not implement usability on their websites increase frustration for both users and administration [9]. In usability, utility means users can accomplish the task they want to carry out; efficiency means ease of use, so that users can carry out their tasks with minimum error; learnability concerns the user's ability to operate the system competently to achieve maximum performance; and robustness means support for users toward their intended goal [10]. A good user interface is flexible, provides error recovery procedures, requires minimal training and is easy to adopt [11]. With the invention of 3D interfaces, the complexity of user interaction with systems is increasing. Usability problems lead to users casting off the system and to user frustration, and the error rate increases as usability decreases [12]. Usability can be defined in terms of numerous attributes, but Nielsen considered learnability, efficiency, memorability, low error rate and satisfaction the most significant [13]. According to Rubin and Chisnell [14], usability includes the attributes of usefulness, efficiency, effectiveness, satisfaction and accessibility. Usability is an instrument used to quantify user perception of a system, whether the system is a website, a software application interface, an operating device or a management information system [15]. Usability plays an important role in the success of a system or product: if a system functions well but has low usability, it is on the edge of failure [16]. In this customer-oriented era, end users decide the fortune of any system, because in the end the system is used and evaluated by them on the basis of usability. Acceptance of any system by its users is now built on the foundation of usability; if a system is not user friendly and is difficult to use, users may reject it. The quality of a system is interrelated with its usability [17]. The System Usability Scale (SUS) is normally administered after the respondent has had a chance to evaluate the system being used; respondents are asked to record their immediate reaction to and experience of the system. One advantage of the SUS is that it can measure usability with small sample sizes [18]. The SUS consists of two factors, usable items and learnable items, and is a very popular instrument for subjective assessment at the end of a test [19].

3. METHODOLOGY

A deep literature review was carried out to understand all the concepts associated with this research, and the attributes for software system evaluation were then filtered. The research proceeded in two stages. In the first stage, a questionnaire survey of users of different information systems was conducted. Questionnaires were prepared to obtain quantitative data, and an online questionnaire was designed so that users could find it on the internet and respond instantly. John Brooke's System Usability Scale (SUS) was used as the evaluation criterion: the questionnaire contains 10 items, each with five response options. To score it, subtract 1 from the scale position for the odd-numbered items (1, 3, 5, 7 and 9), and subtract the scale position from 5 for the even-numbered items (2, 4, 6, 8 and 10); the sum of these contributions is multiplied by 2.5 to obtain the overall usability of the system [20]. The survey questionnaire based on the SUS is shown in Figure 1. The sample comprised 100 users of different software systems from different cities of Pakistan, and each information system was evaluated on this scale from the users' perspective. The types of systems selected are shown in Table 1.

Table 1: Information system type

Type of information system | Frequency | Percent
MIS | 48 | 48.0
DSS | 15 | 15.0
Expert System | 12 | 12.0
Artificial Intelligence System | 4 | 4.0
TSS | 21 | 21.0
Total | 100 | 100.0
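The SUS scoring rule described above can be sketched in code. This is an illustrative sketch: the function name and the sample responses are the editor's assumptions, not part of the survey instrument.

```python
def sus_score(responses):
    """Compute a System Usability Scale (SUS) score.

    responses: ten Likert answers (1-5), in questionnaire order.
    Odd-numbered items are positively worded (contribution = answer - 1);
    even-numbered items are negatively worded (contribution = 5 - answer).
    The summed contributions (0-40) are multiplied by 2.5, giving 0-100.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly ten responses")
    total = 0
    for item, answer in enumerate(responses, start=1):
        total += (answer - 1) if item % 2 == 1 else (5 - answer)
    return total * 2.5

# A hypothetical respondent who strongly agrees (5) with every positive item
# and strongly disagrees (1) with every negative item scores the maximum:
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # → 100.0
```

For instance, a respondent answering 3 (neutral) to every item scores 50, the midpoint of the 0-100 range.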

Figure 1: System Usability Scale questionnaire

4. RESULTS AND DISCUSSIONS

4.1 How frequently is the information system used?

Respondents were asked about the frequency of their system usage. It was found that 37% reported strongly disagree, 6% disagree, 3% neutral, 7% agree and 47% strongly agree, as shown in Table 2.

Table 2: How frequently the system is used

Response | Frequency | Percent
Strongly disagree | 37 | 37.0
Disagree | 6 | 6.0
Neutral | 3 | 3.0
Agree | 7 | 7.0
Strongly agree | 47 | 47.0
Total | 100 | 100.0

The graph of the system usage is shown in Figure 2.

Figure 2: Information system usage

4.2 Complexity of the information system

Users should feel comfortable while using the system, and this is possible only when the system is not complex. Table 3 shows that the majority of respondents strongly agreed that the system is complex; only 16% strongly disagreed. Figure 3 illustrates the responses in graphical form.

Figure 3: Information system complexity

Table 3: Complexity of the system

Response | Frequency | Percent
Strongly disagree | 16 | 16.0
Disagree | 6 | 6.0
Neutral | 3 | 3.0
Agree | 13 | 13.0
Strongly agree | 62 | 62.0
Total | 100 | 100.0

4.3 Easy to use

Respondents were asked about the ease of use of the system, and only a small number agreed that the system is easy to use: 32% of respondents reported strongly disagree, 16% disagree, 21% neutral and 26% strongly agree, as shown in Figure 4.

Figure 4: Easy to use

This result indicates that software developers have failed to cater to the needs of users.

4.4 Need of a technical person

Survey results indicate that extra effort is needed to understand the system: users required a technical person to understand it. Figure 5 shows that 48% admitted needing a technical person while using the system, while 25% did not require any technical help.

Figure 5: Need of a technical person

4.5 Integration of the functions

Integration of the functions means that all components of the system work well and coordinate with each other. Survey findings show that 48 out of 100 users strongly agreed on the integration, as can be inferred from Table 4.

Table 4: Integration of the functions

Response | Frequency | Percent
Strongly disagree | 18 | 18.0
Disagree | 11 | 11.0
Neutral | 13 | 13.0
Agree | 10 | 10.0
Strongly agree | 48 | 48.0
Total | 100 | 100.0

4.6 Inconsistency in the system 25% were disagreed,8% disagree.10% neutral,18% agree and 39 % were strongly agreed on the about the issue of inconsistency of the system as shown in the Figure 6. This showed that majority of the users feeling inconsistency regarding the system.

127

JOURNAL OF SOFTWARE ENGINEERING & INTELLIGENT SYSTEMS ISSN 2518-8739 31st December 2016, Volume 1, Issue 2, JSEIS, CAOMEI Copyright © 2016 www.jseis.org

Figure. 6 Inconsistency in the system

To develop a good software system, the developer should minimize inconsistency in the system.

4.7 Learning of the system
A good interactive system is quick and easy to learn, so that users can easily learn it and perform their tasks. Findings show that 36% strongly disagreed with the learnability of the system and 30% strongly agreed, as shown in Table 5.

Table. 5 Learning of the system

           Strongly disagree  Disagree  Neutral  Agree  Strongly Agree  Total
Frequency  36                 11        10       13     30              100
Percent    36.0               11.0      10.0     13.0   30.0            100.0

4.8 Cumbersome to use
Users cannot feel comfortable if the system is hard to use; a difficult system requires a lot of time to understand. Figure 7 shows that 34% disagreed that the system was cumbersome, while 30% strongly agreed that it was.


Figure. 7 Cumbersome system

4.9 Confidence while using the system
Figure 8 shows that, out of 100 respondents, 38% did not feel confident while using the system, while 29% strongly agreed that they felt confident.

Figure. 8 Users' confidence while using the system

4.10 Learn a lot from the system
One of the characteristics of a successful system is that it enhances the learnability of its users. 25% of respondents strongly disagreed that they gained any advanced learning from the system, 18% disagreed, 21% were neutral, 13% agreed and 23% strongly agreed, as can be inferred from Table 6.

Table. 6 Learn from the system

           Strongly disagree  Disagree  Neutral  Agree  Strongly Agree  Total
Frequency  25                 18        21       13     23              100
Percent    25.0               18.0      21.0     13.0   23.0            100.0

4.11 Overall information systems usability
In SUS, a system whose usability score is greater than 68 is considered above average. Table 7 shows that only 7 systems meet this usability criterion, which shows that in Pakistan there is a strong need to implement HCI in the development process of information systems.

Table. 7 Overall systems usability

Figure. 9 Overall systems usability
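The 68-point threshold used above follows the standard SUS scoring rule (Brooke [18]). As a minimal sketch, not part of the original study (the function names are illustrative), a SUS score can be computed from one respondent's ten 1-5 Likert answers:

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from the ten
    1-5 Likert responses of one respondent, using standard SUS scoring."""
    if len(responses) != 10:
        raise ValueError("SUS needs exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd-numbered items are positively worded (score - 1);
        # even-numbered items are negatively worded (5 - score).
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # scale the 0-40 raw sum to 0-100

def above_average(score):
    # A SUS score above 68 is conventionally treated as above average.
    return score > 68
```

For example, answering 5 to every positive item and 1 to every negative item yields the maximum score of 100, while answering 3 throughout yields 50, below the 68 threshold.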

5. CONCLUSION AND RECOMMENDATIONS
Results show that in Pakistan the usability of information systems is very low. The main reason for this low usability is that the rules of interaction were applied only in the late stages of information system development. The user-centred design (UCD) approach should be implemented in the information system development process, as shown in Figure 10. This will enhance the involvement of users and, at the same time, increase the usability of the projects. The main goal of the user-centred design approach is to reflect the usability of systems from the users' perspective. ISO standard 13407 endorses UCD for interactive systems.

Figure. 10 User-centred design approach

Thus, it is clearly revealed that the usability of systems can be achieved by merging HCI and the user-centred approach with the traditional software development life cycle. In the development process, the user is like a foundation which should not be forgotten. Users' participation, experiences, and cognitive and psychological features must be considered.

ACKNOWLEDGEMENT
The authors would like to pay special thanks to IT Empire Faisalabad, Pakistan and all the IT personnel for providing support to carry out this research. The authors would also like to thank the Department of Computer Science, University of Agriculture, Faisalabad, Pakistan for supporting this research.

REFERENCES
1. Calp, M. H., & Akcayol, M. A. (2015). Risk analysis and achievement levels of the software projects realized at technoparks. Journal of Duzce University Science and Technology, 3(1).
2. Holzinger, A. (2005). Usability engineering methods for software developers. Communications of the ACM, 48(1), 71-74.
3. Mayhew, D. J. (1999, May). The usability engineering lifecycle. In CHI '99 Extended Abstracts on Human Factors in Computing Systems (pp. 147-148). ACM.
4. Rosson, M. B., & Carroll, J. M. (2002). Usability engineering: scenario-based development of human-computer interaction.

5. Gould, J. D., & Lewis, C. (1985). Designing for usability: key principles and what designers think. Communications of the ACM, 28(3), 300-311.
6. Cappel, J. J., & Huang, Z. (2007). A usability analysis of company websites. Journal of Computer Information Systems, 48(1), 117-123.
7. Marquis, K., Nichols, E., & Tedesco, H. (1998). Human-computer interface usability in a survey organization: getting started at the Census Bureau. In Proceedings of the Section on Survey Research Methods, American Statistical Association (pp. 416-420).
8. Shahid, S., & Abbasi, M. S. Usability testing of an e-learning system: a comparative study of two evaluation techniques. IOSR Journal of Computer Engineering, 1(16), 39-43.
9. Squires, D., & Preece, J. Predicting quality in educational software: evaluating for learning, usability and the synergy between them. Interacting with Computers.
10. Myers, B. A. (1998). A brief history of human-computer interaction technology. ACM Interactions, 5(2), 44-54.
11. Ghahramani, B. (1998). U.S. Patent No. 5,724,262. Washington, DC: U.S. Patent and Trademark Office.
12. Eason, K. D. (1984). Towards the experimental study of usability. Behaviour & Information Technology, 3(2), 133-143.
13. Seffah, A., Mohamed, T., Habieb-Mammar, H., & Abran, A. (2008). Reconciling usability and interactive system architecture using patterns. Journal of Systems and Software, 81(11), 1845-1852.
14. Rubin, J., & Chisnell, D. (2008). Handbook of usability testing: how to plan, design and conduct effective tests. John Wiley & Sons.
15. Yengin, I., Karahoca, A., & Karahoca, D. (2011). E-learning success model for instructors' satisfactions in perspective of interaction and usability outcomes. Procedia Computer Science, 3, 1396-1403.
16. Faisal, C. M. N., Faridi, M. S., & Javed, Z. (2011, October). Usability evaluation of in-housed developed ERP system. In 2011 International Conference on Graphic and Image Processing (pp. 82850C-82850C). International Society for Optics and Photonics.
17. Alshamari, M., & Mayhew, P. (2009). Technical review: current issues of usability testing. IETE Technical Review, 26(6), 402-406.
18. Brooke, J. (1996). SUS: a quick and dirty usability scale. Usability Evaluation in Industry, 189(194), 4-7.
19. Lewis, J. R., & Sauro, J. (2009, July). The factor structure of the System Usability Scale. In International Conference on Human Centered Design (pp. 94-103). Springer Berlin Heidelberg.
20. Bangor, A., Kortum, P. T., & Miller, J. T. (2008). An empirical evaluation of the System Usability Scale. Intl. Journal of Human-Computer Interaction, 24(6), 574-594.

AUTHOR PROFILE
Ayesha Shafeeq is pursuing her MS degree in Computer Science at the University of Agriculture, Faisalabad. She works as a lecturer in the IT Department, GCWUF. She secured 3rd position in MIT from Virtual University in 2014 and also completed an MA in English from Punjab University, Lahore in 2007. Her research interests include human-computer interaction, cyber security frameworks and cloud computing.

Dr. Saqib Ali is an Assistant Professor at the Faculty of Sciences, Department of Computer Science, University of Agriculture, Faisalabad. He is an HEC approved supervisor. His research interests are in networking, distributed computing technology, multi-channel multi-radio high-speed wireless networks, network performance monitoring and interference optimization.
