Computers in Human Behavior 29 (2013) 2776–2787
Predicting different conceptualizations of system use: Acceptance in hedonic volitional context (Facebook)

Muhammad Z.I. Lallmahomed, Nor Zairah Ab.Rahim, Roliana Ibrahim, Azizah Abdul Rahman
Faculty of Computer Science and Information Systems, Universiti Teknologi Malaysia, 81310 UTM Skudai, Johor, Malaysia
Article history: Available online 9 August 2013

Keywords: System use; Usage measures; System use taxonomy; Facebook; Social networking sites (SNSs); UTAUT
Abstract

This research examines the relationship between the predictors of use and different conceptualizations of system use in a hedonic, volitional setting (Facebook). Using the unified theory of acceptance and use of technology (UTAUT) model, an investigation into the three aspects of system use (the user, the system and the task) was carried out. Results from a cross-sectional survey of 449 students show that behavioral intention has a significant influence on all aspects and dimensions of system use, including cognitive absorption and deep structure use. Performance expectancy, effort expectancy and social influence are significantly related to system use. In the component model, performance expectancy is significant only with deep structure use. Hedonic performance expectancy is found to be significantly related to cognitive absorption. Results also demonstrate that predictors of usage have a significant relationship with the user aspect of system use. The variance explained in usage conceptualized as the user/task aspects is much higher than that of the system/task aspects or one-dimensional measures. Overall, conceptualizing system use through the user/task aspects offers greater explanatory power in Facebook use.

© 2013 Elsevier Ltd. All rights reserved.
1. Introduction

System use is one of the core constructs in information systems (IS) research (Benbasat & Zmud, 2003), yet it has received relatively little attention to date despite being studied since the 1970s (Lucas, 1973). In the IS success literature, system use is argued to be an essential component and is listed as a key path through which benefits from a system can be obtained (DeLone & McLean, 1992, 2003). Straub, Limayem, and Karahanna (1995) contend that system use is a necessary pathway for managers to assess the impact of their systems, and hence a surrogate for measuring professional performance. IS acceptance research treats system use as the ultimate test against which the predictors of use are evaluated (Venkatesh, Brown, Maruping, & Bala, 2008; Venkatesh, Morris, Davis, & Davis, 2003), and it is often correlated with attitude and behavioral intention (Wu & Wu, 2005). Davis (1993) argues that user acceptance is a key determinant of the success or failure of an IS, underlining the fact that acceptance models lead to the prediction of usage and that, from that use, success or failure can be known. Research that includes system use has typically operationalized the construct in an omnibus fashion (Burton-Jones & Straub, 2006), which leaves little room for further analysis and for deriving practical implications.
Past measures of system use are too simplistic (DeLone & McLean, 2003; Gallivan, Spitler, & Koufaris, 2003; Hennington, Janz, Amis, & Nichols, 2009; Straub et al., 1995), and hence our understanding of the relationship between predictors of use and usage conceptualizations remains limited (Benbasat & Barki, 2007; Jasperson, Carter, & Zmud, 2005; Venkatesh et al., 2008). The large number of system use measures available in the literature has made it difficult to build a cumulative research tradition and complicates the comparison of research findings (Straub et al., 1995). Several scholars have called for more research on system use, both on the construct itself and on the relationships between its predictors and outcomes (Burton-Jones & Straub, 2006; Jasperson et al., 2005; Venkatesh et al., 2008). Current research on system use has largely been limited to utilitarian environments such as office-related applications. Burton-Jones and Straub (2006) proposed a definition of system use based on three aspects, a user, a system and a task, and tested their approach in the system use to performance nomology. Barki, Titah, and Boffo (2007) proposed the IS use-related activity (ISURA) construct to identify tasks that individuals perform using information technology. Venkatesh et al. (2008) investigated the effects of three predictors of use (facilitating conditions, behavioral intention and behavioral expectation) on frequency, duration and intensity of use and found significant positive relationships with all three dimensions of system use. More recently, Sun and Teng (2012) developed the information system use (ISU) construct to understand usage in organizational settings, focusing on tasks carried out by employees.
In this research, we examine system use from an IS acceptance perspective in a volitional hedonic setting. We aim to investigate the relationship between predictors of system use and usage dimensions/aspects in social networking sites (SNSs), particularly Facebook. System use is a critical component in understanding SNS adoption because these systems require their subscribers' interaction and engagement in order to be successful (Lin & Lu, 2011; Sledgianowski & Kulviwat, 2009). SNSs must keep their members actively involved if they are to attract top advertisers and generate revenue (Grabner-Kraeuter & Waiguny, 2011). Underutilization of SNSs may result in their rejection (Friedman, 2010; Gaudin, 2009; Perez, 2011), loss of revenue and even job layoffs (Chmielewski & Guyn, 2011; Diana, 2011). Moreover, the extent of the competition for advertisers, both internationally and locally as well as with companies such as Google or Yahoo, makes it harder for SNSs to acquire and retain subscribers (Chang & Zhu, 2011; Cusumano, 2011; Krasnova, Kolesnikova, & Günther, 2011). Furthermore, growth in the number of SNSs has slowed (Chang & Zhu, 2011; Lankton, McKnight, & Thatcher, 2011), and half of new subscribers are reported to abandon their SNS accounts after subscribing (Li, 2011). The long-term viability of SNSs resides in their ability to attract new users and retain existing ones (Bhattacherjee, 2001; Chang & Zhu, 2011; Kim & Oh, 2011). Hence, we argue that researching post-acceptance use will improve our understanding of system use aspects/dimensions and identify areas that are underutilized. This will allow SNS administrators to develop possible interventions based on the use of their systems and ultimately help in attracting new subscribers and retaining existing ones. We therefore ask three questions: (1) Which dimensions of system use do IS acceptance predictors really predict? (2) What impact do the three aspects of system use (user, system and task) have in the UTAUT model? (3) Which aspects of use are relevant to Facebook? The rest of this paper is organized as follows: in the next section, we discuss system use aspects and dimensions. We present the research model and hypotheses in Section 3. In Section 4, the research methodology and data collection process are described. Section 5 reports the data analysis and results, followed by a discussion of these results in Section 6. In Sections 7 and 8, we present the limitations and implications of our findings. Finally, we provide conclusions in the last section.
2. Theoretical background

2.1. System use: Definition, dimension and aspect

Burton-Jones and Straub (2006) define individual-level system use as "an individual user's employment of one or more features of a system to perform a task" (p. 231). This definition rests on three assumptions. It argues for a 'user', the person who employs the system. The system denotes an information system that is being used in specific tasks. Tasks are activities performed by the user to achieve specific goals. System use is argued to be a construct whose dimensions and measures vary according to the context of the research. Hence, not all aspects of use may be necessary, and researchers can compromise between the user, system and task aspects (Burton-Jones & Straub, 2006). Dimensions of use refer to the types of system use employed (Venkatesh et al., 2008), e.g. duration and frequency, and each dimension may assess an aspect of use. Based on the work of Burton-Jones and Straub (2006) and Lallmahomed, Ab.Rahim, Ibrahim, and Rahman (2011), each dimension (type) of system use can have different measures, e.g. Likert scales, direct input from the user or computer logs. In this research, 'dimension' refers to the different types of system use. Fig. 1 depicts a graphical representation of these relationships.
Fig. 1. System use aspects and dimensions.
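As a concrete illustration of the hierarchy in Fig. 1, the following sketch (ours, not part of the original instrument; the groupings follow Fig. 1, the research model in Section 3 and the items in Appendix B) encodes aspects, their dimensions and example measures as a simple data structure:

```python
# Illustrative only: a compact encoding of the aspect -> dimension -> measure
# hierarchy described above. The dictionary layout is an assumption of ours;
# the groupings themselves follow Fig. 1 and Appendix B.
SYSTEM_USE_TAXONOMY = {
    "user": {
        "dimensions": ["cognitive absorption"],
        "example_measures": ["Likert items for immersion, enjoyment, control, curiosity"],
    },
    "system": {
        "dimensions": ["frequency", "volume", "intensity", "variety"],
        "example_measures": ["self-reported frequency per day", "time spent per day",
                             "rated intensity of use", "feature checklist"],
    },
    "task": {
        "dimensions": ["deep structure use"],
        "example_measures": ["Likert items on employing features for specific tasks"],
    },
}

if __name__ == "__main__":
    for aspect, detail in SYSTEM_USE_TAXONOMY.items():
        print(f"{aspect} aspect -> dimensions: {', '.join(detail['dimensions'])}")
```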
Venkatesh et al. (2008) use the word 'conceptualizations' of use instead of 'dimensions' in their study. System use has commonly been operationalized using unidimensional measures such as duration or frequency (Hennington et al., 2009). One-dimensional measures of use do not take the context of the research into consideration (DeLone & McLean, 2003) and oversimplify system use by implying that more usage equals more benefits (Doll & Torkzadeh, 1988). Fig. 2 shows the model used by Burton-Jones and Straub (2006) to assess system use in the performance nomology. They report a significant relationship between usage and short-term performance. Venkatesh et al.'s (2008) research shows that their model (see Fig. 3) explains 65% of the variance in duration of use, 60% in frequency of use and 60% in intensity of use. Venkatesh et al. (2008) recommend further research on other dimensions of use, especially cognitive absorption and deep structure usage.
Fig. 2. Use to performance nomology. (Source: Burton-Jones and Straub, 2006)

Fig. 3. Predictors of usage and use nomology. (Source: Venkatesh et al., 2008)

2.2. Cognitive absorption (user aspect of use)

Cognitive absorption (CA) is defined as "the state of deep involvement with software" (Agarwal & Karahanna, 2000, p. 673) and has mainly been used as an antecedent of perceived ease of use and perceived usefulness in the technology acceptance model (TAM). Agarwal and Karahanna (2000) suggest that a user's experience with an IS helps in the user's evaluation of such systems. Research on CA focuses mainly on investigating the relationship between CA and intention (Jia, Hartke, & Pearson, 2007; Lin, 2009). CA can have both positive and negative outcomes depending on the intended use of a system (Jia et al., 2007). For instance, users experiencing CA are more likely to consider their interaction with an IT artifact useful and effortless, thus increasing their use of such systems (Lin, 2009). This may be beneficial in a hedonic context like Facebook but may be unwanted in an office-related environment. Since CA is used as an indirect predictor of behavioral intention, Burton-Jones and Straub (2006) recast CA as a measure of the user's engagement during use. In line with Agarwal and Prasad (1998) and Webster and Martocchio (1992), we argue that employing an IS in post-acceptance use will inevitably lead the user to experience CA, which may produce positive or negative results. This in turn will affect the user's behavioral intention towards using the system, and there may be a circular relationship between behavioral intention and CA. Based on these assumptions, the state of CA may reflect the user's level of engagement with Facebook. A low level of CA may show that the user is not very satisfied with the IS and may intend to stop using the system in a voluntary setting or to reduce use in a mandatory setting. CA is based on five underlying dimensions: temporal dissociation, focused immersion, heightened enjoyment, control and curiosity (Agarwal & Karahanna, 2000).

2.3. Conceptualizing deep structure use

Deep structure use is defined as "the degree to which a user employs deep structure features of the system in the task" (Burton-Jones & Straub, 2006, p. 238). In the context of system use, conceptualizing deep structure use assumes that each system or application has some underlying structure on which tasks can be modeled. Operationalizing deep structure use requires an understanding of the features that match their underlying tasks. Facebook provides a list of its most popular features on the 'Facebook popular features' page (Facebook.com, 2012). From that list, we constructed items intended to capture the overall use of Facebook, resulting in a 24-item construct. In line with Burton-Jones and Straub (2006), we conceptualize the features at a mid-level of specificity in order to balance parsimony with completeness. Since we constructed the deep structure measures afresh, the content validity of those measures needs to be assessed. Content validity deals with the representativeness of the items used to evaluate a particular construct (Straub, 1989; Straub, Boudreau, & Gefen, 2004), i.e. whether the questionnaire items represent the content of the construct to be measured. Two approaches can be used to assess content validity: a Q-sort or content validity ratios (Straub et al., 2004). We opted for content validity ratios (Lawshe, 1975). Using Facebook's most popular features and the Burton-Jones and Straub (2006) article as guidelines, we needed to identify the core, most essential features used with Facebook. Our preliminary instrument had 24 items. We distributed these items to 10 PhD students who are experienced Facebook users, in line with Lewis, Snyder, and Rainer Jr. (1995) and Straub et al. (2004). The students were asked to rate each item on a 4-point Likert scale, with '1 = Not relevant' and '4 = Essential'. From the results obtained, we calculated the content validity ratio (CVR) (Lawshe, 1975) using the formula:
CVR = (n - N/2) / (N/2),

where n is the number of judges rating an item as either '3 = Important' or '4 = Essential' and N is the total number of judges. Lawshe (1975) used only 'Essential' ratings in the calculation of CVR. In line with Lewis et al. (1995), we counted both 'Important' and 'Essential' ratings in our calculation of CVR, as both denote positive judgments and we wanted to capture the most important/core features as well.
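As an illustration of this screening, the following sketch (ours, not the authors' code; the ratings matrix is randomly generated and purely hypothetical) computes the CVR for each candidate item and the content validity index over the retained items:

```python
# Minimal sketch of the content-validity screening described above:
# 10 judges rate 24 candidate items on a 4-point scale; items are kept
# when CVR = (n - N/2) / (N/2) meets the cut-off, counting ratings of
# 3 ("Important") or 4 ("Essential") as positive (Lewis et al., 1995).
import numpy as np

rng = np.random.default_rng(0)
ratings = rng.integers(1, 5, size=(10, 24))     # hypothetical judges x items matrix

N = ratings.shape[0]                            # number of judges
n = (ratings >= 3).sum(axis=0)                  # judges rating each item 3 or 4
cvr = (n - N / 2) / (N / 2)                     # content validity ratio per item

retained = cvr >= 0.6                           # cut-off used in this study
cvi = cvr[retained].mean() if retained.any() else float("nan")   # mean of retained CVRs
print(f"retained {retained.sum()} of {ratings.shape[1]} items, CVI = {cvi:.2f}")
```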
The significance of each item was set at p < .05, interpreted as more than 50% of the judges rating an item as either essential or important (Lewis et al., 1995; Straub, 1989). For 10 judges, a minimum ratio of 0.62 would be required (Lawshe, 1975). Based on these criteria, we chose to include all items with a CVR of 0.6 and above. We also calculated the content validity index, which is the average percentage of overlap between the test items and the construct domain (Lawshe, 1975); it is the mean of the retained CVRs. The content validity index is 0.78, which suggests that our retained items represent 78% of the content of the important and essential features used with Facebook. Thus, our final measure of deep structure use was reduced from 24 items to 10 items.

3. Model and hypotheses

Since every context is different and each aspect/dimension of use needs to be researched in its specific environment (Burton-Jones & Straub, 2006), we investigate the relationships among several system use aspects, and different models will be tested. Fig. 4 shows the base model for this research. It posits that acceptance predictors are related to post-adoptive system use. System use is conceptualized as a Type II model with reflective first-order and formative second-order constructs (Diamantopoulos, Riefler, & Roth, 2008). The aspects of system use include the user aspect, which comprises CA; the system aspect, which contains frequency, volume, intensity and variety of use; and the task aspect, which includes deep structure use. Based on the work of Burton-Jones and Straub (2006) and Lallmahomed et al. (2011), each dimension of system use is matched to its respective aspect of use. CA is adapted from Agarwal and Karahanna (2000). The system aspect of use dimensions is adapted from Lallmahomed et al.'s (2011) typology of system use measures. Deep structure use is based on the 10 items from our content analysis. Questionnaire items and dimensions of use are shown in Appendix B.

Fig. 4. Proposed model – acceptance to use nomology; dotted lines indicate relationships not posited in the original UTAUT model.

3.1. Performance expectancy

Defined as the "individual belief that the system will help the user to achieve gains in job performance" (Venkatesh et al., 2003, p. 447), performance expectancy has mainly been used as a predictor of intention in several studies, while few researchers have tested its effect on the behavior itself, i.e. system use. The relationship between performance expectancy and usage therefore remains unclear, as do the dimensions of system use that performance expectancy really predicts. Anderson, Schwager, and Kerns (2006) tested the UTAUT model for Tablet PC use and found a significant positive relationship (b = .46, p < .01) between performance expectancy and system use, while all other constructs proved non-significant. Research by Pynoo et al. (2011) on the acceptance of a digital learning environment shows that performance expectancy is a non-significant predictor of use when system use is measured as frequency but a significant predictor when it is measured as duration. Although research testing this relationship is scant, the trend suggests that performance expectancy has a positive relationship with system use; hence we posit that:

H1. Performance expectancy will have a positive effect on system use.
3.2. Effort expectancy

Effort expectancy is defined as "the degree of ease associated with the use of the system" (Venkatesh et al., 2003, p. 450).
Venkatesh et al. (2003) argue that effort expectancy is significant only at the early stages of introducing a new technology, in both voluntary and mandatory contexts. Research testing the direct link between effort expectancy and system use is scant. Anderson et al.'s (2006) research on Tablet PCs in academia, with a sample size of 37, shows that effort expectancy has no significant relationship with usage (b = .20, p > .05). Zhou, Lu, and Wang (2010) assessed the same relationship in mobile banking adoption with a much larger sample of 250 and also found no significant relationship. In contrast, Sapio et al.'s (2010) research on the acceptance of digital television yielded significant results. Possible explanations for these inconsistent results are given by Adams, Nelson, and Todd (1992) and Davis, Bagozzi, and Warshaw (1989), who suggest that ease of use is an important determinant of use at the early stages of the adoption process and later becomes significant only through perceived usefulness. Another reason for the inconsistent results could be that effort expectancy predicts different dimensions of use. Based on the above, the following hypothesis is drawn:

H2. Effort expectancy will not have a positive effect on system use.
3.3. Social influence

Social influence is defined as "the degree to which an individual perceives that important others believe he or she should use the new system" (Venkatesh et al., 2003, p. 451). This construct denotes the extent to which the user's behavior is influenced by his or her social environment, whether peers, family or hierarchical order. Venkatesh et al. (2003) argue that social influence is only significant in mandatory contexts, where the user is required to use the system, and report a weak positive relationship between social influence and system use. Anandarajan, Igbaria, and Anakwe (2002), Chakraborty, Hu, and Cui (2008), Sapio et al. (2010), Venkatesh and Bala (2008) and Zhou et al. (2010) found significant relationships between social influence and system use, while Anderson et al. (2006), Kim, Park, and Lee (2007), Lee and Kim (2009) and Van Raaij and Schepers (2008) report no significant relationship. The mixed results may be due to the underlying technology or the context in which those systems are used (mandatory vs. voluntary).
Since system use has been reconceptualized into three aspects, we argue for more research on the relationship between social influence and use in order to account for the mixed findings in the literature. Hence, the following relationship is posited:

H3. Social influence will have a positive effect on system use.
3.4. Facilitating conditions

Venkatesh et al. (2003) define facilitating conditions as the "degree to which an individual believes that an organizational and technical infrastructure exists to support use of the system" (p. 453). Šumak, Polančič, and Heričko (2010) and Chang, Hwang, Hung, and Li (2007) argue that facilitating conditions such as hardware/software, technical support and being knowledgeable about the system remove impediments to use and thus help in the system's utilization. Results produced by Venkatesh et al. (2003) show that facilitating conditions have a direct effect on use, as posited in the UTAUT model, which explains why a large body of research employing the complete UTAUT model has tested the relationship between facilitating conditions and usage. The overwhelming majority of these articles have found a positive significant relationship with system use (Chang et al., 2007; Im, Hong, & Kang, 2011; Kijsanayotin, Pannarunothai, & Speedie, 2009; Lin & Anol, 2008; Pai & Tu, 2011; Venkatesh et al., 2003; Venkatesh et al., 2008; Wang & Shih, 2009; Zhou et al., 2010; Šumak et al., 2010). Hence, we posit that:

H4. Facilitating conditions will have a positive effect on system use.
3.5. Hedonic performance expectancy

The main thrust of research in IS adoption has concentrated on the utilitarian perspective (Gu, Fan, Suh, & Lee, 2010; Van der Heijden, 2004), i.e. information systems that bring value in terms of performance gains. This is widely reflected in the theories of IS acceptance, namely TAM and UTAUT.
TAM was chiefly used for productivity-oriented gains, while the arrival of new technologies such as the internet, which provides features for entertainment-oriented gains, called for more research into the antecedents of hedonic IS (Gu et al., 2010; Moon & Kim, 2001). "The value of a hedonic system is a function of the degree to which the user experiences fun when using the system" (Van der Heijden, 2004, p. 696). The nature of a utilitarian system is to increase the user's task performance, whereas a hedonic system provides self-fulfilling, pleasurable value designed to be an end in itself (Davis, Bagozzi, & Warshaw, 1992). Further, Lin and Lu (2011) suggest that in the context of a pleasure-oriented IS such as an SNS, enjoyment plays an important role. Research on hedonic systems has produced mixed results: some researchers (Celik, 2011; Gu et al., 2010) have shown that utilitarian constructs such as usefulness outweigh enjoyment, while others (Moon & Kim, 2001) report the contrary. In SNSs, Lin and Lu (2011) found that enjoyment has a stronger significant effect than perceived usefulness on behavioral intention. Van der Heijden (2004) contends that pleasure variables are more appropriate in hedonic contexts than in task-oriented contexts. Sledgianowski and Kulviwat (2009) found perceived playfulness to be the strongest predictor of intention to use (b = .45, p < .01) in their model, and they further report a direct link between perceived playfulness and system use (b = .15, p < .05). In this research, we capture the hedonic construct through hedonic performance expectancy, the extent to which an activity is perceived to be enjoyable in its own right (Davis et al., 1992). The following hypotheses can be drawn from the literature review above:

H5a. Hedonic performance expectancy will have a positive effect on behavioral intention.
H5b. Hedonic performance expectancy will have a positive effect on system use.

3.6. Behavioral intention

In IS acceptance, behavioral intention is defined as the "degree to which a person has formulated conscious plans to perform or not perform some specified future behaviour" (Venkatesh et al., 2008, p. 484). Behavioral intention is conceptualized as an antecedent of behavior (Ramayah, Rouibah, Gopi, & Rangel, 2009) and a predictor of system use (Venkatesh et al., 2008). Based on our literature review, the relationship between behavioral intention and system use is strongly positive. A meta-analysis of these relationships shows that TAM predicts 30% of system use (Burton-Jones & Hubona, 2006) conceptualized as an aggregate measure (Benbasat & Barki, 2007). Venkatesh et al.'s (2008) research sought to predict different conceptualizations of system use; their findings show that behavioral intention explains 45% of the variance in duration of use, 57% in frequency of use and 59% in intensity of use. They did not examine other dimensions of system use in their model. Hence we posit that:

H6. Behavioral intention will have a positive effect on system use.
3.7. Other relationships posited

Rosen and Kluemper (2008), Lin and Lu (2011) and Kwon and Wen (2010) found that perceived usefulness and perceived ease of use are significantly related to intention to use SNSs. Sledgianowski and Kulviwat (2009) tested the relationships between perceived usefulness, ease of use, perceived playfulness, critical mass, perceived trust, subjective norm and behavioral intention.
The authors (Sledgianowski & Kulviwat, 2009) found that, except for subjective norm (which had a negative significant effect), all other predictors exert positive significant effects on intention to use SNSs. Chang and Zhu (2011) found significant relationships between attitude, subjective norm and perceived behavioral control and both pre-adoption and post-adoption intention. Hence, we posit that:

H7. Performance expectancy will have a positive effect on behavioral intention.
H8. Effort expectancy will have a positive effect on behavioral intention.

H9. Social influence will have a positive effect on behavioral intention.
4. Research methodology

4.1. Data collection and measurement

In order to test our hypotheses, we conducted a cross-sectional survey of Facebook users at a large public university in Malaysia with a student population of around 23,843. We used the online questionnaire tool provided by SurveyMonkey.com to build our survey. Questionnaires were sent to the students' email addresses through the information technology department. Using quota sampling, we collected 522 questionnaires. Responses with missing values, duplicates and nonsensical answers were removed, resulting in 449 usable responses for the data analysis. Our respondents consist of 209 undergraduate students and 240 postgraduate students, all of whom use Facebook. Variables were measured on a 7-point Likert scale ranging from 'Strongly disagree' to 'Strongly agree' with a neutral midpoint. As the research was done on SNSs, measures were adapted to fit the context; in order to preserve content validity, measures were adapted from previous research. All items covering the user aspect of system use were implemented using the full model of CA posited by Agarwal and Karahanna (2000). Measures of the system aspect of use include frequency of use, which asks respondents how frequently they use Facebook per day, from 'Not at all' to 'Several times a day'. Volume of use was operationalized as the amount of time spent using Facebook per day, from 'Less than 30 min' to 'From 3 to 4 h'. Intensity of use was measured by asking users to rate their degree of use from 'Very light' to 'Very heavy'. More details on the constructs are given in Appendix B.

5. Findings and discussion

5.1. Descriptive statistics

The majority of our respondents were 20 to 29 years old, and 58% were male. The bulk of the respondents had been using Facebook for at least 3 years, and most were either enrolled in a bachelor's degree or undertaking postgraduate studies. Most respondents were local students; four respondents did not indicate whether they were local or international and were classified as missing. Table 1 shows the demographics of our respondents.

5.2. Measurement model

We used SmartPLS version 2.0 (Ringle, Wende, & Will, 2005) to test our model. SmartPLS is a component-based structural equation modeling (SEM) tool that implements partial least squares (PLS) algorithms, in contrast to covariance-based SEM software such as LISREL or AMOS (Urbach & Ahlemann, 2010).
Table 1. Descriptive statistics (N = 449).

Gender: Male 263 (58.6%); Female 186 (41.4%)
Age (years): 37 (8.2%); 346 (77.1%); 55 (12.2%); 9 (2.0%); 2 (0.4%)
Experience (years): 11 (2.4%); 83 (18.5%); 264 (58.8%); 70 (15.6%); 21 (4.7%)
Origin: Local 308 (68.6%); International 137 (30.5%); Missing 4 (0.9%)
Education: Diploma 6 (1.3%); Bachelors 203 (45.2%); Masters 119 (26.5%); PhD 121 (26.9%)
Table 2. PLS loadings on the second-order factor CA.

CA dimension            PLS outer loading
Temporal dissociation   .55
Focused immersion       .77
Heightened enjoyment    .82
Curiosity               .83
Control                 .75

Note: All loadings are significant at p < .05.
The proposed model in Fig. 4 depicts the latent variables as well as the components of the formative construct system use. SmartPLS does not offer color coding or differences in shape between independent and dependent variables. PLS has several advantages over covariance-based SEM: it is well suited to small sample sizes, has less stringent assumptions about the distribution of the data and supports both reflective and formative constructs (Lee & Chen, 2010). PLS is also suitable in situations where the researcher's goal is to analyze causal predictive relationships (Henseler, Ringle, & Sinkovics, 2009). For these reasons, we opted for SmartPLS rather than LISREL. Before any statistical conclusions can be drawn, convergent and discriminant validity need to be assessed. Convergent validity is established when each measurement item correlates more highly with its underlying latent construct than with other constructs, while discriminant validity is inferred when measurement items relate weakly to constructs other than the one to which they are supposed to be related. In PLS, convergent validity is assessed through item reliability (Cronbach's alpha), composite reliability and average variance extracted (AVE) (Zhang & Li, 2006). Discriminant validity can be examined by comparing the squared correlations between constructs with the variance extracted for a construct (Fornell & Larcker, 1981). Next, we ran a confirmatory factor analysis to check the first-order construct loadings and their significance. Since we operationalized CA as a second-order factor, following the recommendations of Chin (1998) and in line with Lin (2009) and Magni, Susan Taylor, and Venkatesh (2010), the convergent validity of the first-order factors (temporal dissociation, focused immersion, heightened enjoyment, curiosity and control) is determined by the strength of their loadings on the second-order factor CA. Table 2 shows that all CA dimensions load highly on their second-order construct except for temporal dissociation, which was therefore dropped from further analysis.
Table 3. Internal validity.

Construct                              AVE   Composite reliability   Cronbach's alpha
Behavioral intention (BI)              .92   .97                     .96
Cognitive absorption (CA)              .52   .92                     .91
Deep structure use (DS)                .56   .92                     .90
Effort expectancy (EE)                 .74   .92                     .88
Facilitating conditions (FC)           .70   .87                     .79
Hedonic performance expectancy (HPE)   .84   .94                     .91
Performance expectancy (PE)            .76   .93                     .89
Social influence (SI)                  .85   .94                     .91
System aspect of use (Sys)             .61   .83                     .69
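For reference, the statistics reported in Table 3 are conventionally derived from standardized indicator loadings. The sketch below (ours, not the authors' code) reproduces the AVE and composite reliability of the behavioral intention construct from its three loadings reported in Table A1:

```python
# Sketch: conventional formulas for AVE and composite reliability applied to
# the behavioral intention indicators BI1-BI3 (loadings taken from Table A1).
import numpy as np

loadings = np.array([0.959, 0.965, 0.955])        # standardized loadings BI1-BI3

ave = np.mean(loadings ** 2)                      # average variance extracted
error = 1.0 - loadings ** 2                       # indicator error variances
cr = loadings.sum() ** 2 / (loadings.sum() ** 2 + error.sum())   # composite reliability

print(f"AVE = {ave:.2f}, composite reliability = {cr:.2f}")      # ~.92 and ~.97, cf. Table 3
```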
Internal consistency is assessed through composite reliability, which is similar to Cronbach's alpha. The recommended cut-off value for internal consistency is greater than 0.7 (Nunnally, 1967) and greater than 0.80 for composite reliability (Nunnally & Bernstein, 1994). Fornell and Larcker (1981) suggest that significant indicators should load above 0.7 and that the average variance extracted (AVE) should be 0.5 or above. All items loading above 0.7 were retained; variety of use did not load above 0.7 and was dropped from further analysis. As shown in Table 3, the composite reliability of our measures is greater than 0.80 and the AVE exceeds the recommended value of 0.5, denoting acceptable convergent validity. Discriminant validity is established when the square root of the AVE is greater than the correlations among the constructs. As seen in Table 4, the square root of the AVE for each construct is far larger than any of its correlations with the other constructs, suggesting that constructs share greater variance with their own measures than with other constructs (Fornell & Larcker, 1981). Loadings and cross-loadings (Appendix A) demonstrate that each item loads highly on its respective latent construct. Hence, the psychometric properties of our measuring instrument exhibit acceptable convergent and discriminant validity. System use is operationalized as a second-order formative construct, implemented using a two-step approach. First, we ran the model using the first-order factors as reflective measures comprising CA, volume, frequency, intensity and deep structure use. We then used the factor scores obtained as a new formative measure of system use. In order to validate our formative measure, we checked for multicollinearity: we used the factor scores obtained in SmartPLS (Ringle et al., 2005) and ran a regression analysis in SPSS, in line with Andreev, Heart, Maoz, and Pliskin (2009). The cut-off value for collinearity is set at VIF > 10 (Gefen, Straub, & Boudreau, 2000), meaning all values below 10 are acceptable. VIF values range from 1.3 to 4.0, which suggests that multicollinearity was low and under the threshold of VIF < 10.
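The authors ran this multicollinearity screening in SPSS on factor scores exported from SmartPLS; an equivalent check could be scripted as follows (a sketch only; the file name and column names are hypothetical):

```python
# Sketch of a VIF check on exported factor scores (hypothetical CSV export).
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

scores = pd.read_csv("factor_scores.csv")   # hypothetical SmartPLS export, one column per dimension
cols = ["CA", "frequency", "volume", "intensity", "deep_structure"]
X = sm.add_constant(scores[cols])

vifs = {col: variance_inflation_factor(X.values, i)
        for i, col in enumerate(X.columns) if col != "const"}
print(vifs)   # values below the VIF < 10 threshold suggest acceptable collinearity
```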
Table 4. Correlations among latent constructs.

       BI    CA    DS    EE    FC    HPE   PE    SI    Sys
BI     .96
CA     .65   .72
DS     .53   .57   .75
EE     .42   .57   .52   .86
FC     .56   .60   .52   .56   .83
HPE    .61   .67   .54   .53   .50   .89
PE     .52   .55   .59   .57   .53   .67   .87
SI     .46   .59   .41   .44   .44   .50   .50   .92
Sys    .50   .37   .38   .33   .27   .41   .34   .23   .78

Note: Items on the diagonal represent the square root of the AVE.
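The Fornell-Larcker comparison described above can be made explicit with the values in Table 4. The following sketch (ours) checks that each construct's square root of AVE (the diagonal) exceeds its correlations with every other construct:

```python
# Sketch: Fornell-Larcker discriminant validity check using the values in Table 4.
import numpy as np

constructs = ["BI", "CA", "DS", "EE", "FC", "HPE", "PE", "SI", "Sys"]
sqrt_ave = np.array([.96, .72, .75, .86, .83, .89, .87, .92, .78])   # diagonal of Table 4

# lower-triangular correlations from Table 4 (row i vs. earlier constructs)
lower = [
    [.65],
    [.53, .57],
    [.42, .57, .52],
    [.56, .60, .52, .56],
    [.61, .67, .54, .53, .50],
    [.52, .55, .59, .57, .53, .67],
    [.46, .59, .41, .44, .44, .50, .50],
    [.50, .37, .38, .33, .27, .41, .34, .23],
]
corr = np.identity(len(constructs))
for i, row in enumerate(lower, start=1):
    corr[i, :i] = row          # fill lower triangle
    corr[:i, i] = row          # mirror into upper triangle

off_diag = corr - np.diag(np.diag(corr))
ok = sqrt_ave > off_diag.max(axis=0)   # sqrt(AVE) must exceed the largest correlation
print(dict(zip(constructs, ok)))       # True for every construct with these values
```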
We also checked the significance of the loadings using a bootstrapping procedure with 500 sub-samples to generate t-statistics. All loadings are significant at p < .001.
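SmartPLS performs this resampling internally; purely as an illustration of the idea, the sketch below (ours; the data file and column names are hypothetical) bootstraps a single standardized path 500 times and forms a t-statistic from the bootstrap standard error:

```python
# Sketch of the bootstrap logic used to judge significance: resample cases,
# re-estimate a standardized path, and divide the estimate by the bootstrap SE.
import numpy as np
import pandas as pd

data = pd.read_csv("latent_scores.csv")                 # hypothetical: columns BI and system_use
x, y = data["BI"].to_numpy(), data["system_use"].to_numpy()

def path(x, y):
    """Standardized bivariate path (regression slope on z-scores)."""
    zx = (x - x.mean()) / x.std(ddof=1)
    zy = (y - y.mean()) / y.std(ddof=1)
    return np.polyfit(zx, zy, 1)[0]

rng = np.random.default_rng(42)
estimate = path(x, y)
boot = np.array([path(x[idx], y[idx])
                 for idx in (rng.integers(0, len(x), len(x)) for _ in range(500))])
t_stat = estimate / boot.std(ddof=1)                    # 500 sub-samples, as in the paper
print(f"path = {estimate:.2f}, t = {t_stat:.2f}")
```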
5.3. Main structural model

In order to assess the structural model, the path coefficients and the variance explained in the dependent variables are examined (Maldonado, Khan, Moon, & Rho, 2011). A bootstrapping methodology with 500 sub-samples was again used to generate t-statistics. Overall, the structural model (see Fig. 5) shows that all relationships are significant at p < .001 except for that between performance expectancy and system use. Effort expectancy has a non-significant relationship with behavioral intention in our model but a significant relationship with system use. CA, frequency of use, intensity of use, volume of use and deep structure use are significant and load highly on their formative construct, system use. The overall explained variance in behavioral intention is 42% and that in system use is 71%.

Fig. 5. Main model (ns = non-significant, * p < .05, ** p < .01, *** p < .001, R2 = variance explained).

Next, we discuss the hypotheses posited and their outcomes (see Table 5). Overall, 3 out of 10 hypotheses were not supported in the acceptance to use nomology. A moderate relationship was found between hedonic performance expectancy and behavioral intention. This is in line with Lin and Lu (2011) and Sledgianowski and Kulviwat (2009), who found that hedonic predictors have the strongest link with behavioral intention in SNSs. Effort expectancy has a non-significant relationship with behavioral intention. This can be explained by the fact that Facebook has achieved near-ubiquitous use amongst technology-savvy students; hence learning to use the SNS does not contribute to the formulation of their intention to use Facebook in a post-acceptance environment. Even though effort expectancy is non-significant with behavioral intention, it is significant with system use, suggesting that the ease of use associated with Facebook ultimately shapes the usage patterns of Facebook subscribers. Performance expectancy is significant with behavioral intention, which leads us to conclude that in a post-acceptance setting, the ability to share and communicate effectively is one of the factors leading to continued use. Social influence also has a positive relationship with the aggregate measure of system use, suggesting that influence from peers has a direct effect on the students' use of Facebook. Next, we investigate the effect of the usage predictors on the aspects of system use. From Table 6, we show that antecedents of system use better predict multidimensional measures of use than unidimensional ones.
Table 5. Hypotheses and results.

H1   Performance expectancy will have a positive effect on system use                     .08 (1.82) ns    Not supported
H2   Effort expectancy will not have a positive effect on system use                      .18 (4.98)***    Not supported
H3   Social influence will have a positive effect on system use                           .14 (4.29)***    Supported
H4   Facilitating conditions will have a positive effect on system use                    .14 (4.18)***    Supported
H5a  Hedonic performance expectancy will have a positive effect on behavioral intention   .41 (7.85)***    Supported
H5b  Hedonic performance expectancy will have a positive effect on system use             .23 (5.32)***    Supported
H6   Behavioral intention will have a positive effect on system use                       .29 (6.94)***    Supported
H7   Performance expectancy will have a positive effect on behavioral intention           .12 (2.28)*      Supported
H8   Effort expectancy will have a positive effect on behavioral intention                .06 (1.12) ns    Not supported
H9   Social influence will have a positive effect on behavioral intention                 .18 (4.17)***    Supported

Note: Values are path coefficients (t-values). ns = non-significant. * p < .05. ** p < .01. *** p < .001.
Table 6. Models tested and variance explained.

User and task aspects of use (R2 Use = .69): bPE/Use = .09 ns; bEE/Use = .18***; bSI/Use = .16***; bFC/Use = .16***; bHPE/Use = .24***; bBI/Use = .25***

User and system aspects of use (R2 Use = .68): bPE/Use = .05 ns; bEE/Use = .17***; bSI/Use = .20***; bFC/Use = .12**; bHPE/Use = .30***; bBI/Use = .31***

System and task aspects of use (R2 Use = .51): bPE/Use = .25***; bEE/Use = .18***; bSI/Use = .00 ns; bFC/Use = .09 ns; bHPE/Use = .09 ns; bBI/Use = .27***

Commonly used measures in the literature (R2 Use = .21): bPE/Use = .01 ns; bEE/Use = .15*; bSI/Use = .06 ns; bFC/Use = .10 ns; bHPE/Use = .06 ns; bBI/Use = .40***

Uni-dimensional measure of use (R2 Use = .17): bPE/Use = .01 ns; bEE/Use = .17*; bSI/Use = .11 ns; bFC/Use = .06 ns; bHPE/Use = .06 ns; bBI/Use = .37***

Note: b = path coefficient, R2 Use = explained variance in system use, ns = non-significant. * p < .05. ** p < .01. *** p < .001.
Table 7. Predictors of use and system use dimensions.

Acceptance predictor   CA        Frequency of use   Volume of use   Intensity of use   DS
PE                     .06 ns    .01 ns             .04 ns          .10 ns             .27***
EE                     .16**     .17*               .08 ns          .11+               .16**
SI                     .24***    -.11*              .02 ns          .06 ns             .02 ns
FC                     .15***    .07 ns             -.12*           .07 ns             .13*
HPE                    .31***    .07 ns             .04 ns          .06 ns             .08 ns
BI                     .24***    .36***             .31***          .39***             .19***

Note: ns = non-significant. + p < .1. * p < .05. ** p < .01. *** p < .001.
For instance, the user/task aspects of use are better predicted than all other aspect combinations, with an explained variance of 69%, a negligible change from the base model; all path coefficients are strongly significant except for performance expectancy. The user/system aspects have an explained variance of 68%. In the system/task aspects of use, social influence and facilitating conditions become non-significant, while performance expectancy has a significant relationship with system use (b = .25); the explained variance for the system/task aspects of use is 51%. Only two predictors, effort expectancy and behavioral intention, are significant with the commonly used measures of use, with an explained variance of 21%. To account for these different relationships, we used the component model to investigate which dimensions of system use are being predicted (Table 7).

6. Discussion

The significance of the findings above demonstrates that each dimension of system use is predicted by differing antecedents of use.
Fig. 5 shows the relationships between the different predictors of use and an aggregate measure of system use, while Table 7 shows the same relationships at the component level. These findings give credence to the call of Benbasat and Barki (2007) and Jasperson et al. (2005) to open up the 'black box' of system use rather than aggregating its dimensions into an overall omnibus measure. For instance, performance expectancy has a non-significant relationship with system use conceptualized as an aggregate measure (see Table 6), while Table 7 shows that performance expectancy has a significant relationship with deep structure use. This confirms that usage may be better researched at the granular level. Our results run contrary to previous research in that effort expectancy is found to have a positive relationship with system use; Table 7 shows that effort expectancy is correlated with CA, frequency, intensity and deep structure use. Our research also shows a negative significant relationship between facilitating conditions and volume of use. This could be interpreted to mean that the more facilitating conditions users have for using SNSs, the less time they spend employing such systems. Although more research is needed on this issue, it could mean that as facilities increase, users expect to spend less time accessing the information they are looking for; once they obtain the data they require, they do not need to spend much time searching for other services. This can ultimately affect users' intention to continue using SNSs and reduce CA. Social influence has a negative relationship with frequency of use, suggesting that influence from acquaintances may reduce the frequency of Facebook use; we interpret this to mean that people who are important to the students may have negative feelings about their frequency of using Facebook. Also, facilitating conditions have a significant effect on deep structure use, suggesting that an increase in available facilities would increase users' use of the different features in Facebook. Further, we know of no study other than Venkatesh et al. (2008) that tested conceptualizations of system use. Our findings contribute to the extant body of research as we conceptualize CA and deep structure use and show that behavioral intention has a significant relationship with both constructs. Our results differ from those of Venkatesh et al. (2008) in that behavioral intention has a stronger relationship with frequency of use and intensity of use than with duration of use. We further examined the relationships between the antecedents of use and the dimensions of system use in the UTAUT model. Our findings demonstrate that behavioral intention may not account for the total variance explained in system use, despite the fact that acceptance theories posit behavioral intention as the best predictor of use (Venkatesh et al., 2003). Our results confirm that antecedents of use other than behavioral intention may also have a direct effect on usage dimensions, e.g. performance expectancy with deep structure use and effort expectancy with frequency of use. However, behavioral intention remains the strongest predictor of system use. Overall, the main model explains 17% of the variance in frequency of use, 13% in volume of use, 24% in intensity of use, 64% in CA and 47% in deep structure use.
Our results show that in a post-acceptance environment, user- and task-based measures are better predicted than system-based measures such as duration and frequency of use. There is very little change in explained variance between the main model (71%), the user/task aspects of use (69%) and the user/system aspects of use (68%). System use should be conceptualized based on all three aspects if predictive validity is the goal of the research; if researchers want to better understand usage and its relationship with the predictors of use, then the user/task aspects would be a better choice. This research demonstrates that the relationships between the predictors of use and CA are highly significant at p < .01 and p < .001, suggesting that experiencing absorption in volitional systems is a key component in retaining users and fostering continued use.

7. Limitations

The findings of this research should be read in light of the following limitations. First, this study employs students as respondents in a hedonic SNS context; hence the results may not be applicable to utilitarian systems and office applications, or to the general Facebook population. Generalization of these findings should be done cautiously, as system use is by nature a construct that may differ with the context and systems being researched (Burton-Jones & Straub, 2006; DeLone & McLean, 2003). Further, we tested our model on Facebook, the most popular SNS among the many available, and our data were collected at a single point in time. This research was carried out in a developing country, and the results may not generalize to developed countries or Western contexts. Second, this research examines the direct effects in the acceptance to use nomology using the UTAUT model; moderating factors have not been accounted for. Moreover, in order to balance parsimony and completeness, the features employed with Facebook cannot be considered comprehensive, as we used only the most essential features based on our content analysis.

8. Implications for practice

Several interesting managerial implications can be derived. These findings can be useful to website administrators and IT managers of web-based services in targeting specific usage levels for their systems. For instance, if website administrators want users to spend more time on their website, they could work on features that would increase social influence (b = .18, p < .001) and hedonic performance expectancy (b = .41, p < .001), which are highly significant predictors of behavioral intention. They could do so by working on factors that improve the feeling of pleasure in using the SNS through improved interfaces, such as new interactive applications or allowing users to customize their walls. It can also be done by increasing social influence through peer pressure or by offering incentives to existing members for each newly recommended friend who joins the SNS. Efforts should be concentrated on factors that lead to CA, since it has significant positive relationships with the acceptance predictors in our model. Further, increasing enjoyment and captivating users through meaningful applications should be a priority for SNS administrators. CA should be increased by providing applications and tasks that really matter to SNS subscribers instead of a large body of applications that are rarely used; customers tend to limit themselves to only a few applications or features (Jasperson et al., 2005). Pine, Peppers, and Rogers (1995, p. 103) state that "Customers . . . do not want more choices. They want exactly what they want – when, where, and how they want it. . ." Trade-offs will have to be made based on the strategy devised by SNS administrators. Further, SNS administrators should make use of feature-based measures such as deep structure use and cognitive measures such as CA to assess the usage of their systems beyond time-based measures. Although time-based measures fare poorly in our model, with very little explained variance in system use, usage should be assessed with a multidimensional measure covering all three aspects, the user, system and task, at the granular level. SNS administrators should rely on such measures rather than unidimensional ones, as the explanatory power of one-dimensional measures is weak. Exploitive use may yield more information as to the parts of their systems that are underutilized and where interventions would have to be made. CA could also be recast as an antecedent of use in order to understand the features that lead the user to become absorbed in the task at hand.
9. Conclusions and future research

This study contributes to the extant body of research on system use in the acceptance literature. More research is required to determine whether these dimensions produce different results in other contexts, such as utilitarian systems, or in other research streams and nomologies. IS success is another area where system use could be tested, given the mixed results obtained there. In the continuance nomology, system use is not originally posited to affect continuance (Bhattacherjee, 2001); we argue for the inclusion of system use, as it is impossible for users to form an intention to continue or discontinue using a system without having used it in the first place. Moreover, system use is also posited to lead to habit (Limayem & Hirt, 2003; Limayem, Hirt, & Cheung, 2007). In our research, predictors of use are highly significant with CA. Scholars are encouraged to examine how these measures fare in their respective nomological networks and which conceptualizations of system use are appropriate in different contexts. Further research on system use may help us better understand how an IS contributes to desired outcomes and how to optimize usage (Venkatesh et al., 2008), especially in volitional web-based services such as SNSs, which are free for the public to use.

Appendix B. Questionnaire items
Acceptance predictors (Venkatesh et al., 2003).

Performance expectancy.
1. I find Facebook useful for sharing information and connecting with others.
2. Using Facebook improves my performance in sharing and connecting with others.
3. Using Facebook enhances my effectiveness in sharing and connecting with others.
4. Using Facebook allows me to share and connect with others more quickly.

Effort expectancy.
1. I find using Facebook easy.
2. It would be easy for me to become skilful at using Facebook.
3. Learning to operate Facebook is easy for me.
4. Interacting with Facebook does not require a lot of my mental effort.
Appendix A

Table A1. Loadings and cross-loadings (item rows; construct columns BI, CA, DS, EE, FC, HPE, PE, SI, Sys).
BI1 BI2 BI3 FI1 FI2 HE1 HE2 HE3 Co1 Co2 Co3 Cu1 Cu2 Cu3 DS66 DS67 DS68 DS69 DS70 DS71 DS72 DS73 DS74 EE1 EE2 EE3 EE4 FC1 FC2 FC3 HPE1 HPE2 HPE3 PE1 PE2 PE3 PE4 SI1 SI2 SI3 Frequency Volume Intensity
BI
CA
DS
EE
FC
HPE
PE
SI
Sys
.959 .965 .955 .443 .437 .591 .611 .602 .326 .337 .356 .463 .434 .498 .429 .366 .385 .411 .297 .375 .491 .431 .377 .363 .414 .331 .347 .511 .494 .388 .522 .582 .567 .365 .490 .514 .431 .437 .419 .429 .374 .338 .458
.659 .605 .611 .678 .694 .774 .822 .783 .622 .669 .677 .768 .692 .723 .506 .443 .417 .438 .383 .456 .446 .421 .346 .464 .547 .474 .462 .549 .501 .433 .612 .642 .598 .407 .489 .531 .477 .493 .546 .597 .235 .277 .347
.514 .514 .500 .313 .320 .523 .520 .518 .290 .390 .358 .448 .406 .415 .778 .754 .785 .791 .704 .772 .730 .713 .709 .451 .463 .483 .383 .439 .440 .429 .485 .496 .500 .482 .483 .559 .541 .364 .390 .378 .283 .270 .336
.433 .410 .377 .334 .317 .518 .473 .502 .292 .418 .368 .482 .393 .363 .512 .380 .378 .420 .328 .406 .319 .356 .371 .859 .849 .890 .840 .454 .540 .393 .486 .493 .478 .428 .478 .555 .516 .392 .395 .418 .261 .201 .300
.549 .545 .518 .371 .391 .475 .496 .517 .347 .342 .353 .510 .464 .414 .445 .376 .373 .427 .335 .387 .414 .349 .391 .416 .505 .516 .471 .879 .884 .732 .454 .487 .445 .417 .469 .528 .440 .380 .395 .431 .200 .155 .267
.617 .583 .546 .391 .349 .658 .645 .704 .310 .381 .386 .489 .425 .493 .455 .414 .424 .409 .340 .360 .420 .401 .382 .477 .451 .463 .425 .450 .451 .353 .928 .946 .883 .507 .593 .615 .607 .423 .435 .452 .277 .253 .357
.532 .492 .473 .323 .302 .486 .502 .535 .258 .358 .309 .441 .369 .411 .521 .487 .470 .486 .363 .409 .383 .438 .418 .494 .509 .516 .436 .445 .483 .410 .665 .623 .556 .833 .889 .923 .837 .452 .462 .462 .222 .227 .337
.452 .421 .464 .380 .357 .466 .500 .470 .406 .377 .467 .468 .360 .433 .329 .312 .292 .353 .258 .297 .294 .319 .293 .398 .398 .368 .329 .370 .398 .324 .436 .440 .431 .348 .483 .476 .415 .915 .951 .901 .125 .194 .215
.495 .485 .469 .288 .319 .388 .362 .376 .067 .156 .118 .218 .257 .310 .279 .251 .264 .270 .281 .272 .401 .266 .292 .302 .260 .278 .293 .289 .259 .114 .335 .363 .357 .262 .301 .316 .307 .234 .203 .197 .745 .754 .846

Social influence.
1. Most people who are important to me think I should use Facebook.
2. Most people who are important to me would want me to use Facebook.
3. People whose opinions I value would prefer me to use Facebook.
4. I use Facebook because a lot of my friends are already using it. (dropped)

Facilitating conditions.
1. I have the necessary resources to use Facebook.
2. I have the necessary knowledge to use Facebook.
3. I can consult my friends to help me if I have difficulty using Facebook.
4. I can consult the Facebook Help Center if I have difficulty using Facebook. (dropped)

Hedonic performance expectancy (Sweeney & Soutar, 2001; Yang, 2010).
1. I believe that using Facebook is enjoyable.
2. I have fun using Facebook.
3. The actual process of using Facebook is pleasant.

Behavioral intention (Venkatesh, Thong, & Xu, 2012).
1. I intend to use Facebook in my daily life.
2. I predict I would use Facebook in my daily life.
3. I plan to use Facebook in my daily life.

Cognitive absorption (Agarwal & Karahanna, 2000).

Temporal dissociation (dropped).
1. Time appears to go by very quickly when I am using Facebook.
2. Sometimes I lose track of time when I am using Facebook.
3. Time flies when I am using Facebook.
4. Most times when I get onto Facebook, I end up spending more time than I had planned.
Focused immersion.
1. While using Facebook, I am absorbed in what I am doing.
2. While on Facebook, I am immersed in the task I am performing.

Heightened enjoyment.
1. I have fun interacting with Facebook.
2. Using Facebook provides me with a lot of enjoyment.
3. I enjoy using Facebook.

Curiosity.
1. Using Facebook excites my curiosity.
2. Interacting with Facebook makes me curious.
3. Using Facebook increases my imagination.

Control.
1. I feel that I have control over my interaction with Facebook.
2. I feel that I control sharing and connecting with my friends on Facebook.
3. Facebook allows me to control my computer interaction.

System aspect of use (Lallmahomed et al., 2011).

Frequency of use. On average, how frequently do you use Facebook?
Volume of use. On an average day, how much time do you spend using Facebook?
Intensity of use. How would you rate your intensity of use of Facebook? (Very light 1 2 3 4 5 6 7 Very heavy)
Variety of use. How many different Facebook applications do you regularly use? Please tick all that apply: wall, notes, events, chat, payment, games, discussions, messages, groups, photos, fan page, news feed, subscribe, timeline.

Deep structure use. When I use Facebook, I use features that allow me to:
1. Express and present myself and my experiences on Facebook. (dropped)
2. See what's new with my friends on Facebook.
3. Send instant messages to my friends on Facebook.
4. Send private messages to my friends on Facebook.
5. Keep in touch with important groups in my life on Facebook.
6. Organize gatherings on Facebook.
7. Find people and content on Facebook.
8. Post external content (e.g. YouTube) on Facebook.
9. Give positive feedback on my friends' posts and experiences.
10. Follow people, businesses and events I am interested in.
References

Adams, D. A., Nelson, R. R., & Todd, P. A. (1992). Perceived usefulness, ease of use, and usage of information technology: A replication. MIS Quarterly, 16(2), 227–247.
Agarwal, R., & Karahanna, E. (2000). Time flies when you’re having fun: Cognitive absorption and beliefs about information technology usage. MIS Quarterly, 24(4), 665–694.
Agarwal, R., & Prasad, J. (1998). The antecedents and consequents of user perceptions in information technology adoption. Decision Support Systems, 22(1), 15–29.
Anandarajan, M., Igbaria, M., & Anakwe, U. P. (2002). IT acceptance in a less-developed country: A motivational factor perspective. International Journal of Information Management, 22(1), 47–65.
Anderson, J. E., Schwager, P. H., & Kerns, R. L. (2006). The drivers for acceptance of Tablet PCs by faculty in a college of business. Journal of Information Systems Education, 17(4), 429–440.
Andreev, P., Heart, T., Maoz, H., & Pliskin, N. (2009). Validating formative Partial Least Squares (PLS) models: Methodological review and empirical illustration. ICIS 2009 Proceedings, 193.
Barki, H., Titah, R., & Boffo, C. (2007). Information system use-related activity: An expanded behavioral conceptualization of individual-level information system use. Information Systems Research, 18(2), 173–192.
Benbasat, I., & Barki, H. (2007). Quo Vadis TAM? Journal of the AIS, 8(3), 211–218.
Benbasat, I., & Zmud, R. W. (2003). The identity crisis within the IS discipline: Defining and communicating the discipline’s core properties. MIS Quarterly, 27(2), 183–194.
Bhattacherjee, A. (2001). Understanding information systems continuance: An expectation-confirmation model. MIS Quarterly, 25(3), 351–370.
Burton-Jones, A., & Hubona, G. S. (2006). The mediation of external variables in the technology acceptance model. Information & Management, 43, 706–717.
Burton-Jones, A., & Straub, D. (2006). Reconceptualizing system usage: An approach and empirical test. Information Systems Research, 17(3), 228–246.
Celik, H. (2011). Influence of social norms, perceived playfulness and online shopping anxiety on customers’ adoption of online retail shopping: An empirical study in the Turkish context. International Journal of Retail & Distribution Management, 39(6), 390–413.
Chakraborty, I., Hu, P. J. H., & Cui, D. (2008). Examining the effects of cognitive style in individuals’ technology use decision making. Decision Support Systems, 45(2), 228–241.
Chang, I., Hwang, H. G., Hung, W. F., & Li, Y. C. (2007). Physicians’ acceptance of pharmacokinetics-based clinical decision support systems. Expert Systems with Applications, 33(2), 296–303.
Chang, Y. P., & Zhu, D. H. (2011). Understanding social networking sites adoption in China: A comparison of pre-adoption and post-adoption. Computers in Human Behavior, 27(5), 1840–1848.
Chin, W. W. (1998). The partial least squares approach for structural equation modeling. In George A. Marcoulides (Ed.), Modern methods for business research (pp. 295–336). Hillsdale, NJ: Lawrence Erlbaum Associates.
Chmielewski, D. C., & Guyn, J. (2011). Myspace layoffs are part of broad restructuring.
Cusumano, M. A. (2011). Platform wars come to Social Media. Communications of the ACM, 54(4), 31–33.
Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1989). User acceptance of computer technology: A comparison of two theoretical models. Management Science, 35(8), 982–1003.
Davis, F. D. (1993). User acceptance of information technology: System characteristics, user perceptions and behavioral impacts. International Journal of Man-Machine Studies, 38(3), 475–487.
Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1992). Extrinsic and intrinsic motivation to use computers in the workplace. Journal of Applied Social Psychology, 22(14), 1111–1132.
DeLone, W., & McLean, E. (1992). Information systems success: The quest for the dependent variable. Information Systems Research, 3(1), 60–95.
DeLone, W., & McLean, E. (2003). The DeLone and McLean model of information system success: A ten-year update. Journal of Management Information Systems, 19(4), 9–30.
Diamantopoulos, A., Riefler, P., & Roth, K. P. (2008). Advancing formative measurement models. Journal of Business Research, 61(12), 1203–1218.
Diana, A. (2011). MySpace planning massive layoffs.
Doll, W. J., & Torkzadeh, G. (1988). The measurement of end-user computing satisfaction. MIS Quarterly, 12(2), 259–274.
Facebook.com (2012). Most popular features.
Fornell, C., & Larcker, D. (1981). Structural equation models with unobservable variables and measurement error. Journal of Marketing Research, 18(1), 39–50.
Friedman, J. (2010). Twitter, Facebook soar as Myspace sags in US market share.
Gallivan, M. J., Spitler, V. K., & Koufaris, M. (2003). Does information technology training really matter? A social information processing analysis of coworkers’ influence on IT usage in the workplace. Journal of Management Information Systems, 22(1), 153–192.
Gaudin, S. (2009). Twitter, Facebook keep soaring as Myspace slumps.
Gefen, D., Straub, D. W., & Boudreau, M. C. (2000). Structural equation modeling techniques and regression: Guidelines for research practice. Communications of the Association for Information Systems, 4(7), 1–79.
Grabner-Kraeuter, S., & Waiguny, M. (2011). Why users stay in online social networks: Perceptions of value, trust, subjective norm, and the moderating influence of duration of use. In AMA Winter Educators’ Conference Proceedings, American Marketing Association (pp. 204–249). Austin, Texas, USA.
Gu, J. C., Fan, L., Suh, Y. H., & Lee, S. C. (2010). Comparing utilitarian and hedonic usefulness to user intention in multipurpose information systems. CyberPsychology, Behavior, and Social Networking, 13(3), 287–297.
Hennington, A., Janz, B., Amis, J., & Nichols, E. (2009). Information systems and healthcare XXXII: Understanding the multidimensionality of information systems use: A study of nurses’ use of a mandated electronic medical record system. Communications of the Association for Information Systems, 25(1), 25.
Henseler, J., Ringle, C. M., & Sinkovics, R. R. (2009). The use of partial least squares path modeling in international marketing. Advances in International Marketing, 20, 277–319.
Im, I., Hong, S., & Kang, M. S. (2011). An international comparison of technology adoption: Testing the UTAUT model. Information & Management, 48, 1–8.
Jasperson, J., Carter, P. E., & Zmud, R. W. (2005). A comprehensive conceptualization of post-adoptive behaviors associated with information technology enabled work systems. MIS Quarterly, 29(3), 525–557.
Jia, R., Hartke, H., & Pearson, J. (2007). Can computer playfulness and cognitive absorption lead to problematic technology usage? In ICIS 2007 Proceedings, AIS Electronic Library (AISeL), Twenty-Eighth International Conference on Information Systems (pp. 1–15). Montreal, Canada.
Kijsanayotin, B., Pannarunothai, S., & Speedie, S. M. (2009). Factors influencing health information technology adoption in Thailand’s community health centers: Applying the UTAUT model. International Journal of Medical Informatics, 78(6), 404–416.
Kim, B. G., Park, S. C., & Lee, K. J. (2007). A structural equation modeling of the Internet acceptance in Korea. Electronic Commerce Research and Applications, 6(4), 425–432.
Kim, B., & Oh, J. (2011). The difference of determinants of acceptance and continuance of mobile data services: A value perspective. Expert Systems with Applications, 38(3), 1798–1804.
Krasnova, H., Kolesnikova, E., & Günther, O. (2011). One size fits all? Managing trust and privacy on social networking sites in Russia and Germany. In ECIS 2011 Proceedings.
Kwon, O., & Wen, Y. (2010). An empirical study of the factors affecting social network service use. Computers in Human Behavior, 26(2), 254–263.
Lallmahomed, M. Z. I., Ab.Rahim, N. Z., Ibrahim, R., & Rahman, A. A. (2011). A preliminary classification of usage measures in information system acceptance: A Q-Sort approach. International Journal of Technology Diffusion, 2(4), 25–47.
Lankton, N. K., McKnight, D. H., & Thatcher, J. B. (2011). The moderating effects of privacy restrictiveness and experience on trusting beliefs and habit: An empirical test of intention to continue using a social networking website. IEEE Transactions on Engineering Management, 1–12.
Lawshe, C. H. (1975). A quantitative approach to content validity. Personnel Psychology, 28, 563–575.
Lee, S., & Kim, B. G. (2009). Factors affecting the usage of intranet: A confirmatory study. Computers in Human Behavior, 25(1), 191–201.
Lee, S. M., & Chen, L. (2010). The impact of flow on online consumer behaviour. Journal of Computer Information Systems, 1–10.
Lewis, B. R., Snyder, C. A., & Rainer, R. K., Jr. (1995). An empirical assessment of the information resource management construct. Journal of Management Information Systems, 199–223.
Li, D. C. (2011). Online social network acceptance: A social perspective. Internet Research, 21(5), 562–580.
Limayem, M., & Hirt, S. G. (2003). Force of habit and information systems usage: Theory and initial validation. Journal of the Association for Information Systems, 4, 65–97.
Limayem, M., Hirt, S. G., & Cheung, C. M. K. (2007). How habit limits the predictive power of intention: The case of information systems continuance. MIS Quarterly, 31(4), 705–737.
Lin, H. F. (2009). Examination of cognitive absorption influencing the intention to use a virtual community. Behaviour & Information Technology, 28(5), 421–431.
Lin, C. P., & Anol, B. (2008). Learning online social support: An investigation of network information technology based on UTAUT. CyberPsychology & Behavior, 11(3), 268–272.
Lin, K. Y., & Lu, H. P. (2011). Why people use social networking sites: An empirical study integrating network externalities and motivations. Computers in Human Behavior, 27(3), 1152–1161.
Lucas, H. C. (1973). A descriptive model of information systems in the context of the organization. Data Base, 5, 27–39.
Magni, M., Susan Taylor, M., & Venkatesh, V. (2010). ‘To play or not to play’: A cross-temporal investigation using hedonic and instrumental perspectives to explain user intentions to explore a technology. International Journal of Human–Computer Studies, 68(9), 572–588.
Maldonado, U. P. T., Khan, G. F., Moon, J., & Rho, J. J. (2011). E-learning motivation and educational portal acceptance in developing countries. Online Information Review, 35(1), 66–85.
Moon, J. W., & Kim, Y. G. (2001). Extending the TAM for a World-Wide-Web context. Information & Management, 38, 217–230.
Nunnally, J. C. (1967). Psychometric theory. New York, NY: McGraw-Hill.
Nunnally, J. C., & Bernstein, I. H. (1994). Psychometric theory (3rd ed.). New York: McGraw-Hill.
Pai, J. C., & Tu, F. M. (2011). The acceptance and use of customer relationship management (CRM) systems: An empirical study of distribution service industry in Taiwan. Expert Systems with Applications, 38(1), 579–584.
Perez, J. C. (2011). User satisfaction study: Facebook vulnerable to Google+.
Pine, B. J., Peppers, D., & Rogers, M. (1995). Do you want to keep your customers forever? Harvard Business Review, 103–114.
Pynoo, B., Devolder, P., Tondeur, J., Van Braak, J., Duyck, W., & Duyck, P. (2011). Predicting secondary school teachers’ acceptance and use of a digital learning environment: A cross-sectional study. Computers in Human Behavior, 27(1), 568–575.
Ramayah, T., Rouibah, K., Gopi, M., & Rangel, G. J. (2009). A decomposed theory of reasoned action to explain intention to use Internet stock trading among Malaysian investors. Computers in Human Behavior, 25(6), 1222–1230.
Ringle, C. M., Wende, S., & Will, A. (2005). SmartPLS 2.0 (beta).
Rosen, P. A., & Kluemper, D. H. (2008). The impact of the big five personality traits on the acceptance of social networking website. In Proceedings of AMCIS 2008.
Sapio, B., Turk, T., Cornacchia, M., Papa, F., Nicolò, E., & Livi, S. (2010). Building scenarios of digital television adoption: A pilot study. Technology Analysis & Strategic Management, 22(1), 43–63.
Sledgianowski, D., & Kulviwat, S. (2009). Using social networking sites: The effect of playfulness, critical mass and trust in hedonic context. Journal of Computer Information Systems, 74–83.
Straub, D. W. (1989). Validating instruments in MIS research. MIS Quarterly, 13(2), 147–169.
Straub, D., Limayem, M., & Karahanna, E. (1995). Measuring system usage: Implications for IS theory testing. Management Science, 41(8), 1328–1339.
Straub, D., Boudreau, M. C., & Gefen, D. (2004). Validation guidelines for IS positivist research. Communications of the Association for Information Systems, 13(1), 380–427.
Šumak, B., Polančič, G., & Heričko, M. (2010). An empirical study of virtual learning environment adoption using UTAUT. In 2010 Second international conference on mobile, hybrid, and on-line learning (pp. 17–22). IEEE.
Sun, J., & Teng, J. T. (2012). Information systems use: Construct conceptualization and scale development. Computers in Human Behavior, 28(5), 1564–1574.
Sweeney, J. C., & Soutar, G. N. (2001). Consumer perceived value: The development of a multiple item scale. Journal of Retailing, 77(2), 203–220.
Urbach, N., & Ahlemann, F. (2010). Structural equation modeling in information systems research using partial least squares. Journal of Information Technology Theory and Application (JITTA), 11(2), 5–40.
Van der Heijden, H. (2004). User acceptance of hedonic information systems. MIS Quarterly, 28(4), 695–704.
Van Raaij, E. M., & Schepers, J. J. (2008). The acceptance and use of a virtual learning environment in China. Computers & Education, 50(3), 838–852.
Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27(3), 425–478.
Venkatesh, V., Brown, S. A., Maruping, L. M., & Bala, H. (2008). Predicting different conceptualizations of system use: The competing roles of behavioral intention, facilitating conditions, and behavioral expectation. MIS Quarterly, 32(3), 483–502.
Venkatesh, V., & Bala, H. (2008). Technology acceptance model 3 and a research agenda on interventions. Decision Sciences, 39(2), 273–315.
Venkatesh, V., Thong, J. Y. L., & Xu, X. (2012). Consumer acceptance and use of information technology: Extending the unified theory of acceptance and use of technology. MIS Quarterly, 36(1), 157–178.
Wang, Y. S., & Shih, Y. W. (2009). Why do people use information kiosks? A validation of the unified theory of acceptance and use of technology. Government Information Quarterly, 26(1), 158–165.
Webster, J., & Martocchio, J. (1992). Microcomputer playfulness: Development of a measure with workplace implications. MIS Quarterly, 16(1), 201–226.
Wu, I. L., & Wu, K. W. (2005). A hybrid technology acceptance approach for exploring e-CRM adoption in organizations. Behaviour & Information Technology, 24(4), 303–316.
Yang, K. (2010). Determinants of US consumer mobile shopping services adoption: Implications for designing mobile shopping services. Journal of Consumer Marketing, 27(3), 262–270.
Zhang, P., Li, N., & Sun, H. (2006). Affective quality and cognitive absorption: Extending technology acceptance research. In Proceedings of the Hawaii International Conference on System Sciences.
Zhou, T., Lu, Y., & Wang, B. (2010). Integrating TTF and UTAUT to explain mobile banking user adoption. Computers in Human Behavior, 26(4), 760–767.