The Incremental Contribution of Complex Problem-Solving Skills to the Prediction of Job Level, Job Complexity, and Salary Jakob Mainert, Christoph Niepel, Kevin R. Murphy & Samuel Greiff
Journal of Business and Psychology ISSN 0889-3268 J Bus Psychol DOI 10.1007/s10869-018-9561-x
ORIGINAL PAPER
The Incremental Contribution of Complex Problem-Solving Skills to the Prediction of Job Level, Job Complexity, and Salary

Jakob Mainert 1 · Christoph Niepel 1 · Kevin R. Murphy 2 · Samuel Greiff 1
© Springer Science+Business Media, LLC, part of Springer Nature 2018
Abstract

As work life becomes increasingly complex, higher order thinking skills, such as complex problem-solving skills (CPS), are becoming critical for occupational success. It has been shown that individuals gravitate toward jobs and occupations that are commensurate with their level of general mental ability (GMA). On the basis of the theory of occupational gravitation, CPS theory, and previous empirical findings on the role of CPS in educational contexts, we examined whether CPS would make an incremental contribution to occupational success after controlling for GMA and education. Administering computerized tests and self-reports in a multinational sample of 671 employees and analyzing the data with structural equation modeling, we found that CPS incrementally explained 7% and 3% of the variance in job complexity and salary, respectively, beyond both GMA and education. We found that CPS offered no incremental contribution to the prediction of job level. CPS appears to be linked to job complexity and salary in a range of occupations, and this link cannot be explained as an artifact of GMA and education. Thus, CPS incrementally predicts success, potentially contributes to the theory of job gravitation, and adds to the understanding of complex cognition in the workplace.

Keywords: Complex problem-solving · General mental ability · Occupational gravitation · Job complexity
Correspondence: Jakob Mainert, [email protected]
Christoph Niepel, [email protected]
Kevin R. Murphy, [email protected]
Samuel Greiff, [email protected]

1 University of Luxembourg, Institute of Cognitive Science and Assessment, Esch-sur-Alzette, Luxembourg
2 University of Limerick, Limerick, Ireland

There is extensive evidence that individuals gravitate toward jobs and occupations that are commensurate with their general cognitive ability (also referred to as general mental ability; GMA; Desmarais & Sackett, 1993; Gottfredson, 1986; Jensen, 1980; Sorjonen, Hemmingsson, Deary, & Melin, 2015; Wilk, Desmarais, & Sackett, 1995). For example,
studies of intellectually gifted youth have provided compelling evidence for this sorting process. In particular, data on the later careers of children with very high scores on measures of general and specific cognitive abilities have shown that these individuals are vastly over-represented in cognitively demanding careers (e.g., academics, learned professions, business leadership; Kell, Lubinski, & Benbow, 2013; Lubinski, Benbow, & Kell, 2014; Makel, Kell, Lubinski, Putallaz, & Benbow, 2016). These studies have also offered evidence of ability tilt, the idea that gifted students not only sort themselves into cognitively demanding occupations but they also tend to be found in occupations that fit their unique strengths and weaknesses (e.g., students with quantitative abilities that exceed their verbal abilities tend to gravitate toward quantitatively oriented careers). Studies in employment settings have provided similar evidence of occupational sorting. Jobs and occupations can be reliably scaled at the level of incumbents’ GMA, as a result of their tendency to sort themselves (and be sorted by others) into jobs with cognitive demands that are in line with their ability levels (Gottfredson, 1986; Wonderlic, 2002). As a result, individuals with higher levels of cognitive ability are more likely than individuals with lower ability levels to fill and succeed
in jobs that involve more demanding, intense, or frequent information processing (McCormick, DeNisi, & Shaw, 1979; McCormick, Jeanneret, & Mecham, 1972; Murphy, 1989; Wilk et al., 1995). The mental demands of jobs are consistently correlated with the general cognitive ability levels of job incumbents (Jensen, 1980; Murphy, 1989), and successfully holding and ascending in cognitively demanding jobs is associated with higher cognitive ability levels (Judge, Klinger, & Simon, 2010), even after retirement (Smart, Gow, & Deary, 2014).
The Gravitation Hypothesis

According to Wilk et al. (1995), the "…gravitation hypothesis posits that individuals, over the course of their labor market experiences, will sort themselves into jobs that are compatible with their interests, values and abilities…" (p. 79). For example, jobs can be classified according to their cognitive requirements, and the gravitation hypothesis suggests that "…a poor match between an individual's abilities and the complexity of his or her job may propel the individual to seek out, or gravitate toward, a better match" (p. 79). Tests of gravitation hypotheses have applied a range of methods to evaluate the hypothesis that individuals become sorted into jobs that are consistent with their cognitive ability levels (Wilk et al., 1995; Wilk & Sackett, 1996) and to examine the role of cognitive ability in shaping income or occupational trajectories over time (Sorjonen et al., 2015). The original studies of the gravitation hypothesis were almost exclusively cross-sectional (e.g., McCormick et al., 1972, 1979), but after gravitation in terms of mental ability had been established, longitudinal studies began to emerge (e.g., Wilk & Sackett, 1996). That is, early studies in this area asked the question of whether individuals cluster in jobs that are consistent with their ability levels, and only after this clustering was established did this literature take up the question of how, over time, individuals sort themselves or are sorted into jobs commensurate with their cognitive ability levels. We follow a similar strategy here by first asking whether people are clustered into jobs on the basis of their complex problem-solving skills, independent of their levels of education or GMA. Evidence of this type of clustering may provide an impetus for future longitudinal studies to examine how such clustering unfolds over time. To date, most studies on occupational gravitation have focused on the role of GMA rather than specific abilities and skills.
In many ways, this is understandable; the effects of general cognitive ability on job performance and job success are so strong and consistent that some researchers (e.g., Ree, Carretta, & Teachout, 2015; Ree, Earles, & Teachout, 1994) have argued that specific abilities offer very little incremental information as predictors of these criteria. Whereas g-centric approaches to understanding cognitive ability have
considerable practical value (Murphy, 1996; Ree et al., 2015), they have been criticized. Exclusively g-centric approaches have the potential to blind researchers and practitioners to the multidimensional nature of cognitive ability (Reeve, Scherbaum, & Goldstein, 2015; Schneider & Newman, 2015) and to limit the understanding of how people translate their abilities into effective performance on the job. Several recent papers (e.g., Judge & Kammeyer-Mueller, 2012; Lievens & Reeve, 2012; Reeve et al., 2015; Schneider & Newman, 2015) have argued that there are advantages to considering measures of specific constructs as predictors and as explanatory mechanisms, even in contexts where more general constructs account for a substantial amount of variance in important criteria or where the correlations between different abilities (e.g., GMA vs. Gfluid) are high. One promising candidate for a specific construct with predictive and explanatory value is complex problem-solving (CPS) skills. As we note below, CPS skills have been identified by the National Research Council and the Organization for Economic Cooperation and Development (OECD) as uniquely and critically important skills in the twenty-first century workplace. In this study, we consider the hypothesis that individuals can be sorted into occupations on the basis of their levels of skill in solving complex problems. Throughout this paper, we will use the term occupational sorting to refer to the process by which individuals are sorted into occupations. 
We are not using sorting to refer simply to preferences for different occupations; a number of factors including preferences, opportunities, skills, values, and social norms are likely to contribute to the process by which individuals are sorted into jobs and occupations, but we assume that individuals have at least some agency in determining what jobs and occupations they enter or remain in and that their choices represent an important part of explaining the occupations they pursue. In the "Discussion" section of this paper, we return to the question of how the different factors by which individuals are sorted into occupations might play out over time. Research on occupational gravitation has presented convincing evidence that people are sorted into occupations on the basis of GMA; here, we ask whether a comparable sorting process exists at a more specific level, that is, at the level of CPS. As Reeve et al. (2015) and Gustafsson (2002) noted, in testing hypotheses about the role of specific abilities, it is important to control for GMA. In this study, we therefore ask whether individuals' CPS skill levels predict the types of occupations they hold and the levels of success they achieve in these occupations, independent of their GMA levels.
The Nature of CPS Skills

CPS is defined as "…the successful interaction with task environments that are dynamic (i.e., change as a function of
user's intervention and/or as a function of time) and in which some, if not all, of the environment's regularities can only be revealed by successful exploration and integration of the information gained in that process" (Buchner, cited in Frensch & Funke, 1995, p. 14). CPS involves both the acquisition and the application of new knowledge in situations that must be actively explored to find and apply a solution (Greiff, Holt, & Funke, 2013b; Funke, 2010; Gonzalez, Vanyukov, & Martin, 2005; Osman, 2010). CPS has been identified by leading scientific and policy agencies as critical in the workplace of the twenty-first century (National Research Council, 2012; OECD, 2013a, b). Given an increase in the number of jobs that require employees to solve complex problems in real time and a corresponding decrease in the number of jobs that involve executing well-defined organizational practices and routines, CPS is likely to grow in importance (Autor, Levy, & Murnane, 2003; Goos, Manning, & Salomons, 2009; Middleton, 2002; Neubert, Mainert, Kretzschmar, & Greiff, 2015). Although not formally a part of the Cattell-Horn-Carroll (CHC) model of cognitive ability (Carroll, 1993; McGrew, 2005, 2009), CPS skills clearly draw upon second-stratum abilities in this hierarchical model, notably fluid reasoning (Gf), which McGrew (2009) defined as "the use of deliberate and controlled mental operations to solve novel problems that cannot be performed automatically…" (p. 5).

Computerized Tests of CPS Skills

The computerized measures of CPS that we introduce in the "Method" section have been successfully included in what is arguably the most important large-scale assessment worldwide, the OECD's Programme for International Student Assessment (PISA). Starting in 2012, PISA administered computerized tests of CPS along with the traditional PISA domains of mathematics, science, and reading to tens of thousands of 15-year-old students in 44 countries and economies worldwide (OECD, 2012, 2014).
The CPS measures included in PISA 2012 are strongly correlated with performance in mathematics (r = .81), science (r = .78), and reading (r = .75; OECD, 2014), and the OECD (2013a, b) has emphasized the importance of CPS skills for lifelong learning and for meeting shifting workplace demands in the job market. The results of the CPS tests in PISA have been instrumental in creating higher education policies that target patterns of strengths and weaknesses in problem-solving. For instance, 8% of 15-year-olds in OECD countries were allocated to the lowest of seven CPS levels (i.e., they were able to solve only straightforward problems by applying a simple trial-and-error strategy). The majority (approximately 57%) reached level 3, demonstrating the ability to plan ahead, monitor progress, and test options. Only 2.5% reached the highest level and were able to modify their strategies, take into account various constraints, and execute flexible plans in multiple steps (OECD, 2014). The PISA results help national
educational systems aim resources where they can be used to help close gaps between the skills students exhibit and the skills they will need to succeed in the workplace.

Is CPS Distinct from GMA?

CPS skills involve cognitive processes that are linked to general cognitive abilities, such as fluid reasoning (McGrew, 2009) and working memory capacity (e.g., Ackerman, Beier, & Boyle, 2005), and measures of CPS are often strongly correlated with measures of general cognitive ability. For example, a recent meta-analysis (Stadler, Becker, Gödker, Leutner, & Greiff, 2015) suggested that the most widely used measures of CPS are highly correlated with measures of GMA (uncorrected mean r = .585, 95% CI [.510, .652]), with even higher correlations with measures of fluid reasoning. Nevertheless, several studies have concluded that CPS is both theoretically and empirically distinct from these more general cognitive abilities (e.g., Greiff, Wüstenberg, & Funke, 2012; Sonnleitner, Keller, Martin, & Brunner, 2013). There are two important reasons to believe that CPS skills represent a distinct set of constructs that is meaningfully different from GMA and from second-stratum abilities such as fluid reasoning. First, unlike general and second-stratum cognitive abilities, skills such as CPS can be modified through modest amounts of practice and training (Weinert, 2001). There is considerable evidence that GMA tends to be highly consistent over time and relatively resistant to change through the age ranges where individuals are most likely to be employed (Hertzog & Schaie, 1988; Jensen, 1980). For example, Rönnlund, Sundström, and Nilsson (2015) showed that individual differences in general cognitive ability assessed at age 18 are virtually unchanged in subsequent assessments at age 50. To be sure, this does not mean that cognitive ability is a fixed quantity.
For example, there is evidence that some multiyear intervention programs (especially well-structured early childhood interventions) can lead to improvements of up to half a standard deviation in scores on GMA tests (Barnett, 1995; see, however, Haier, 2014, who raised questions about the meaning of changes in these scores, given the crudity of the scales typically used to assess mental ability). However, attempts to substantially change levels of GMA with relatively short-term interventions have not been very successful (Jensen, 1980; Sala & Gobet, 2017). By contrast, CPS skills can be developed over relatively short periods of time with appropriate instruction, practice, and feedback (Bakken, 1993; Jensen, 2005; Kretzschmar & Süß, 2015; Tomic, 1995; Weinert, 2001). There is also evidence that CPS skills improve through workplace experience (Zaccaro et al., 2015), in particular through experience in solving complex problems in the workplace (Ohlott, 2004). Furthermore, measures of CPS are consistently correlated with assessments of experience with complex tasks in the
workplace and in training (r = .40, p < .01; Zaccaro et al., 2015). Second, several studies have demonstrated the discriminant validity of CPS measures used in this paper and in PISA 2012. Specifically, these studies have shown that there are aspects of CPS that are completely independent of GMA and that these independent CPS aspects predict performance and success in educational settings (Greiff, Fischer, Sonnleitner, Brunner, & Martin, 2013a; Sonnleitner et al., 2013; Schweizer, Wüstenberg, & Greiff, 2013; Stadler, Becker, Greiff, & Spinath, 2015; Wüstenberg, Greiff, & Funke, 2012). A few recent studies have suggested that the same measures of CPS may also have incremental value in predicting job performance and occupational success. For example, Ederer, Nedelkoska, Patt, and Castellazzi (2015) estimated the market value of CPS in an econometric human capital model and found that CPS was a significant predictor in Mincer-style wage regressions when added to a model that included GMA and work experience.1 In their study, measures of CPS contributed significantly to the prediction of salary level, even when GMA and work experience were also considered. Danner, Hagemann, Schankin, Hager, and Funke (2011) showed that measures of dynamic decision-making (which are similar to widely used CPS measures) predicted both self-reports of occupational success and supervisory assessments of performance, even after GMA was held constant. Finally, CPS can be differentiated from more general abilities, such as fluid reasoning, by its two-step process of knowledge acquisition and knowledge application. As Funke (2001, 2010) noted, CPS begins with the acquisition of information in a context where it might not be clear precisely what information is required. 
CPS requires not only the acquisition of information in a complex and unstructured environment but also the application of that knowledge to solve concrete problems (Gonzalez et al., 2005; Greiff, Holt, & Funke, 2013b; Funke, 2010; Osman, 2010). The CPS measures used in this paper reflect this two-step acquisition-application process.
Footnote 1: Ederer et al.'s (2015) study served as a starting point for the current study and used a subsample (n = 399) with less than 50% overlap with the sample used here. More important, the current study includes two dependent variables, occupational level and job complexity, not examined by Ederer et al. (2015) and frames the analysis in terms of a multivariate model with multiple dependent variables, in contrast to the univariate wage regression model in Ederer et al. (2015).

Do CPS Skills Play a Distinct Role in Occupational Sorting and Success?

One way to demonstrate that CPS is different from GMA is to show that CPS explains incremental variance in important criteria (e.g., occupational sorting). There are several reasons to believe that CPS skills play a role in determining the types of occupations individuals are drawn to, remain in, and
succeed in and that this role is distinct and incremental. These reasons can be grouped under three headings: (a) changing jobs, (b) evidence for related skills in the workplace, and (c) a distinct and incremental role of CPS in other settings.

Changing Jobs

Jobs in the modern economy are becoming increasingly complex (Hoffman, 2016; Scherbaum, Goldstein, Yusko, Ryan, & Hanges, 2012), and this complexity involves increased demands for cognitive skills that are related to but not fully subsumed under general cognitive ability. Both the National Research Council (2012) and the OECD (2013a, b) have identified CPS skills as critical for success in the tasks workers need to perform when carrying out their duties. As continuous technological and organizational changes create an increasingly dynamic, nonroutine, and interactive workplace, jobs are increasingly likely to demand higher order thinking skills to plan, actively explore, execute, and monitor associated tasks (Autor et al., 2003; Becker et al., 2013; Cascio, 1995; Goos et al., 2009; Middleton, 2002). Jobs are changing in terms of the equipment and procedures they require (see Autor et al., 2003; Cascio, 1995) as well as the organizational structures and teams that support these jobs (see also Middleton, 2002). At the same time, there is an increase in the offshoring of routine tasks (e.g., Autor, Katz, & Kearney, 2006; Goos et al., 2009), leading to a growing level of complexity in the jobs held by many employees in developed economies. As jobs become more complex, employees will need higher order thinking skills including CPS, creativity, and information and communication technology literacy to perform well on the job (Binkley et al., 2012).
For a vast number of tasks found in most jobs, GMA is critically important (Funke, 2010; Raven, 2000; Wüstenberg et al., 2012), but particularly complex tasks are likely to require higher order thinking skills (National Research Council, 2012) that are neither fully subsumed under GMA (Funke, 2010) nor provided by higher education (Rocha, 2012).

The Relevance of Related Skills in the Workplace

There is evidence that skills that appear to be related to modern conceptions of CPS are related to performance and success in a range of occupations. For example, assessment centers and high-fidelity simulations often include exercises that require participants to solve complex and challenging problems (Lievens & Patterson, 2011; Thornton III & Rupp, 2006). Indeed, the precursors of modern assessment centers, developed to assess candidates for the Office of Strategic Services (OSS) during World War II, often required candidates to work in groups to solve complex and difficult problems. Sometimes noncooperative confederates or other impediments were even inserted to prevent candidates from achieving solutions to these problems (Wiggins, 1973).
Second, there is evidence that CPS skills play an important role in leadership success and continuance (Zaccaro et al., 2015). Leaders provide structure and support for their subordinates, and it is possible that CPS skills help them to acquire and apply the knowledge that individuals and teams need to solve complex problems at work.

Potential Incremental Contributions of CPS

Several studies have shown that CPS skills make an incremental contribution, beyond general cognitive ability, in predicting school achievement (e.g., Greiff et al., 2013a; Schweizer et al., 2013) and success at a university (Stadler, Becker, Gödker, et al., 2015). There is some evidence that CPS skills make an incremental contribution to predicting workplace performance (e.g., Danner et al., 2011; Ederer et al., 2015; Neubert et al., 2015), but workplace studies are presently too few in number and incomplete in terms of what criteria they cover to allow researchers to draw firm conclusions. The present study is designed to help fill the gap in research on the role of CPS in the workplace.
Is There Room for an Incremental Contribution of CPS in the Workplace?

Across many decades, industrial-organizational (I-O) psychology has not unequivocally shown that specific cognitive skills and abilities have incremental value, above and beyond GMA, in predicting success on the job. Given the extensive body of evidence that general cognitive ability predicts performance across most jobs (for an overview, see Ng, Eby, Sorensen, & Feldman, 2005; Salgado et al., 2003; Schmidt & Hunter, 1998, 2004), some I-O researchers have suggested that not much more than GMA is needed to predict success on the job (e.g., Ree et al., 2015; Ree et al., 1994; Schmidt & Hunter, 1998, 2004). Existing research on occupational gravitation has largely ignored specific cognitive skills, but it is very possible that similar outcomes will be observed in this research area. That is, the incremental contribution of CPS in predicting measures of occupational sorting and success could be similarly limited, especially given the conceptual and empirical overlap between CPS and GMA (e.g., Funke, 2010; Wüstenberg et al., 2012). Given that both the National Research Council (2012) and the OECD (2013b) currently recommend that policymakers and decision makers invest substantially in CPS skills and their development, it is important to ask whether this skill set actually makes a distinct contribution to occupational sorting and success. Because CPS skills are so strongly linked to GMA, it is possible that the National Research Council and the OECD are making ill-founded recommendations and that it is GMA and not CPS skills that are required to succeed in the changing world of work. In this study, we test the hypothesis that CPS makes a distinct and incremental contribution, over and above
GMA, in predicting occupational sorting and success. Support for this hypothesis would bolster the frequent recommendation that this specific set of skills should be developed, regardless of a person's GMA. In addition to GMA, education is a well-established empirical predictor of occupational sorting and success (Converse, Piccone, & Tocci, 2014; Ng et al., 2005; Sorjonen et al., 2015), and education is in turn strongly predicted by GMA (e.g., Gottfredson, 2002). More educated individuals fill more managerial positions (Tharenou, Latimer, & Conroy, 1994), they receive more promotions (Sheridan, Slocum, & Buda, 1997), and their salaries rise faster (Bretz & Judge, 1994). Correspondingly, reaching higher educational levels is associated with greater success on the job (e.g., Converse et al., 2014; Ng et al., 2005). Education can both provide knowledge and build the career-relevant skills and strategies needed to solve complex problems on the job (Molnár, Greiff, & Csapó, 2013), and it is possible that CPS measures could simply serve as proxies for educational level. Given that CPS improves with years of education (Molnár et al., 2013) and that education itself is related to occupational sorting and success (e.g., Converse et al., 2014) and is widely used as a cost-efficient proxy for career-relevant skills and abilities (Ng & Feldman, 2010), it is important to demonstrate that CPS makes an incremental contribution to occupational sorting and success over and above not only GMA but also educational level.

Importance of the Incremental Contribution of CPS

In this study, we test the hypotheses that individuals' CPS skills, independent of their level of education and GMA, will predict the complexity, level, and salary of the jobs they are currently employed in. Empirical tests of these hypotheses are important for several reasons. First, both the National Research Council and the OECD have recommended substantial investment in developing CPS skills.
If these skills are so tightly bound to GMA that they do not make a difference once GMA is considered, this recommendation might not be a sound one. Second, these hypotheses have important implications for recruitment, selection, and placement. If CPS skills do have incremental value in identifying which occupations individuals are likely to be drawn to and to succeed in, then including measures of these skills in vocational counseling, personnel selection, and placement is likely to improve the quality of individual and organizational decisions.
The Present Study

We were able to obtain data in large multinational samples to link CPS with the type of job and occupation incumbents are currently employed in and with measures of success in these jobs and occupations. In this study, we tested two hypotheses. First, as described above, we predicted that CPS levels would
make an incremental contribution to predicting the likelihood that individuals would be employed in jobs that are more complex (i.e., jobs that require dealing with nonroutine and unpredictable demands). Because CPS skills have been found to be correlated with both GMA and educational level (Funke, 2010; Molnár et al., 2013; Schweizer et al., 2013), we controlled for both variables and hypothesized:

H1: CPS will predict measures of the complexity of examinees' jobs beyond what can be predicted on the basis of GMA and level of education.

Next, we predicted that CPS would make an incremental contribution to predicting the likelihood that individuals would achieve job success, measured with respect to both job level and salary. Job level is one indicator of job success (International Labour Office, 2012), and the higher the job level, the higher the salary; thus, salary is often used as a monetary indicator of success (Jaskolka, Beyer, & Trice, 1985). The skills required by jobs represent a key determinant of individuals' job level and salary (Guthrie, Dumay, Massingham, & Tam, 2015; Jensen, 1980; Milkovich, Newman, & Gerhart, 2013; Murphy, 1989). If CPS skills are indeed required to succeed on the job, they ought to make a unique contribution to the prediction of occupational success (e.g., job level and salary). Thus, we hypothesized:

H2: CPS will predict measures of examinees' job level and salary beyond what can be predicted on the basis of GMA and level of education.

As we note later, salary is related to job level; jobs at the top of the hierarchical classification used in this study typically pay better than jobs at lower levels. Therefore, we also report analyses that link salary with CPS skills when job level is controlled for.
Method

To test our hypotheses, we obtained measures of CPS skills using multiple tests modeled on the highly successful PISA assessments (OECD, 2012, 2014).2 We also obtained estimates of GMA using a well-validated, widely respected measure of fluid reasoning, Raven's Standard Progressive Matrices (SPM). Finally, we used multiple items from the Federal Institute for Vocational Education and Training Survey (BIBB; Rohrbach-Schmidt & Hall, 2013) to classify incumbents' jobs in terms of their complexity, and we developed and tested structural models to test our hypotheses.

Footnote 2: Several of the authors of the current paper were part of the team that developed the computerized CPS assessments used in PISA.
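The incremental-prediction logic behind H1 and H2 can be illustrated with a hierarchical regression on simulated data. This is a minimal sketch, not the study's structural equation models: the coefficients, sample, and variable names below are invented for illustration, and the increment in R-squared plays the role of the incremental variance CPS is hypothesized to explain beyond GMA and education.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: GMA, education, and CPS scores, where CPS overlaps
# with GMA but also carries a unique contribution to job complexity.
n = 671
gma = rng.normal(size=n)
edu = 0.5 * gma + rng.normal(size=n)
cps = 0.6 * gma + rng.normal(size=n)
complexity = 0.4 * gma + 0.2 * edu + 0.3 * cps + rng.normal(size=n)

def r_squared(X, y):
    """R^2 of an OLS regression of y on X (intercept included)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

# Step 1: baseline model with the control variables only.
r2_base = r_squared(np.column_stack([gma, edu]), complexity)
# Step 2: add CPS; the gain in R^2 is its incremental contribution.
r2_full = r_squared(np.column_stack([gma, edu, cps]), complexity)
delta_r2 = r2_full - r2_base

print(f"baseline R^2 = {r2_base:.3f}, full R^2 = {r2_full:.3f}, "
      f"delta R^2 = {delta_r2:.3f}")
```

Because the models are nested, the full model's R-squared can never fall below the baseline's; the question the paper asks is whether the increment is large and reliable enough to matter.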
Sample and Procedure

The data reported here were obtained as part of a larger project (LLLight in Europe, 2015a) that involved both standardized on-site and unproctored online assessments of employees, students, and entrepreneurs at companies, social enterprise community centers, and universities in 13 different countries in Africa, Europe, and South America. The complete LLLight in Europe data set (N = 1167) has been used in EU policy (LLLight in Europe, 2015a, b, c) projects and in a recent article examining the relation between CPS and salary (Ederer et al., 2015). The current study involved a subsample of the LLLight in Europe study, specifically data from 671 EU-based employees who were assessed on site during company visits in the seven participating European Union (EU) countries (Denmark, France, Germany, the Netherlands, Slovakia, Spain, and the UK) and Switzerland.3 This sample (mean age = 37.06 years, SD = 12.46; 32.9% women) included responses from incumbents across various job and educational levels in 21 organizations in IT, engineering, health care, research, agriculture, and customer service across Europe. Frequencies of country origin, educational level, and job level are displayed in Table 1. The current study differs from the broader LLLight in Europe (2015a) study with respect to our objectives and the variables we examined. In particular, previous studies using LLLight in Europe (2015a) data did not examine the incremental contribution of CPS over GMA and education in a straightforward way. Nevertheless, in the parts of the current study where we report results that were based on data that had been analyzed before (i.e., in Ederer et al., 2015; LLLight in Europe, 2015a, b, c), we explicitly note this overlap. The European subsample used in this study was economically, geographically, and socially more homogenous than the full global LLLight in Europe sample of 1167 employees, entrepreneurs, and students, distributed across three continents.
Further, in this study, trained test administrators collected all CPS data during company visits, whereas the larger LLLight in Europe project relied on a mixture of on-site and online assessments. In our subsample, 671 company-employed participants completed computerized test batteries of cognitive abilities and skills as well as self-reported measures of job characteristics and salaries. Participants received the test batteries in their native language. Testing took approximately 100–145 min, depending on the slightly different test batteries used in different organizations, and included one 10–15-min break.

3 We collected data from 676 respondents but dropped five employees who reported unusually high or low wages relative to the median (with a cut-off of more than 2.5 times the Median Absolute Deviation [MAD; Hampel, 1974] around the median, as recommended by Leys, Ley, Klein, Bernard, & Licata, 2013). As a result, our sample included N = 671 working individuals.
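The MAD-based screening rule in footnote 3 can be sketched in a few lines. This is an illustration, not the project's actual code; note that Leys et al. (2013) additionally recommend scaling the MAD by a consistency constant (1.4826) for normally distributed data, whereas this sketch uses the raw MAD, as the footnote's wording suggests. The wage values are invented.

```python
from statistics import median

def mad_outliers(values, cutoff=2.5):
    """Split `values` into (kept, flagged), flagging any value lying more
    than `cutoff` median absolute deviations from the median
    (Hampel, 1974; Leys et al., 2013)."""
    med = median(values)
    mad = median(abs(v - med) for v in values)
    kept, flagged = [], []
    for v in values:
        (flagged if abs(v - med) > cutoff * mad else kept).append(v)
    return kept, flagged

# Hypothetical annual wages: the one extreme value is flagged.
wages = [28_000, 31_000, 35_000, 36_000, 40_000, 400_000]
kept, flagged = mad_outliers(wages)
print(flagged)  # [400000]
```

Applied to the full sample, a rule of this form removed the five respondents with implausibly extreme reported wages.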
Table 1 Full sample characteristics

Country       N (%)        Educational level (a)      N (%)        Job level (b)    N (%)
Denmark       20 (3.0)     Primary                    8 (1.3)      Lower level (c)  158 (23.6)
Germany       424 (63.2)   Lower secondary            134 (20.0)   Clerical         85 (12.7)
France        10 (1.5)     Upper secondary            67 (9.8)     Technician       83 (12.4)
Netherlands   1 (0.1)      Postsecondary              93 (13.9)    Professional     241 (35.9)
Slovakia      40 (6.0)     Bachelors or equivalent    91 (13.6)    Manager          70 (10.4)
Spain         152 (22.7)   Masters or equivalent      254 (37.9)   Missing          34 (5.0)
Switzerland   12 (1.8)     Doctorate or equivalent    24 (3.6)
UK            12 (1.8)

N = 671
(a) For educational level, an adjusted version of the international standard classification for education (ISCED) was used
(b) For job level, an adjusted version of the international standard classification for occupations of 2008 (ISCO-08) was used
(c) Lower level jobs included service and sales workers, skilled agricultural workers, workers in crafts and related trades, plant and machine operators and assemblers, and elementary, low-skill occupations
The primary purpose of the LLLight in Europe (2015a) study was to collect data on computerized measures of CPS. When time and resources permitted, we collected additional data on a number of variables (e.g., GMA, job characteristics, salary). Some of the organizations that participated in this study were not able to grant the time needed to administer all of our measures. Our GMA measure (Raven’s SPM) and the job complexity scale took the longest to assess and were therefore usually the first measures the organizations left out when they needed to save testing time. In this study, we were forced to employ a research design in which important variables would not be available for all participants. Missing data were largely the result of practical constraints at the company level (see above); there were virtually no missing data that resulted from the failure of individual respondents to complete a measure that had been administered to them. Because GMA is central to this research, latent regression analyses and other analyses that included Raven’s SPM as a predictor and a control variable were based on a subsample of n = 394 individuals who completed this measure (MAge = 34.88 years, SD = 11.91; 29.9% women; Table 2).
Measuring CPS

We employed two established computer-based CPS performance tests (MicroDYN, Greiff et al., 2012; MicroFIN, Neubert, Kretzschmar, Wüstenberg, & Greiff, 2014), both modeled on assessments included in PISA and both administered on tablets. MicroDYN and MicroFIN have already been shown to reliably and validly measure CPS in predicting success in a variety of educational settings (e.g., Greiff, Fischer, et al., 2013a). Both measure CPS in a number of short problem tasks (approximately 5 min per task; 14 tasks in total in the current study). Both assessments were presented in respondents' native language; back-translation was used to ensure the accuracy of these translations.

Table 2 Sample characteristics of the subset of examinees (n = 394) completing Raven's SPM

Country       N (%)        Educational level (a)      N (%)        Job level (b)    N (%)
Denmark       0 (.0)       Primary                    3 (.8)       Lower level (c)  112 (28.4)
Germany       228 (57.9)   Lower secondary            90 (22.8)    Clerical         40 (10.2)
France        0 (.0)       Upper secondary            46 (11.7)    Technician       33 (8.4)
Netherlands   1 (.3)       Postsecondary              46 (11.7)    Professional     153 (38.8)
Slovakia      33 (8.4)     Bachelors or equivalent    41 (10.4)    Manager          43 (10.9)
Spain         121 (30.7)   Masters or equivalent      155 (39.3)   Missing          13 (3.3)
Switzerland   11 (2.8)     Doctorate or equivalent    13 (3.3)
UK            0 (.0)

n = 394
(a) For educational level, an adjusted version of the international standard classification for education (ISCED) was used
(b) For job level, an adjusted version of the international standard classification for occupations of 2008 (ISCO-08) was used
(c) Lower level jobs included service and sales workers, skilled agricultural workers, workers in crafts and related trades, plant and machine operators and assemblers, and elementary, low-skill occupations

MicroDYN

Participants were required to acquire and apply new knowledge in these CPS tasks (e.g., Funke, 2010). MicroDYN tasks are designed within the framework of linear structural equations (LSE; Funke, 2010). In LSE, examinees use slide controls to interact with inputs that influence outputs in dynamic, nonroutine, and interactive ways (Funke, 2010). For instance, several technical components (e.g., X, Y, and Z) might gradually influence the noise and maintenance costs in the exemplary MicroDYN task Wind Power Station (see Fig. 1). As the relations between inputs and outputs in the Wind Power Station are not apparent when the task begins, examinees need to actively engage with the inputs to control the task (e.g., if the examinee increases slide control X, the Wind Power Station's noise can be lowered, but this relation becomes apparent only when the participant interacts with the program). The abstract labeling of the inputs, such as X, Y, and Z, eliminates effects of expert knowledge (i.e., even wind power engineers should not have an advantage in solving this problem with arbitrarily labeled inputs of X, Y, and Z). In the knowledge acquisition phase, examinees are first requested to freely explore the task, learn relations, and draw connections between inputs and outputs (e.g., between X and noise) for 3 min. Then, in the knowledge application phase, they are asked to reach defined performance thresholds (e.g., lowering noise and costs) within a time frame of 1.5 min. The task contents range, for example, from handball coaching, to the logistics of transporting goods by road, to the illustrated Wind Power Station (for more details on MicroDYN, see Greiff et al., 2012). After excluding the first task, which served as an introduction and practice, the six remaining tasks were scored with respect to the two underlying phases of knowledge acquisition and knowledge application. In phase 1, knowledge acquisition, we gave full credit (1 point) for correctly drawing all connections between variables (e.g., between input X and the noise output in the Wind Power Station) and no credit (0 points) for incorrect connections. In phase 2, knowledge application, we gave full credit (1 point) for correctly controlling target values on all outputs regarding a requested goal state (e.g., downregulating noise and costs in the Wind Power Station to a target value) and no credit (0 points) for not reaching the requested goal state.

Fig. 1 The MicroDYN Wind Power Station task (Greiff et al., 2012). Left side: knowledge acquisition. X, Y, and Z influence noise and costs. Examinees are asked to draw their acquired knowledge about the relations in an onscreen causal diagram (Funke, 2001; see the bottom part, left side). Right side: knowledge application (cf. Wüstenberg et al., 2012). Target values for each output variable (red areas and numbers in brackets) have to be met within a maximum of four steps to gain control over the system

MicroFIN

These tasks also consist of knowledge acquisition and knowledge application phases. MicroFIN tasks are designed as so-called finite state automata (FSA; Funke, 2001). FSA are tasks with levers, switches, and buttons that change the system from one state to another in dynamic, nonroutine, and interactive ways (Funke, 2010). In FSA, examinees press buttons to transfer an undesired finite state of a system (i.e., an automaton, such as a smartphone) into a desired goal state. Formally, levers, switches, and buttons are a finite set of inputs that change the state of the automaton in qualitative ways. For instance, pressing a button in MicroFIN's Plan-o-mat activates changes in the urban landscape that directly influence states of well-being for four urban interest groups (families, playgrounds, malls, and industry; see Fig. 2). The knowledge acquisition and application phases are conceptually similar to those used in MicroDYN. In addition to the urban planning task Plan-o-mat, the MicroFIN tasks included in the current study required examinees to manage classical versus rock and roll music concerts whose success depends on the type of audience, ticket price, and venue (Concert-o-mat) or to harvest a new pumpkin species whose growth depends on the season and various fertilizers (Green-o-mat; for more details on the MicroFIN tasks, see Neubert et al., 2014). MicroFIN was scored similarly to MicroDYN.

Fig. 2 The MicroFIN item "Plan-o-mat" (Neubert et al., 2014). Problem-solvers have to balance the interests of various parties in a city by making alterations in the urban landscape. The keys for altering the location of the interest groups are located along the bottom and the right side. In principle, two stakeholders change places when triggered. A city mall and a factory are situated on the right side, and a family home and a playground are situated on the left side. Between these parties, smiley faces are presented to indicate the atmosphere. The problem-solver has to acquire knowledge about how to change the atmosphere (knowledge acquisition) and has to find one of several optimal setups (knowledge application)
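The all-or-nothing credit rule used in both phases can be sketched as follows. This is an illustrative reconstruction of the scoring logic described above, not the assessments' actual code; the task, edge, and variable names are invented.

```python
def score_acquisition(drawn_edges, true_edges):
    """Knowledge acquisition: full credit (1) only if the drawn causal
    diagram matches the true model exactly; otherwise no credit (0)."""
    return int(set(drawn_edges) == set(true_edges))

def score_application(outputs, targets, tolerance=0.0):
    """Knowledge application: full credit (1) only if every output
    variable reached its target value; otherwise no credit (0)."""
    return int(all(abs(outputs[k] - v) <= tolerance for k, v in targets.items()))

# Hypothetical Wind Power Station responses:
true_model = {("X", "noise"), ("Y", "costs")}
print(score_acquisition({("X", "noise"), ("Y", "costs")}, true_model))  # 1
print(score_acquisition({("X", "noise")}, true_model))                  # 0
print(score_application({"noise": 40, "costs": 120},
                        {"noise": 40, "costs": 120}))                   # 1
```

Summing these dichotomous phase scores across the MicroDYN and MicroFIN tasks yields the per-person totals from which the CPS mean was computed.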
Additional Measures

In addition to CPS, we measured GMA, educational level, job level, salary, and job complexity. Descriptive statistics for these variables are presented in Table 3.

Table 3 Descriptive statistics for CPS, Raven's SPM, salary, and job complexity

Variable              Minimum   Maximum      M          SD
CPS                   281.41    689.37       489.57     98.70
Raven's               340       800          544.13     111.04
Salary (US dollars)   7104      174,683.52   35,434.92  25,474.08
Job complexity        1         5            3.62       0.88

Note. We calculated a CPS mean based on the average of all MicroDYN and MicroFIN scores for each participant. The mean and SD of CPS in the current study sample (N = 671) are presented in standardized scores based on the total sample (N = 1126, with M = 500 and SD = 100; LLLight in Europe, 2015a, b, c). The mean and SD of Raven's in the current study sample (n = 394) are presented in standardized scores that are also based on that total sample. The mean and SD of salary are presented as total net income per year in normalized US dollars with the purchasing power parity conversion factor (World Bank, 2015). Job complexity is represented as the mean of six items whose ratings ranged from 1 (never) to 5 (every day). CPS = complex problem-solving; Raven's = Raven's Standard Progressive Matrices.

GMA Estimate

In large-scale factor-analytic studies (e.g., Carroll, 1993), GMA is often estimated as a higher order factor that links measures of conceptually distinct abilities. For example, Ree et al. (1994) used the first principal component emerging from correlations among multiple cognitive tests (e.g., subtests in the Armed Services Vocational Aptitude Battery) to estimate GMA. In the current project, it was not possible to administer a large battery of tests to participants; therefore, we used a single test that is widely accepted as a reliable and valid marker of GMA. We used a computerized version of Raven's SPM (Raven, Raven, & Court, 1998), administered on tablets, as an indicator of GMA. Raven's SPM is widely regarded as a very good measure of GMA, and scores on this test have consistently been shown to exhibit very high loadings on GMA factors (Carroll, 1993; Jensen, 1998; Raven et al., 1998; Ree et al., 2015). Gignac (2015) reviewed factor-analytic evidence suggesting that Raven's SPM is probably a better measure of fluid reasoning than of GMA, but even his review documented the consistently strong g-loadings of the SPM.

The research community is somewhat split regarding the utility of distinguishing GMA from fluid reasoning. McGrew (2005) considered GMA at the top of the CHC model to be virtually synonymous with fluid reasoning on the level below. Gustafsson (1988) and Kvist and Gustafsson (2008) concluded that GMA and fluid reasoning are in fact identical. Given the widespread acceptance of Raven's SPM as a reliable and valid indicator of GMA (Arthur & Day, 1994; Carpenter, Just, & Shell, 1990; Frey & Detterman, 2004; Jensen, 1980) and its substantial g-loadings, we believe there is good justification for using this test here as a measure of GMA.
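The standardization to a metric with M = 500 and SD = 100 described in the note to Table 3 is a linear rescaling relative to the total-sample distribution. A minimal sketch (not the project's actual code; the raw scores are invented):

```python
from statistics import mean, pstdev

def to_scale(scores, ref_scores, target_mean=500.0, target_sd=100.0):
    """Rescale raw scores so the reference sample has the target mean
    and SD, mirroring the PISA-style metric used for CPS and Raven's."""
    m, s = mean(ref_scores), pstdev(ref_scores)
    return [target_mean + target_sd * (x - m) / s for x in scores]

# Toy raw scores, standardized against themselves:
raw = [0.2, 0.5, 0.8]
scaled = to_scale(raw, ref_scores=raw)
```

A score at the reference-sample mean maps to exactly 500, and one reference-sample SD above it maps to 600, regardless of the raw metric.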
Salary

Salary was assessed by directly asking respondents to indicate their total annual net income. To ensure comparability across countries and currencies, income was transformed into normalized US dollars per year using the purchasing power parity conversion factor (World Bank, 2015). In this sample, the mean and standard deviation of the salary distribution were $35,434 and $25,474, respectively. A relatively small number of very high salaries (the maximum observed was $174,683) created a relatively large standard deviation and a slight skew, but apart from a few outliers, the salary distribution was reasonably normal.
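The PPP normalization amounts to dividing local-currency income by the World Bank conversion factor (local currency units per international dollar). A sketch with an invented factor, not the study's actual pipeline:

```python
def to_ppp_usd(income_local, ppp_factor):
    """Convert a net annual income in local currency units into
    purchasing-power-parity US dollars. `ppp_factor` is the World Bank
    conversion factor: local currency units per international dollar."""
    return income_local / ppp_factor

# Hypothetical: if 0.78 local units buy what $1 buys in the US,
# a 30,000-unit salary corresponds to about $38,462 PPP.
print(round(to_ppp_usd(30_000, 0.78)))  # 38462
```

This makes salaries comparable across the eight countries despite differing currencies and price levels.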
Level of Education

Respondents indicated their level of education using the international standard classification of education (ISCED; UNESCO Institute for Statistics, 2012) shown in Tables 1 and 2.
Job Complexity

Examinees rated the frequency of different tasks related to the level of job complexity on a 5-point Likert scale: 1 (never), 2 (less than once a month), 3 (less than once a week but at least once a month), 4 (at least once a week but not every day), and 5 (every day). All six items were derived from the Federal Institute for Vocational Education and Training Survey (BIBB; Rohrbach-Schmidt & Hall, 2013) and asked respondents whether they (a) make difficult decisions independently and without instructions; (b) collect, investigate, or document information at work; (c) have to recognize and close their own knowledge gaps; (d) have to perform many different tasks; (e) have to keep an eye on different work processes or sequences at the same time; and (f) face new tasks that they must think through and become familiar with. Ratings on these six items were combined to reflect the complexity of each incumbent's job. The distribution of job complexity was reasonably normal, with M = 3.62 and SD = 0.88. It is possible that some employees' work is usually routine but from time to time involves complex tasks, and questions that focus on the frequency of complex tasks, as in the present study, could misclassify jobs of this sort. However, studies examining the relations between the various types of scales used in job-analytic questionnaires (e.g., frequency, importance, criticality of errors) have suggested that different rating scales generally lead to converging conclusions about the major facets of the jobs being analyzed (Cadle, 2012; Conte, Dean, Ringenbach, Moran, & Landy, 2005), which in turn suggests that similar classifications of jobs according to their complexity could reasonably be expected if different rating scales had been employed.
Job Level

Respondents stated their general job level in the latest version of a widely endorsed international standard classification of occupations, ISCO-08, which sorts jobs with respect to their content, salary levels, and opportunities for advancement (International Labour Office, 2012). First, we inverted the ISCO-08 ranking for the sake of clarity of results (i.e., high job levels have high scores in the current study). Next, we merged adjacent job levels on the low end of the scale with small sample sizes (e.g., elementary, low-skill occupations, n = 12) into larger categories. As a result, jobs were coded as 1 = skilled agricultural workers, workers in crafts and related trades, plant and machine operators and assemblers, and elementary, low-skill occupations (n = 160); 2 = service and sales workers and clerical support workers (n = 85); 3 = technicians and associate professionals (e.g., business and administration associate professionals, information and communications technicians; n = 83); 4 = professionals (e.g., business and administration professionals, information and communications technology professionals, science and engineering professionals; n = 241); and 5 = managers (e.g., chief executives and senior officials, production and specialized services managers; n = 70; missing, n = 34; Table 1). In the ISCO-08 taxonomy used here, "professionals" refers to a set of jobs that is typically situated in organizations at a level below managers and executives on organizational charts. ISCO-08 is therefore to be distinguished from American classification schemes that may consider "professionals" a category referring to doctors, lawyers, and so forth. In American classification schemes, these jobs may be a good deal more complex and better paid than some managerial jobs, but this was not the case in the present study, as salary comparisons revealed.4

4 Mean monthly salaries per job level were $1,495.84 (SD = 292.27) on Level 1, $2,044.72 (SD = 1,740.99) on Level 2, $2,929.71 (SD = 1,565.23) on Level 3, $2,907.23 (SD = 1,377.89) on Level 4, and $5,916.80 (SD = 3,371.15) on Level 5 (i.e., managers). This finding largely supports the claim that ISCO-08 sorts jobs by salary levels, as suggested by the OECD (2013a).
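The inversion and merging just described can be expressed as a simple lookup from ISCO-08 major groups (1 = managers through 9 = elementary occupations) to the study's five job levels. This is an illustrative sketch of the recoding logic, not the authors' code:

```python
# Map ISCO-08 major groups onto the five inverted, partly merged levels.
ISCO_TO_LEVEL = {
    6: 1, 7: 1, 8: 1, 9: 1,  # agricultural, craft, operators, elementary
    4: 2, 5: 2,              # clerical support, service and sales
    3: 3,                    # technicians and associate professionals
    2: 4,                    # professionals
    1: 5,                    # managers
}

def job_level(isco_major_group):
    """Return the study's 1-5 job level for an ISCO-08 major group."""
    return ISCO_TO_LEVEL[isco_major_group]

print(job_level(1))  # 5 (managers sit at the top of the inverted scale)
print(job_level(9))  # 1
```

The inversion ensures that higher scores mean higher-level jobs, so positive regression coefficients can be read in the intuitive direction.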
Statistical Analyses

We began our statistical analyses by computing descriptive statistics to determine whether our hypotheses were plausible. First, as in previous studies that have examined the relation between CPS and GMA, we found a substantial correlation between these two variables; the average of all of the MicroFIN and MicroDYN scores correlated .68 with scores on Raven's SPM (our GMA estimate), a correlation in line with previously published studies that have concluded that CPS and GMA represent related but distinct constructs. Next, we examined the correlations between CPS and job complexity, job level, and salary and found correlations of .15, .27, and .36, respectively, all significant at the .01 level. Given these results, we proceeded to more rigorous analyses that provided direct tests of our hypotheses.

To derive measurement models and evaluate our hypotheses, we used structural equation modeling (SEM; Bollen, 1989). First, we derived latent measurement models for all latent constructs (i.e., CPS, GMA, and job complexity) on the basis of confirmatory factor analyses. Then we calculated latent regression models in SEM for H1 and H2. In particular, we calculated a latent CPS residual that represented the portion of the CPS factor that was independent of the other two predictors: GMA and level of education. Because the latent CPS, GMA, and education measures were on average more highly intercorrelated than their observed counterparts, using latent CPS residual scores as predictors that were independent of GMA and education provided a conservative and rigorous version of the same hypothesis (i.e., that CPS makes a unique and independent contribution) that could be tested in a hierarchical multiple regression using observed variables. The SEM-based methods used here allowed us to evaluate both the incremental contribution (as in a hierarchical regression) and the fit of a model that specified an incremental role of CPS. We followed up this SEM-based test with more traditional hierarchical regression analyses to allow readers to more easily compare our results with previous regression-based studies and to provide easily interpretable effect size estimates.
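The residualization logic behind the latent CPS residual can be illustrated with observed variables: regress the focal predictor on a control and keep the residual, which is by construction uncorrelated with that control. A minimal single-covariate sketch with invented toy data (the actual analysis residualized a latent CPS factor on two controls within SEM):

```python
from statistics import mean

def residualize(y, x):
    """Return the part of y that is linearly independent of x
    (an observed-variable analogue of the latent CPS residual)."""
    mx, my = mean(x), mean(y)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return [yi - (a + b * xi) for xi, yi in zip(x, y)]

# Toy data: CPS correlates with GMA, but its residual does not.
gma = [90, 95, 100, 105, 110]
cps = [480, 470, 510, 540, 530]
res = residualize(cps, gma)
cov = sum((r - mean(res)) * (g - mean(gma)) for r, g in zip(res, gma))
print(abs(round(cov, 6)))  # 0.0
```

Any variance the residual then explains in job complexity, job level, or salary is necessarily incremental to the controls.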
Evaluating Model Fit and Model Parameters

Examining the goodness of fit of most of the models with the robust weighted least squares estimator (WLSMV) and one model with the robust maximum likelihood estimator (MLR), we evaluated the comparative fit index (CFI), the Tucker-Lewis index (TLI), the root mean square error of approximation (RMSEA), and the standardized root mean square residual (SRMR; only for MLR) with respect to their recommended cutoff values (Hu & Bentler, 1999). WLSMV was applied because this estimator is appropriate for measurement and structural models with binary manifest data; for measurement models with five or more scale points (e.g., job complexity), we used MLR (Rhemtulla, Brosseau-Liard, & Savalei, 2012). To calculate scale reliabilities, we used McDonald's ωH (Zinbarg, Revelle, Yovel, & Li, 2005). The data had a nested structure, with data collected within the 21 organizations. Intraclass correlations (ICCs) of our measures ranged from .485 for salary to .547 for educational level. To account for this nested data structure, we used the Mplus 7.1 option TYPE=COMPLEX, which corrects standard errors for the biases that such clustering would otherwise introduce (Muthén & Muthén, 1998–2014).
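The reported ICCs quantify how much of each measure's variance lies between, rather than within, the 21 organizations. One common estimator, ICC(1) from a one-way ANOVA decomposition, can be sketched as follows (an illustration with invented toy clusters, not the Mplus computation used in the study):

```python
from statistics import mean

def icc1(groups):
    """ICC(1): share of variance attributable to group (here,
    organization) membership, from a one-way ANOVA decomposition."""
    k = mean(len(g) for g in groups)              # (average) group size
    grand = mean(x for g in groups for x in g)
    msb = sum(len(g) * (mean(g) - grand) ** 2 for g in groups) / (len(groups) - 1)
    msw = sum((x - mean(g)) ** 2 for g in groups for x in g) / \
          sum(len(g) - 1 for g in groups)
    return (msb - msw) / (msb + (k - 1) * msw)

# Toy clusters with a strong between-organization effect:
orgs = [[1, 2, 1], [5, 6, 5], [9, 10, 9]]
print(round(icc1(orgs), 2))  # 0.98
```

With ICCs near .5, as observed here, ignoring the organizational clustering would have badly understated the standard errors, motivating the cluster-robust correction.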
Missing Data

Because this study relied on tailored assessment suites (we tailored different subsets in accordance with each organization's interests, data protection regulations, and time restrictions; e.g., Raven's SPM was administered to 56.4% of our participants), a substantial amount of missing data was built into our design as a result of time limitations imposed on testing by particular participating companies. This design is consistent with Graham's (2009) recommendation that "building missing data into the overall measurement design is the best use of limited resources" (p. 551). There was also a small amount of missing data due to software errors while running tests or saving data. The proportion of missing data in the various measures ranged from 2.3% for the ISCED to 43.6% for Raven's SPM. CPS data were missing for 6.2% of all participants. As noted earlier, Raven's SPM data were missing for a large number of examinees because this test took the longest, and some companies therefore removed it from their assessment suite. For the same reason (i.e., differences from company to company in whether specific tests were administered or whether specific variables were measured), salary data were missing for 31% of all participants, job level data for 4.7%, job complexity data for 43.1%, and educational level data for 2.3%. Organizational membership information was missing for 2.4% of all participants. As noted earlier, we restricted analyses that included our measure of GMA (Raven's SPM) to the subset of examinees (n = 394) who completed this measure.
Results

Measurement Models and Reliabilities

We tested the measurement models of our latent variables (CPS, GMA, and job complexity) and calculated the corresponding McDonald's ωH. In building and testing our measurement models, we used all available data. For example, our measurement model for CPS was based on the total sample (N = 671), minus the number of cases with missing data on one or more CPS measures, whereas our measurement model for GMA was based on the subset of examinees who completed Raven's SPM. We used single manifest measures of level of education, salary, and job level.

CPS

We aimed to build CPS as a hierarchical factor in which CPS was represented by two assessments: MicroDYN and MicroFIN. First, we examined theoretically plausible models of MicroDYN and MicroFIN separately, testing solutions for each of the two, which were represented by their knowledge acquisition and knowledge application items (see Funke, 2010), against more parsimonious solutions, which did not distinguish between the knowledge acquisition and knowledge application items. For MicroDYN, the theoretically expected two-factor solution fit the data significantly better than a one-factor solution (Δχ² = 316.021, p < .001; see the absolute fit indices in Table 4), confirming previous research (e.g., Greiff, Fischer, et al., 2013a). For MicroFIN, a two-factor solution was not significantly better than a more parsimonious one-factor solution (Δχ² = 3.178, p = .075; see the absolute fit indices in models 1 to 4 in Table 4). Second, we combined the measurement models for MicroDYN and MicroFIN into one overarching, second-order CPS factor. This second-order CPS factor was reliably measured (ωH = .89), fit the data well (see model 5 in Table 4), and is depicted in Fig. 3.

GMA

A single GMA factor fit the 31 items of the SPM well (see Table 4). We used parceling to reduce the number of parameters in this model by assigning the items to one of three balanced parcels using the item-to-construct balance recommended by Little, Cunningham, Shahar, and Widaman (2002). All parcels showed significant and substantial factor loadings on the GMA factor (λ > .85; all ps < .001) in a just-identified model. An unparceled solution also indicated good model fit; the fit of this model 6 is shown in Table 4. Raven's SPM in its raw score form is highly reliable (Gignac, 2015; Williams & McCord, 2006), and the factor score emerging from this analysis showed especially high reliability (ωH = .98). This high reliability estimate is in part a reflection of the factorial purity of the SPM; ωH estimates the proportion of variance in observed scores attributable to the latent variable measured by the scale (Zinbarg et al., 2005).
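The item-to-construct balance of Little et al. (2002) assigns the highest-loading items to different parcels and then deals the remaining items out in reverse order, so the parcels end up with roughly equal average loadings. A sketch with six invented item loadings (the study parceled all 31 SPM items):

```python
def balanced_parcels(loadings, n_parcels=3):
    """Assign items to parcels via item-to-construct balance
    (Little et al., 2002): sort by loading, then deal items out in a
    back-and-forth (serpentine) order to keep parcels balanced."""
    ranked = sorted(loadings, key=loadings.get, reverse=True)
    parcels = [[] for _ in range(n_parcels)]
    for i, item in enumerate(ranked):
        cycle, pos = divmod(i, n_parcels)
        idx = pos if cycle % 2 == 0 else n_parcels - 1 - pos
        parcels[idx].append(item)
    return parcels

# Hypothetical loadings for six items:
loadings = {"i1": .81, "i2": .78, "i3": .74, "i4": .70, "i5": .66, "i6": .60}
print(balanced_parcels(loadings))  # [['i1', 'i6'], ['i2', 'i5'], ['i3', 'i4']]
```

Pairing the strongest item with the weakest (and so on) is what keeps the three parcel loadings nearly equal, which in turn yields the balanced λ > .85 loadings reported above.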
Job Complexity

An exploratory principal component analysis (PCA) in a previous study (Nedelkoska, Patt, & Ederer, 2015) using the full LLLight in Europe data set suggested that the job complexity items could be grouped into a single scale. We applied a confirmatory factor analysis to the job complexity scale and found that a single job complexity factor fit the data well (see model 7 in Table 4). Reliability was acceptable (ωH = .83).

Correlations

Manifest as well as latent correlations showed that all variables (CPS, GMA, level of education, job complexity, job level, and salary) were positively and significantly intercorrelated, in line with our hypotheses (Table 5). As noted earlier, CPS was associated with job complexity (r = .15, p < .001), job level (r = .27, p < .01), and salary (r = .36, p < .01). It was also correlated with level of education (r = .40, p < .01) and GMA (r = .68, p < .01). The observed correlation between GMA and CPS (r = .68) was high but consistent with meta-analytic estimates. For example, Stadler, Becker, Gödker, et al. (2015) reported a meta-analytic mean correlation of .58 between CPS tests of the sort used here and measures of GMA. The correlation was also consistent with the results of several previous studies showing that CPS can be meaningfully distinguished from GMA (Sonnleitner et al., 2013; Stadler, Becker, Greiff, & Spinath, 2015; Wüstenberg et al., 2012). The latent variable correlation between GMA and CPS was also very high (r = .81, r² = .66), in part because it was corrected for various sources of variance, such as measurement and item-based error. Nevertheless, more than 30% of the variance in the latent CPS scores was unrelated to latent GMA, leaving substantial room for a potential incremental contribution.
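Why the latent correlation (.81) exceeds the manifest one (.68) can be illustrated with the classical correction for attenuation, which removes the downward bias that measurement error imposes on observed correlations. This is an illustrative approximation, not the SEM estimate itself (which additionally removes item-specific variance):

```python
import math

def disattenuate(r_observed, rel_x, rel_y):
    """Spearman's correction for attenuation: an estimate of the
    correlation between the error-free (latent) constructs."""
    return r_observed / math.sqrt(rel_x * rel_y)

# With the reliabilities reported here (omega-h = .89 for CPS, .97 for
# Raven's), the manifest r = .68 already implies a latent correlation
# near .73; the SEM estimate of .81 corrects for further error sources.
print(round(disattenuate(0.68, 0.89, 0.97), 2))  # 0.73
```

The gap between .73 and .81 reflects the additional, item-level sources of error that the latent-variable model partials out.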
Latent Regressions

We hypothesized a unique contribution of CPS in predicting job complexity (H1) and job level and salary (H2) beyond GMA and level of education, two highly established and efficient predictors. To investigate and quantify this unique contribution, we regressed CPS on GMA and educational level and used the resulting CPS residual, which represented the variance in CPS that was not shared with GMA and level of education, as a predictor variable. We employed this latent residual CPS score to predict the three criteria in H1 and H2, setting the direct effect of CPS on the criteria to zero, following the approach chosen by Wüstenberg et al. (2012). This latent regression model fit the data well (see model 9 in Table 4) and is illustrated in Fig. 4. The resulting SEM allowed us to quantify the incremental statistical effect of CPS on the criteria as specified in H1 and H2.

Preliminary Analyses of Control Variables

First, GMA strongly predicted CPS (β = 0.75, SE = 0.04, p < .001; one-tailed p values are reported for model 9), whereas level of education did not predict CPS when GMA was considered (β = 0.12, SE = 0.10, p = .115; see Fig. 4). This link between CPS and educational level in our regression model was weak in comparison with the moderate zero-order correlation between CPS and educational level (r = .46, p = .005; depicted in Table 5), suggesting that a large part of this overlap was due to GMA. Second, and in line with previous research, level of education predicted job level (β = 0.83, SE = 0.09, p < .001), job complexity (β = 0.29, SE = 0.10, p = .002), and salary (β = 0.16, SE = 0.09, p = .048). When considered simultaneously with level of education, GMA did not predict job level (β = −0.06, SE = 0.05, p = .135) or job complexity (β = 0.11, SE = 0.08, p = .082) and only weakly predicted salary (β = 0.14, SE = 0.05, p = .002; see Fig. 4). These links in the regression model between GMA and the criteria were somewhat weak compared with the small to moderate correlations in the preliminary analysis between GMA and salary (r = .21, p < .001), job complexity (r = .21, p = .006), and job level (r = .32, p < .001; depicted in Table 5). This decrease is likely due to the variance shared between GMA and educational level (r = .46, p < .001) in this model.

Table 4 Goodness of fit indices for all models, including one- and two-factor solutions for CPS, solutions for Raven's SPM and job complexity, and the latent correlation and regression models used to test our hypotheses

Model                                                    N     χ²        df    p        CFI    TLI    RMSEA   SRMR
1. MicroDYN 1 dimension                                  364   209.588   55    < .001   .982   .978   .088    –
2. MicroDYN 2 dimensions                                 364   163.420   54    < .001   .987   .984   .075    –
3. MicroFIN 1 dimension                                  623   200.913   135   < .001   .990   .989   .028    –
4. MicroFIN 2 dimensions                                 623   200.032   134   < .001   .990   .989   .028    –
5. CPS                                                   669   521.442   404   < .001   .977   .976   .021    –
6. Raven's                                               394   426.393   377   .04      .957   .953   .018    –
7. Job complexity                                        391   11.22     8     .189     .998   .997   .032    .023
8. Latent correlations                                   394   879.882   806   .036     .969   .966   .015    –
9. Latent regression of job complexity, job level,
   and salary on Raven's, education, and CPS             394   928.406   847   .026     .965   .962   .016    –
10. Latent partial correlation between CPS (controlled
    for Raven's and education) and salary (controlled
    for job level)                                       394   944.085   847   .011     .962   .959   .013    –
11. Imputation of the latent regression in model 9
    with all available data                              671   909.029   805   .005     .960   .957   .014    –

Note. χ² and df estimates are based on MLR (model 7: job complexity) or WLSMV (all other models). MicroDYN 1 = one-factor solution of the MicroDYN test; MicroDYN 2 = two-factor solution of the MicroDYN test; MicroFIN 1 = one-factor solution of the MicroFIN test; MicroFIN 2 = two-factor solution of the MicroFIN test; CPS = complex problem-solving modeled as a second-order hierarchical factor; Raven's = Raven's Standard Progressive Matrices, unparceled solution; df = degrees of freedom; CFI = comparative fit index; TLI = Tucker-Lewis index; RMSEA = root mean square error of approximation; CI = confidence interval; SRMR = standardized root mean square residual (only for MLR).
H1: CPS Predicts Job Complexity Beyond GMA and Level of Education

The latent residual of CPS incrementally predicted job complexity beyond level of education and GMA, with a significant path coefficient of β = 0.26 (SE = 0.12, p = .014), supporting H1. Because this CPS residual was statistically independent of GMA and educational level, the square of its path coefficient indicates the amount of variance in the criterion uniquely explained by CPS, revealing a significant effect size of R² = .07. That is, the part of CPS that is independent of GMA and education explained 7% of the variance in job complexity.

Fig. 3 Hierarchical factor of CPS on the basis of MicroDYN and MicroFIN. N = 669. The path coefficient between the hierarchical CPS factor and the CPS factor of MicroFIN is set to 1 and its variance to 0 to allow for stable model estimation (for details, see the "Method" section). CPS, complex problem-solving modeled as a second-order hierarchical CPS factor; CPS by MicroFIN, one-factor solution on the basis of MicroFIN; CPS by MicroDYN, two-factor solution on the basis of MicroDYN. Two-tailed p values: *p < .05, **p < .01, ***p < .001
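The residual-predictor logic described above can be illustrated on a manifest level. The following is a minimal sketch with synthetic data (not the study's data, and ordinary least squares rather than latent-variable SEM): residualizing a predictor on its covariates yields a component orthogonal to them, and the squared standardized coefficient of that orthogonal component equals its unique R² increment.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000

# Synthetic standardized scores (illustrative only, not the study's data).
gma = rng.standard_normal(n)
edu = 0.46 * gma + np.sqrt(1 - 0.46**2) * rng.standard_normal(n)
cps = 0.6 * gma + 0.2 * edu + 0.5 * rng.standard_normal(n)
complexity = 0.2 * gma + 0.1 * edu + 0.25 * cps + rng.standard_normal(n)

def residualize(y, covars):
    """Residual of y after an OLS regression on the covariates (plus intercept)."""
    X = np.column_stack([np.ones(len(y))] + covars)
    return y - X @ np.linalg.lstsq(X, y, rcond=None)[0]

def r_squared(y, predictors):
    """R^2 of an OLS regression of y on the predictors (plus intercept)."""
    X = np.column_stack([np.ones(len(y))] + predictors)
    resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return 1 - resid.var() / y.var()

# The CPS residual is orthogonal to GMA and education by construction.
cps_res = residualize(cps, [gma, edu])
assert abs(np.corrcoef(cps_res, gma)[0, 1]) < 1e-8
assert abs(np.corrcoef(cps_res, edu)[0, 1]) < 1e-8

# Because cps_res is orthogonal to the other predictors, the R^2 increment
# from adding it equals its squared zero-order correlation with the
# criterion (i.e., the squared standardized path of the residual predictor).
delta_r2 = (r_squared(complexity, [gma, edu, cps_res])
            - r_squared(complexity, [gma, edu]))
beta_res = np.corrcoef(cps_res, complexity)[0, 1]
assert abs(delta_r2 - beta_res**2) < 1e-8
```

The same identity is what licenses reading the squared path of the latent CPS residual as uniquely explained variance.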
H2: CPS Predicts Job Level and Salary Beyond GMA and Level of Education

Job level and salary indicate job success, and if CPS matters for job success, it should be related to job level and salary beyond cognitive ability and education. The latent residual CPS score predicted salary (β = 0.16, SE = 0.05, p < .001) above and beyond level of education and GMA, accounting for an incremental 3% of the variance in salary, supporting H2. Contrary to H2, however, CPS failed to account for incremental variance in job level (β = −0.15, SE = 0.09, p = .051; see Fig. 4). Taken together, predictions regarding the incremental contribution of CPS beyond GMA and level of education in predicting occupational success were supported for salary but not for job level.

Fig. 4 Model 9 reveals the incremental contribution of CPS (complex problem-solving) over GMA as indicated by Raven's (Raven's Standard Progressive Matrices) and level of education in predicting job level, salary, and job complexity. n = 394, who completed Raven's SPM. Level of education, Raven's, and the residual of CPS (CPSres), which is unrelated to Raven's and level of education, are statistical predictors of job level, salary, and job complexity. The numbers in parentheses are the standard errors. One-tailed p values: *p < .05, **p < .01, ***p < .001

Table 5 SEM-based latent correlations and manifest correlations between observed measures, as well as reliability (ωH) on the diagonal

Variable | (1) | (2) | (3) | (4) | (5) | (6)
(1) CPS | .89 | .68** (366) | .40** (645) | .27** (619) | .36** (451) | .15** (539)
(2) Raven's | .81*** | .97 | .44** (387) | .35** (377) | .21** (276) | .16** (307)
(3) Education | .46** | .46*** | – | .79** (644) | .30** (474) | .18** (562)
(4) Job level | .25* | .32** | .80*** | – | .46** (457) | .23** (538)
(5) Salary | .28** | .21*** | .22** | .39*** | – | .15** (473)
(6) Job complexity | .32*** | .21** | .27** | .30* | .20*** | .83

If applicable, McDonald's ωH is reported on the diagonal as a measure of reliability; ωH of CPS refers to the hierarchical factor (see Fig. 3). n = 394 for latent constructs and n between 276 and 645 for observed measures. Correlations of observed measures are reported above the diagonal, with sample sizes in parentheses. Model-based latent correlations are reported below the diagonal. CPS = complex problem-solving; Education = International Standard Classification of Education (ISCED); Job level = International Standard Classification of Occupations 2008 (ISCO-08; inverse scoring); Salary = total yearly net income normalized with the purchasing power parity conversion factor (World Bank, 2015); Job complexity = latent variable of six complexity items (see the "Method" section). *p < .05, **p < .01, ***p < .001 (two-tailed)

The Incremental Contribution of CPS Is Not an Artifact of Job Level

Given that higher-level jobs usually pay better (e.g., Guthrie et al., 2015; Jensen, 1980; Milkovich et al., 2013; in the current study, the observed correlation between job level and salary was r = .46), it is possible that the relation between residual CPS and salary is an artifact of job level. To determine whether the incremental contribution of CPS in predicting salary was an artifact of the different salaries and job duties associated with higher- versus lower-level jobs, we calculated the latent partial correlation between the CPS residual and the part of salary that is independent of job level. The respective latent model fit the data well (see Model 10 in Table 4) and revealed a significant latent partial correlation between residual CPS and salary independent of job level (r = .38, p < .001), suggesting that even after controlling for both GMA and level of education on the predictor side and for
occupational level on the criterion side, CPS still contributed to the prediction of salary.

Robustness Checks and Multiple Imputations

As noted earlier, the latent regression analyses, which included GMA as a control, were based on the subset (n = 394) of examinees who completed Raven's SPM. To evaluate the potential effects of missing data in the latent regression analyses, we first compared the scores of examinees with (n = 277) and without (n = 394) missing data on Raven's SPM on all of the key variables in this study. We found only one small but significant relation between subsample membership and job complexity, and even in this case, subsample membership accounted for no more than 1% of the variance in complexity scores. None of the other key variables differed significantly when we compared the Raven's SPM subsample with the full sample. This pattern of results suggests that the two subsamples are largely similar and that analyses based on the subsample of examinees who completed the SPM can be legitimately generalized to the full sample. These analyses represented our first assessment of similarities and differences between the sample that completed Raven's SPM and the total sample. Next, we reran the latent regressions in the full N = 671 sample and applied multiple imputation in Mplus 7.1 for missing data on categorical and continuous variables (see Model 11 in Table 4). We imputed the N = 671 sample 40 times, as recommended by Graham (2009). Multiple imputation is the method of choice because it is generally considered to be at least as good as more traditional methods (e.g., listwise deletion) and typically better, often a great deal better (Graham, 2009). Robustness checks comparing the imputed data with the original data without imputation revealed virtually no differences; that is, the pattern of results with imputation was similar to the results obtained without imputation.
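The pooling step behind multiple imputation can be illustrated compactly. The sketch below implements Rubin's rules for combining estimates across imputed data sets, the same logic SEM software such as Mplus applies internally when analyzing multiply imputed data; all numbers are made up for illustration, not taken from the study.

```python
import statistics

def rubin_pool(estimates, variances):
    """Combine m imputation-specific estimates and their sampling variances
    via Rubin's rules: pooled estimate = mean of the estimates; total
    variance = within-imputation variance + (1 + 1/m) * between-imputation
    variance."""
    m = len(estimates)
    q_bar = sum(estimates) / m          # pooled point estimate
    u_bar = sum(variances) / m          # within-imputation variance
    b = statistics.variance(estimates)  # between-imputation variance (ddof=1)
    total_var = u_bar + (1 + 1 / m) * b
    return q_bar, total_var

# Hypothetical path coefficients and squared SEs from m = 5 imputed data sets.
betas = [0.24, 0.26, 0.25, 0.27, 0.23]
ses_sq = [0.004, 0.005, 0.004, 0.005, 0.004]
beta_pooled, var_pooled = rubin_pool(betas, ses_sq)
```

If the imputation-specific estimates barely vary (small between-imputation variance), the pooled results closely match a complete-data analysis, which is the pattern the robustness checks above describe.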
Hierarchical Regressions Show Similar Results

The residualized SEM methods used here have not been widely used in similar studies. To facilitate the interpretation of our data and comparisons with findings from other similar studies, we therefore also tested the predictions in H1 and H2 using hierarchical regression on a manifest level. That is, we tested whether our results from the SEM approach (i.e., with and without multiple imputation to handle missingness) held when more conventional statistical approaches (i.e., manifest hierarchical regressions) were used. In these analyses, we used pairwise deletion, retaining all cases in which pairs of predictor and criterion variables were available. First, the three criterion variables (complexity, level, salary) were regressed on GMA and level of education (Step 1) before CPS was entered (Step 2). Tests of the incremental contribution of CPS therefore first controlled for GMA and education.

Similar to our latent regression model reported above, CPS significantly explained incremental variance in salary (β = 0.25), t(252) = 2.90, p = .004, n = 256, and job complexity (β = 0.30), t(213) = 3.46, p = .001, n = 217, with a significant ΔR²salary = .03, ΔF(1, 252) = 8.43, p = .004, and ΔR²complexity = .05, ΔF(1, 225) = 42.64, p < .001. The incremental contribution of CPS over GMA and education was small but significant for the job level variable (β = −0.12), t(352) = 2.93, p = .004, n = 356, with ΔR²job level = .01, ΔF(1, 352) = 8.62, p = .004. In sum, we found strong convergence between the results of the latent variable regressions with and without imputation and the results of hierarchical observed-variable regressions.
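The ΔF test reported for these hierarchical regressions is the standard F test for the R² increment between nested OLS models. A minimal sketch with illustrative numbers (not the study's values):

```python
def delta_f(r2_reduced, r2_full, n, k_added, p_full):
    """F statistic for the R^2 increment when k_added predictors are entered
    into a model with p_full predictors in total, fit to n cases. The test
    is distributed as F(k_added, n - p_full - 1) under the null."""
    numerator = (r2_full - r2_reduced) / k_added
    denominator = (1 - r2_full) / (n - p_full - 1)
    return numerator / denominator

# Illustrative: Step 1 (GMA + education) yields R^2 = .10; adding CPS in
# Step 2 raises it to .13 with n = 256 cases and 3 predictors in the full
# model, so the increment is tested as F(1, 252).
f_change = delta_f(0.10, 0.13, n=256, k_added=1, p_full=3)
```

For a single added predictor, this F equals the square of the t statistic for that predictor's coefficient in the Step 2 model.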
Discussion

It has been suggested (e.g., OECD, 2013a, b) that CPS skills are important in the workplace. In this study, we used rigorous and conservative tests to evaluate the incremental contribution of CPS skills in distinguishing among individuals who are currently employed in jobs that differ in complexity, level, and salary. We combined the best available measures of CPS, modeled on the highly successful PISA assessments, with a well-validated measure of fluid reasoning (i.e., Raven's SPM) to demonstrate that CPS skills do indeed make a contribution, over and above GMA, in predicting these criteria (see Greiff, Fischer, et al., 2013a; Schweizer et al., 2013; Stadler, Becker, Greiff, & Spinath, 2015; Wüstenberg et al., 2012, for illustrations of similar methods of controlling for GMA in educational settings). Our results suggest that CPS skills play a distinct and incremental role in occupational gravitation, despite the high correlation between CPS and GMA. Individuals with stronger CPS skills are more likely to occupy and to succeed in complex jobs, independent of the effects of GMA or education. Going beyond previous studies, such as Ederer et al. (2015), we used SEM to construct a latent CPS residual score that is fully independent of GMA and level of education and showed that CPS makes a unique and relatively important contribution in predicting salary and job complexity. Our pattern of results is consistent with the assumption that CPS represents unique aspects of complex cognition in the form of higher order thinking skills that are needed to deal with dynamic, nonroutine, and interactive tasks that are (a) characteristic of modern jobs and (b) not fully captured by strong predictors such as GMA and educational level (Funke, 2010).
A number of studies have shown that GMA can be used to sort individuals into jobs that are more or less cognitively demanding (Desmarais & Sackett, 1993; Gottfredson, 1986, 2003; Jensen, 1980; Wilk et al., 1995). Our results suggest a comparable role for CPS skills; individuals with higher CPS skills are more likely to be found in and to
be successful in complex jobs. Taken together, our results provide empirical support for the recommendations of the OECD and the National Research Council that investing in the development of CPS skills is likely to have payoffs in the workplace. In contrast to GMA, a well-established predictor of occupational gravitation, career success, and job performance, CPS skills can be meaningfully enhanced with training and practice, and the development of these skills may open more doors for students who are entering job markets in which an increasing proportion of jobs require them to solve complex and unstructured problems.

Understanding the Occupational Sorting Process

Understanding how specific skills such as CPS contribute to the process of sorting people into occupations is likely to present many challenges. It is likely that employers have at least some information about GMA when they make decisions about job applicants; proxies such as educational level and attainment are often considered in making selection decisions. It is less likely that employers will routinely have reliable information about specific skills such as CPS when selecting applicants, thus implying that processes other than selection might play an important role in sorting people into occupations that are consistent with their CPS skills, independent of their GMA levels. Schneider's (1987; Schneider, Goldstein, & Smith, 1995) attraction-selection-attrition framework provides some useful ideas for structuring research on the process by which people are sorted into occupations that match their abilities or skills. First, we must understand how people become attracted to jobs that are consistent with their CPS skills. Can people realistically evaluate their own CPS skills and the CPS skills needed to thrive in different occupations?
When people make their initial choices about what occupations to pursue (including choices about taking on the educational requirements for different types of careers), are their preferences and choices influenced by whether their CPS skills meet the CPS requirements of the job or occupation? Do their ideas about appropriate occupations change as their CPS skills develop? Second, we do not know what role, if any, CPS skills play in most selection decisions. It is likely that some assessments provide information about CPS skills, but the reliability, validity, and perceived value of that information are not well established. Finally, we need to understand the potential role of CPS skills in attrition. For example, it seems likely that if a job or occupation requires CPS skills that are substantially more advanced than the skills that incumbents possess (or are likely to be able to develop), the likelihood of voluntary or even involuntary attrition will increase. At the beginning of this paper, we noted that we would use the term occupational sorting to refer to a process that might unfold over time and that might involve a mixture of individual preferences, situational constraints, opportunities, and
sheer happenstance. The data analyzed here suggest that the complexity of the occupations individuals end up in and their relative success (measured by salary, adjusting for level) in these occupations can be predicted on the basis of the part of CPS that is independent of GMA and education. To gain some insight into the possible roles of initial selection into a job or occupation versus the cumulative effects of changes that occur over time in the labor market, we calculated the latent partial correlation between CPS and job complexity, controlling for GMA. Approximately 20% of the sample was 21 years old or younger and was therefore unlikely to have extensive experience in the labor market, and for these participants, there was virtually no correlation between CPS and job complexity when GMA was controlled for (r = − .03). By contrast, for individuals over age 30, who are likely to have more opportunities to change jobs (or more likelihood of being pushed out of jobs that do not suit them), there was a substantial correlation between CPS and complexity, when GMA was controlled for (r = .52). These results suggest that temporal processes may indeed be more important determinants of occupational sorting than initial selection and placement, but these results must be considered both crude and preliminary. We encourage researchers to examine in more detail how this occupational sorting process unfolds over time in future studies.
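The partial-correlation logic behind this age-split analysis can be sketched on a manifest level (the study itself used latent variables): residualize both variables on the control variable and correlate the residuals. The data below are synthetic and illustrative only.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 2000

def partial_corr(x, y, control):
    """Correlation between x and y after removing the linear effect of the
    control variable from each (a manifest analogue of the latent partial
    correlations reported in the study)."""
    def resid(v):
        X = np.column_stack([np.ones(n), control])
        return v - X @ np.linalg.lstsq(X, v, rcond=None)[0]
    return np.corrcoef(resid(x), resid(y))[0, 1]

# Synthetic example: job complexity depends partly on a CPS-specific
# component, so the CPS-complexity link survives controlling for GMA.
gma = rng.standard_normal(n)
cps_specific = rng.standard_normal(n)
cps = 0.8 * gma + 0.6 * cps_specific
complexity = 0.3 * gma + 0.5 * cps_specific + rng.standard_normal(n)

r_partial = partial_corr(cps, complexity, gma)
```

Computing this statistic separately within age subgroups, as done above, then shows whether the GMA-independent CPS-complexity link differs across career stages.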
Strengths and Limitations

This study is the first to link skills in CPS to multiple indicators of occupational success, while unambiguously controlling for both GMA and education in a large and diverse multinational sample of employees that spanned a range of jobs and occupations. Our statistical approach allowed us to directly evaluate the contribution of the part of CPS that is strictly independent of GMA and educational level to a number of occupational criteria not examined in previous studies (Ederer et al., 2015). However, there are limitations to the conclusions that we have drawn from the present study. First, with a latent correlation of r = .81 between CPS and GMA, readers might question whether these constructs are separable. The large overlap between CPS and GMA measures observed here is consistent with previous research that demonstrated strong g-loadings for CPS tests (Kretzschmar, Neubert, Wüstenberg, & Greiff, 2016; Stadler, Becker, Gödker, et al., 2015), but our findings are also consistent with the conclusions reached in several previous studies that GMA and CPS are both conceptually and empirically distinct (Wüstenberg et al., 2012; Sonnleitner et al., 2013). In particular, despite the substantial correlation between our latent GMA and CPS measures, we were still able to demonstrate significant increases in predictive power when a residual CPS score that was corrected for shared variance with both GMA and education was added to the regression models.
Second, we were not able to collect job performance data. Future studies on the role of CPS in gravitation will benefit from the inclusion of measures of job performance and effectiveness. Third, the effect sizes for the residual CPS scores we found were relatively small, with increments in R² values of 3% for salary and 7% for job complexity. Of course, it is notable to find any incremental contribution beyond GMA and educational level at all (e.g., Ree et al., 2015; Ree et al., 1994), given that (a) GMA and level of education are among the most powerful single predictors of occupational outcomes (e.g., Ng et al., 2005; Schmidt & Hunter, 1998) and (b) measures of GMA usually subsume measures of other specific cognitive abilities (e.g., Ree et al., 2015). Nevertheless, although the incremental contribution of CPS was statistically significant, it was not necessarily large. Fourth, the data presented here clearly did not allow us to test the temporal component of the gravitation hypothesis (i.e., that as people change jobs, they tend to move toward jobs that fit their abilities) because a cross-sectional research design does not allow for temporal predictions or conclusions about causality. Our results are consistent with but cannot be used to confirm theories that assign a causal role to CPS in the tendency to gravitate toward specific types of jobs (e.g., Murphy, 1989). For example, there is evidence that complex jobs stimulate and facilitate the development of cognitive skills into old age (Schooler, Mulatu, & Oates, 1999; Smart et al., 2014), and it is possible that causation runs in the direction opposite to that suggested by a gravitation model (i.e., being in a complex job leads to higher CPS skills). Given that we showed that people tend to end up sorted into jobs in a way that can be predicted by CPS levels, it may be fruitful to collect longitudinal data that will allow researchers to track movement in the job market over time (cf.
Wilk et al., 1995) in relation to CPS skill levels and to more fully test causal and reciprocal relations between CPS and the likelihood of occupying well-paying, complex jobs higher up in the occupational hierarchy. Fifth, the application of SEM to cross-sectional data is also based on the assumption of linear relations between variables, a presumption that might not hold true in a working world that has been repeatedly criticized for paying disproportionately high wages to managers (thus establishing nonlinear relations between job characteristics and salary; Neal & Rosen, 2000). We excluded extraordinary wages in the current study, but it is possible that including a larger number of individuals at the highest job levels would have revealed nonlinear as well as linear trends. Sixth, examinees might not put their best efforts into some tests (e.g., Raven's) if the stakes are relatively small. In this study, examinees' jobs and their opportunities for promotion did not depend on doing well on particular tests, and it is plausible that participants did not invest much effort into performing well on them. The CPS tests are unique and
engaging, and examinees may put more effort into and pay more attention to these tests than to Raven's SPM, a test that some examinees find difficult and frustrating. We note, however, that scores on Raven's SPM were highly reliable and highly correlated with CPS scores, something that we would not expect if examinees were putting minimal effort into the test. Seventh, we were not able to draw a randomized sample but instead had to rely on companies' willingness to participate in a time-consuming research project, a circumstance that resulted in a large but selective convenience sample. Finally, the amount of missing data was very high for the key variables Raven's SPM and job complexity, owing to practical constraints at the company level (see the "Sample and Procedure" section of the "Method" section). We took several steps to account for missing data. We ran analyses that included Raven's SPM with a subset of examinees (n = 394) who completed this measure. Next, we evaluated the potential effects of missing data with subset comparisons on the key variables, multiple imputations, robustness checks of imputed data, and a reanalysis in which we applied hierarchical regression on a manifest level using all available data, and we found essentially no effects of missing data. Nevertheless, it is possible that our results may have been different if there had been no missing data.
Conclusions

Within the constraints of cross-sectional and missing data, our study provides initial evidence that CPS is relevant in the workplace and is partly independent of GMA and level of education, two of the best predictors of success in work life (for reviews, see Brand, 1987; Gottfredson, 2002; Ng & Feldman, 2010). The unique links between CPS and success shown in the conservative latent residual approach presented here (Wüstenberg et al., 2012) are an important exception to the frequent claim in I-O psychology that not much more than GMA is needed to make predictions about job performance and career success (Ree et al., 2015). CPS skills represent more than a simple set of strategies learned during formal education. The incremental contributions of CPS in explaining salary and job complexity, despite high correlations with GMA, suggest that the construct of CPS includes unique higher order thinking skills for acquiring and applying new knowledge, that these skills matter in increasingly complex and nonroutine jobs, and that they have not previously been captured in I-O psychology (Neubert et al., 2015). Thus, CPS may contribute to a more comprehensive picture of complex cognition in the workplace. If technology and organizational change continue to race ahead, the increasing complexity of jobs is likely to further enhance the role of CPS skills in the workplace, possibly creating an
increasingly important set of employment opportunities for individuals who manage to develop these skills.

Funding This research was funded by a grant from the Fonds National de la Recherche Luxembourg (ATTRACT "ASK21") and the European Union (290683; LLLight'in'Europe). We gratefully acknowledge the assistance of Silvia Castellazzi, André Kretzschmar, Jonas Neubert, and Alexander Patt, who aided in collecting the data reported here.
Disclaimer Samuel Greiff is one of two authors of the commercially available COMPRO test, which is based on the multiple complex systems approach and employs the same assessment principle as MicroDYN, and he receives royalty fees for COMPRO. The COMPRO test was not used in this study, but its similarities to MicroDYN are substantial. A free version of MicroDYN is available for any research and educational purpose.
References Ackerman, P. L., Beier, M. E., & Boyle, M. O. (2005). Working memory and intelligence: The same or different constructs? Psychological Bulletin, 131, 30–60. https://doi.org/10.1037/0033-2909.131.1.30. Arthur, W., & Day, D. V. (1994). Development of a short form for the raven progressive matrices test. Educational and Psychological Measurement, 54, 394–403. Autor, D. H., Levy, F., & Murnane, R. J. (2003). The skill content of recent technological change: An empirical exploration. The Quarterly Journal of Economics, 118, 1279–1333. https://doi.org/ 10.1162/003355303322552801. Autor, D. H., Katz, L. F., & Kearney, M. S. (2006). The polarization of the U.S. labor market. American Economic Review, 96, 189–194. https://doi.org/10.1257/000282806777212620. Bakken, B. E. (1993). Learning and transfer of understanding in dynamic decision environments. Cambridge: Massachusetts Institute of Technology. Barnett, W. S. (1995). Long-term effects of early childhood programs on cognitive and school outcomes. The Future of Children, 5, 25–50. Becker, S. O., Ekholm, K., & Muendler, M.-A. (2013). Offshoring and the onshore composition of tasks and skills. Journal of International Economics, 90, 91-106. https://doi.org/10.1016/j.jinteco.2012.10. 005. Binkley, M., Erstad, O., Herman, J., Raizen, S., Ripley, M., Miller-Ricci, M., & Rumble, M. (2012). Defining 21st century skills. In P. Griffin, B. McGaw, & E. Care (Eds.), Assessment and teaching of 21st century skills (pp. 17–66). Dordrecht: Springer. Retrieved from http://www.springerlink.com. Bollen, K. A. (1989). Structural equations with latent variables. New York: Wiley. Brand, C. (1987). The importance of general intelligence. In S. Modgil & C. Brand (Eds.), Arthur Jensen: Consensus and controversy (pp. 251–265). New York, NY: Falmer Bretz, R. D., & Judge, T. A. (1994). Person-organization fit and the theory of work adjustment: Implications for satisfaction, tenure, and career success. 
Journal of Vocational Behavior, 44, 32–54. https://doi.org/ 10.1006/jvbe.1994.1003. Cadle, Adrienne W.(2012), The relationship between rating scales used to evaluate tasks from task inventories for licensure and certification examinations. Graduate theses and dissertations. http:// scholarcommons.usf.edu/etd/4296 Carpenter, P. A., Just, M. A., & Shell, P. (1990). What one intelligence test measures: A theoretical account of the processing in the Raven Progressive Matrices test. Psychological Review, 97, 404–431.
Carroll, J. B. (1993). Human cognitive abilities: A survey of factoranalytic studies. New York: Cambridge University Press. Cascio, W. F. (1995). Whether industrial and organizational psychology in a changing world of work? American Psychologist, 50, 928–939. https://doi.org/10.1037/0003-066X.50.11.928. Conte, J. M., Dean, M. A., Ringenbach, K. L., Moran, S. K., & Landy, F. J. (2005). The relationship between work attitudes and job analysis ratings: Do rating scale type and task discretion matter? Human Performance, 18, 1–21. Converse, P. D., Piccone, K. A., & Tocci, M. C. (2014). Childhood selfcontrol, adolescent behavior, and career success. Personality and Individual Differences, 59, 65–70. https://doi.org/10.1016/j.paid. 2013.11.007. Danner, D., Hagemann, D., Schankin, A., Hager, M., & Funke, J. (2011). Beyond IQ. A latent state trait analysis of general intelligence, dynamic decision making, and implicit learning. Intelligence, 39, 323– 334. https://doi.org/10.1016/j.intell.2011.06.004. Desmarais, L. B., & Sackett, P. R. (1993). Investigating a cognitive complexity hierarchy of jobs. Journal of Vocational Behavior, 43, 279– 297. https://doi.org/10.1006/jvbe.1993.1048. Ederer, P., Nedelkoska, L., Patt, A., & Castellazzi, S. (2015). What do employers pay for employees’ complex problem solving skills? International Journal of Lifelong Education, 34, 430–447. https:// doi.org/10.1080/02601370.2015.1060026. Frensch, P. A., & Funke, J. (1995). Definitions, traditions, and a general framework for understanding complex problem solving. In P. A. Frensch & J. Funke (Eds.), Complex problem solving: The European perspective (p. 14). Hillsdale, NJ: Erlbaum. Frey, M. C., & Detterman, D. K. (2004). Scholastic assessment org? Psychological Science, 15, 373–378. Funke, J. (2001). Dynamic systems as tools for analysing human judgement. Thinking & Reasoning, 7, 69–89. https://doi.org/10.1080/ 13546780042000046. Funke, J. (2010). 
Complex problem solving: A case for complex cognition? Cognitive Processing, 11, 133–142. https://doi.org/10.1007/ s10339-009-0345-0. Gignac, G. E. (2015). Raven’s is not a pure measure of general intelligence: Implications for g theory and the brief measurement of g. Intelligence, 52, 72–79. Gonzalez, C., Vanyukov, P., & Martin, M. K. (2005). The use of microworlds to study dynamic decision making. Computers in Human Behavior, 21, 273–286. https://doi.org/10.1016/j.chb.2004.02.014. Goos, M., Manning, A., & Salomons, A. (2009). Job polarization in Europe. The American Economic Review, 99, 58–63. https://doi. org/10.1257/aer.99.2.58. Gottfredson, L. S. (1986). Occupational aptitude patterns map: Development and implications of a theory of job aptitude requirements. Journal of Vocational Behavior, 29, 254–291. https://doi.org/ 10.1016/0001-8791(86)90008-4. Gottfredson, L. S. (2002). Where and why g matters: Not a mystery. Human Performance, 15(1–2), 25–46. https://doi.org/10.1080/ 08959285.2002.9668082. Gottfredson, L. S. (2003). g, jobs and life. In H. Nyborg (Ed.), The scientific study of general intelligence: Tribute to Arthur R. Jensen (pp. 293-342). Amsterdam: Pergamon. Greiff, S., Wüstenberg, S., & Funke, J. (2012). Dynamic problem solving: A new assessment perspective. Applied Psychological M e a s u re m e n t , 3 6 , 1 8 9 – 2 1 3 . h t t p s : / / d o i . o rg / 1 0 . 11 7 7 / 0146621612439620. Greiff, S., Fischer, A., Wüstenberg, S., Sonnleitner, P., Brunner, M., & Martin, R. (2013a). A multitrait-multimethod study of assessment instruments for complex problem solving. Intelligence, 41, 579– 596. https://doi.org/10.1016/j.intell.2013.07.012. Greiff, S., Holt, D. V., & Funke, J. (2013b). Perspectives on problem solving in educational assessment: Analytical, interactive, and
Author's personal copy J Bus Psychol collaborative problem solving. Journal of Problem Solving, 5, 71– 91. https://doi.org/10.7771/1932-6246.1153. Gustafsson, J.- E. (1988). Hierarchical models of individual differences in cognitive abilities. In R. J. Sternberg (Ed.), Advances in the psychology of human intelligence (Vol. 4, pp. 35–71). Hillsdale, NJ: Erlbaum. Gustafsson, J. E. (2002). Measurement from a hierarchical point of view. In H. L. Braun, D. G. Jackson, & D. E. Wiley (Eds.), The role of constructs in psychological and educational measurement (pp. 73– 95). Mahwah, NJ: Erlbaum. Guthrie, P., Dumay, A., Massingham, P., & Tam, L. (2015). The relationship between human capital, value creation and employee reward. Journal of Intellectual Capital, 16, 390–418. https://doi.org/10. 1108/JIC-06-2014-0075. Haier, R. J. (2014). Increased intelligence is a myth (so far). Frontiers in Systems Neuroscience, 8, 34. Hampel, F. R. (1974). The influence curve and its role in robust estimation. Journal of the American Statistical Association, 69, 383–393. https://doi.org/10.1080/01621459.1974.10482962. Hertzog, C., & Schaie, K. W. (1988). Stability and change in adult intelligence: 2. Simultaneous analysis of longitudinal means and covariance structures. Psychology and Aging, 3, 122–130. Hoffman, B. (2016). The Changing Nature of Work: Evidence and Implications. Symposium Presentation at The 31st Annual Society of Industrial and Organisational Psychology (SIOP) Conference in Anaheim, CA, US, 14.04. - 16.04.2016. Hu, L. T., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling, 6, 1–55. https://doi.org/ 10.1080/10705519909540118. International Labour Office. (2012). International Standard Classification of Occupations: ISCO-08. Geneva: International Labour Office. Jaskolka, G., Beyer, J. M., & Trice, H. M. (1985). Measuring and predicting managerial success. 
Journal of Vocational Behavior, 26, 189–205. https://doi.org/10.1016/0001-8791(85)90018-1. Jensen, A. (1980). Bias in mental testing. New York: Free Press. Jensen, A. R. (1998). The g factor and the design of education. In R. J. Sternberg & W. M. Williams (Eds.), Intelligence, instruction, and assessment. Theory into practice (pp. 111–131). Mahwah, NJ: Lawrence Erlbaum Associates. Jensen, E. (2005). Learning and transfer from a simple dynamic system. Scandinavian Journal of Psychology, 46, 119–131. https://doi.org/ 10.1111/j.1467-9450.2005.00442.x. Judge, T. A., & Kammeyer-Mueller, J. D. (2012). General and specific measures in organizational behavior research: Considerations, examples, and recommendations for researchers. Journal of Organizational Behavior, 33, 161–174. Judge, T. A., Klinger, R. L., & Simon, L. S. (2010). Time is on my side: Time, general mental ability, human capital, and extrinsic career success. Journal of Applied Psychology, 95, 92–107. https://doi. org/10.1037/a0017594. Kell, H. J., Lubinski, D., & Benbow, C. P. (2013). Who rises to the top? Early indicators. Psychological Science, 24, 648–659. Kretzschmar, A., & Süß, H.-M. (2015). A study on the training of complex problem solving competence. Journal of Dynamic Decision Making, 1, 1–16. https://doi.org/10.11588/jddm.2015.1.15455. Kretzschmar, A., Neubert, J. C., Wüstenberg, S., & Greiff, S. (2016). Construct validity of complex problem solving: A comprehensive view on different facets of intelligence and school grades. Intelligence, 54, 55–69. https://doi.org/10.1016/j.intell.2015.11.004. Kvist, A. V., & Gustafsson, J.- E. (2008). The relation between fluid intelligence and the general factor as a function of cultural background: A test of Cattell’s investment theory. Intelligence, 36, 422–436. https://doi.org/10.1016/j.intell.2007.08.004.
Leys, C., Ley, C., Klein, O., Bernard, P., & Licata, L. (2013). Detecting outliers: Do not use standard deviation around the mean, use absolute deviation around the median. Journal of Experimental Social Psychology, 49, 764–766. https://doi.org/10.1016/j.jesp.2013.03.013.
Lievens, F., & Patterson, F. (2011). The validity and incremental validity of knowledge tests, low-fidelity simulations, and high-fidelity simulations for predicting job performance in advanced-level high-stakes selection. Journal of Applied Psychology, 96, 927–940. https://doi.org/10.1037/a0023496.
Lievens, F., & Reeve, C. L. (2012). Where I-O psychology should really (re)start its investigation of intelligence constructs and their measurement. Industrial and Organizational Psychology: Perspectives on Science and Practice, 5, 153–158.
Little, T. D., Cunningham, W. A., Shahar, G., & Widaman, K. F. (2002). To parcel or not to parcel: Exploring the question, weighing the merits. Structural Equation Modeling, 9, 151–173. https://doi.org/10.1207/S15328007SEM0902_1.
LLLight in Europe Project. (2015a). Complex problem solving: A promising candidate for facilitating the acquisition of job skills. Retrieved from http://www.lllightineurope.com/policy-briefs.
LLLight in Europe Project. (2015b). Enterprises are greatly important for lifelong learning activities. Retrieved from http://www.lllightineurope.com/policy-briefs.
LLLight in Europe Project. (2015c). Synthesis report. Retrieved from http://www.lllightineurope.com/reports/
Lubinski, D., Benbow, C. P., & Kell, H. J. (2014). Life paths and accomplishments of mathematically precocious males and females four decades later. Psychological Science, 25, 2217–2232.
Makel, M. C., Kell, H. J., Lubinski, D., Putallaz, M., & Benbow, C. P. (2016). When lightning strikes twice: Profoundly gifted, profoundly accomplished. Psychological Science, 27, 1004–1018.
McCormick, E. J., Jeanneret, P. R., & Mecham, R. C. (1972). A study of job characteristics and job dimensions as based on the Position Analysis Questionnaire (PAQ). Journal of Applied Psychology, 56(4), 347–368. https://doi.org/10.1037/h0033099.
McCormick, E. J., DeNisi, A. S., & Shaw, J. B. (1979). Use of the Position Analysis Questionnaire for establishing the job component validity of tests. Journal of Applied Psychology, 64, 51–56. https://doi.org/10.1037/0021-9010.64.1.51.
McGrew, K. S. (2005). The Cattell-Horn-Carroll theory of cognitive abilities: Past, present, and future. In D. P. Flanagan & P. L. Harrison (Eds.), Contemporary intellectual assessment: Theories, tests, and issues (2nd ed., pp. 136–181). New York, NY: Guilford Press.
McGrew, K. S. (2009). CHC theory and the human cognitive abilities project: Standing on the shoulders of the giants of psychometric intelligence research. Intelligence, 37, 1–10.
Middleton, H. (2002). Complex problem solving in a workplace setting. International Journal of Educational Research, 37, 67–84. https://doi.org/10.1016/S0883-0355(02)00022-8.
Milkovich, G. T., Newman, J. M., & Gerhart, B. (2013). Compensation (11th ed.). New York: McGraw-Hill.
Molnár, G., Greiff, S., & Csapó, B. (2013). Inductive reasoning, domain specific and complex problem solving: Relations and development. Thinking Skills and Creativity, 9, 35–45. https://doi.org/10.1016/j.tsc.2013.03.002.
Murphy, K. R. (1989). Is the relationship between cognitive ability and job performance stable over time? Human Performance, 2, 183–200. https://doi.org/10.1207/s15327043hup0203_3.
Murphy, K. R. (1996). Individual differences and behavior in organizations. San Francisco: Jossey-Bass.
Muthén, L., & Muthén, B. (1998–2014). Mplus 7.1 [computer software]. Los Angeles, CA: Muthén & Muthén.
National Research Council. (2012). Education for life and work: Developing transferable knowledge and skills in the 21st century (J. W. Pellegrino & M. L. Hilton, Eds.). Washington, DC: The National Academies Press.
Neal, D., & Rosen, S. (2000). Theories of the distribution of earnings. In A. B. Atkinson & F. Bourguignon (Eds.), Handbook of income distribution (Vol. 1, pp. 379–427). Amsterdam: Elsevier Science. Retrieved from http://linkinghub.elsevier.com/retrieve/pii/S157400560080010X.
Nedelkoska, L., Patt, A., & Ederer, P. (2015). Learning by problem solving. Retrieved from http://ssrn.com/abstract=2673990
Neubert, J. C., Kretzschmar, A., Wüstenberg, S., & Greiff, S. (2014). Extending the assessment of complex problem solving to finite state automata: Embracing heterogeneity. European Journal of Psychological Assessment, 31, 181–194. https://doi.org/10.1027/1015-5759/a000224.
Neubert, J. C., Mainert, J., Kretzschmar, A., & Greiff, S. (2015). The assessment of 21st century skills in industrial and organizational psychology: Complex and collaborative problem solving. Industrial and Organizational Psychology, 8, 238–268. https://doi.org/10.1017/iop.2015.14.
Ng, T. W. H., & Feldman, D. C. (2010). Human capital and objective indicators of career success: The mediating effects of cognitive ability and conscientiousness. Journal of Occupational and Organizational Psychology, 83, 207–235. https://doi.org/10.1348/096317909X414584.
Ng, T. W. H., Eby, L. T., Sorensen, K. L., & Feldman, D. C. (2005). Predictors of objective and subjective career success: A meta-analysis. Personnel Psychology, 58, 367–408. https://doi.org/10.1111/j.1744-6570.2005.00515.x.
OECD. (2012). Better skills, better jobs, better lives: A strategic approach to skills policies. Paris: OECD Publishing.
OECD. (2013a). OECD skills outlook 2013: First results from the survey of adult skills. Paris: OECD Publishing.
OECD. (2013b). PISA 2012 assessment and analytical framework. Paris: OECD Publishing.
OECD. (2014). PISA 2012 results: Creative problem solving. Paris: OECD Publishing.
Ohlott, P. J. (2004). Job assignments. In C. D. McCauley & E. Van Velsor (Eds.), The Center for Creative Leadership handbook of leadership development (2nd ed., pp. 151–182). San Francisco: Wiley.
Osman, M. (2010). Controlling uncertainty: A review of human behavior in complex dynamic environments. Psychological Bulletin, 136, 65–86. https://doi.org/10.1037/a0017815.
Raven, J. (2000). Psychometrics, cognitive ability, and occupational performance. Review of Psychology, 7, 51–74.
Raven, J., Raven, J. C., & Court, J. H. (1998). Manual for Raven's progressive matrices and vocabulary scales: Section 4. The standard progressive matrices. Oxford: Oxford Psychologists Press.
Ree, M. J., Earles, J. A., & Teachout, M. S. (1994). Predicting job performance: Not much more than g. Journal of Applied Psychology, 79, 518–524. https://doi.org/10.1037/0021-9010.79.4.518.
Ree, M. J., Carretta, T. R., & Teachout, M. S. (2015). Pervasiveness of dominant general factors in organizational measurement. Industrial and Organizational Psychology: Perspectives on Science and Practice, 8, 409–427. https://doi.org/10.1017/iop.2015.16.
Reeve, C. L., Scherbaum, C., & Goldstein, H. (2015). Manifestations of intelligence: Expanding the measurement space to reconsider specific cognitive abilities. Human Resource Management Review, 25, 28–37.
Rhemtulla, M., Brosseau-Liard, P. É., & Savalei, V. (2012). When can categorical variables be treated as continuous? A comparison of robust continuous and categorical SEM estimation methods under suboptimal conditions. Psychological Methods, 17, 354–373. https://doi.org/10.1037/a0029315.
Rocha, M. (2012). Transferable skills representations in a Portuguese college sample: Gender, age, adaptability and vocational development. European Journal of Psychology of Education, 27, 77–90. https://doi.org/10.1007/s10212-011-0067-4.
Rohrbach-Schmidt, D., & Hall, A. (2013). BIBB/BAuA employment survey 2012 (BIBB-FDZ data and methodological reports, No. 1/2013). Bonn, Germany: Federal Institute for Vocational Education and Training.
Rönnlund, M., Sundström, A., & Nilsson, L. (2015). Interindividual differences in general cognitive ability from age 18 to age 65 years are extremely stable and strongly associated with working memory capacity. Intelligence, 53, 59–64.
Sala, G., & Gobet, F. (2017). Does far transfer exist? Negative evidence from chess, music and working memory training. Current Directions in Psychological Science, 26, 515–520.
Salgado, J., Anderson, N., Moscoso, S., Bertua, C., de Fruyt, F., & Rolland, J. P. (2003). A meta-analytic study of general mental ability validity for different occupations in the European community. Journal of Applied Psychology, 88, 1068–1081. https://doi.org/10.1037/0021-9010.88.6.1068.
Scherbaum, C. A., Goldstein, H. W., Yusko, K. P., Ryan, R., & Hanges, P. J. (2012). Intelligence 2.0: Reestablishing a research program on g in I-O psychology. Industrial and Organizational Psychology, 5, 128–148. https://doi.org/10.1111/j.1754-9434.2012.01419.x.
Schmidt, F. L., & Hunter, J. E. (1998). The validity and utility of selection methods in personnel psychology: Practical and theoretical implications of 85 years of research findings. Psychological Bulletin, 124, 262–274. https://doi.org/10.1037/0033-2909.124.2.262.
Schmidt, F. L., & Hunter, J. (2004). General mental ability in the world of work: Occupational attainment and job performance. Journal of Personality and Social Psychology, 86, 162–173. https://doi.org/10.1037/0022-3514.86.1.162.
Schneider, B. (1987). The people make the place. Personnel Psychology, 40, 437–454.
Schneider, W. J., & Newman, D. A. (2015). Intelligence is multidimensional: Theoretical review and implications of specific cognitive abilities. Human Resource Management Review, 25, 12–27.
Schneider, B., Goldstein, H. W., & Smith, D. B. (1995). The ASA framework: An update. Personnel Psychology, 48, 747–773.
Schooler, C., Mulatu, M. S., & Oates, G. (1999). The continuing effects of substantively complex work on the intellectual functioning of older workers. Psychology and Aging, 14, 483–506. https://doi.org/10.1037/0882-7974.14.3.483.
Schweizer, F., Wüstenberg, S., & Greiff, S. (2013). Validity of the MicroDYN approach: Complex problem solving predicts school grades beyond working memory capacity. Learning and Individual Differences, 24, 42–52. https://doi.org/10.1016/j.lindif.2012.12.011.
Sheridan, J. E., Slocum, J. W., & Buda, R. (1997). Factors influencing the probability of employee promotions: A comparative analysis of human capital, organization screening and gender/race discrimination theories. Journal of Business and Psychology, 11, 373–380. https://doi.org/10.1007/BF02195900.
Smart, E. L., Gow, A. J., & Deary, I. J. (2014). Occupational complexity and lifetime cognitive abilities. Neurology, 83, 2285–2291. https://doi.org/10.1212/WNL.0000000000001075.
Sonnleitner, P., Keller, U., Martin, R., & Brunner, M. (2013). Students' complex problem-solving abilities: Their structure and relations to reasoning ability and educational success. Intelligence, 41, 289–305. https://doi.org/10.1016/j.intell.2013.05.002.
Sorjonen, K., Hemmingsson, T., Deary, I. J., & Melin, B. (2015). Mediation of the gravitational influence of intelligence on socio-economic outcomes. Intelligence, 53, 8–15. https://doi.org/10.1016/j.intell.2015.08.006.
Stadler, M., Becker, N., Gödker, M., Leutner, D., & Greiff, S. (2015). Complex problem solving and intelligence: A meta-analysis. Intelligence, 53, 92–101. https://doi.org/10.1016/j.intell.2015.09.005.
Stadler, M. J., Becker, N., Greiff, S., & Spinath, F. M. (2015). The complex route to success: Complex problem-solving skills in the prediction of university success. Higher Education Research & Development, 35, 1–15. https://doi.org/10.1080/07294360.2015.1087387.
Tharenou, P., Latimer, S., & Conroy, D. (1994). How do you make it to the top? An examination of influences on women's and men's managerial advancement. Academy of Management Journal, 37, 899–931. https://doi.org/10.2307/256604.
Thornton III, G. C., & Rupp, D. (2006). Assessment centers in human resource management: Strategies for prediction, diagnosis, and development. Mahwah, NJ: Erlbaum.
Tomic, W. (1995). Training in inductive reasoning and problem solving. Contemporary Educational Psychology, 20, 483–490. https://doi.org/10.1006/ceps.1995.1036.
UNESCO Institute for Statistics. (2012). International standard classification of education: ISCED 2011. Montreal, Quebec: UNESCO Institute for Statistics.
Weinert, F. E. (2001). Concept of competence: A conceptual clarification. In D. S. Rychen & L. H. Salganik (Eds.), Defining and selecting key competencies (pp. 45–65). Seattle, WA: Hogrefe.
Wiggins, J. S. (1973). Personality and prediction: Principles of personality assessment. Reading, MA: Addison-Wesley.
Wilk, S. L., Desmarais, L. B., & Sackett, P. R. (1995). Gravitation to jobs commensurate with ability: Longitudinal and cross-sectional tests. Journal of Applied Psychology, 80, 79–85. https://doi.org/10.1037/0021-9010.80.1.79.
Wilk, S. L., & Sackett, P. R. (1996). Longitudinal analysis of ability-job complexity fit and job change. Personnel Psychology, 49, 937–967. https://doi.org/10.1111/j.1744-6570.1996.tb02455.x.
Williams, J. E., & McCord, D. M. (2006). Equivalence of standard and computerized versions of the Raven's progressive matrices test. Computers in Human Behavior, 22, 791–800.
Wonderlic. (2002). Wonderlic personnel test and scholastic level exam: User's manual. Libertyville, IL: Wonderlic, Inc.
World Bank. (2015). PPP conversion factor, GDP (LCU per international $). Retrieved from http://data.worldbank.org/indicator/PA.NUS.PPP
Wüstenberg, S., Greiff, S., & Funke, J. (2012). Complex problem solving—more than reasoning? Intelligence, 40, 1–14. https://doi.org/10.1016/j.intell.2011.11.003.
Zaccaro, S. J., Connelly, S., Repchick, K. M., Daza, A. I., Young, M. C., Kilcullen, R. N., & Bartholomew, L. N. (2015). The influence of higher order cognitive capacities on leader organizational continuance and retention: The mediating role of developmental experiences. The Leadership Quarterly, 26, 342–358. https://doi.org/10.1016/j.leaqua.2015.03.007.
Zinbarg, R. E., Revelle, W., Yovel, I., & Li, W. (2005). Cronbach's α, Revelle's β, and McDonald's ωH: Their relations with each other and two alternative conceptualizations of reliability. Psychometrika, 70, 123–133. https://doi.org/10.1007/s11336-003-0974-7.