Business Outlook August 2015 (Volume 13, Issue 8)

Surveytainment: A Possible Solution to Declining Survey Data Quality

Dr. Michael R. Hyman, NMSU
Ms. Alena Kostyk, NMSU
Mr. Wenkai Zhou, NMSU

Several months ago, two of us were consoling each other about the poor quality of data we had collected on several surveys. In her recent two-stage study, Alena relied on one questionnaire to collect data first from local university students and then from an online labor pool (i.e., Amazon's Mechanical Turk). Her questionnaire contained attention-check items meant to help her identify and then exclude careless responses from all statistical analyses. Generally, attention-check items are closed-ended items (i.e., statements or questions followed by possible responses), embedded in a set of similarly formatted items, for which only one response is reasonable. For example, the statement "You are currently reading this" might appear in the middle of a question set with answers ranging from "Strongly Disagree" to "Strongly Agree". Clearly, the only appropriate answer to such a statement is "Strongly Agree", at least if a respondent is truly attending to the questionnaire items. After analyzing almost 500 responses, Alena concluded more than one-fourth of study participants, both college students and mTurk workers, failed at least two attention-check items.

In the last decade, Mike has worked on many survey-based studies with NMSU doctoral students and colleagues at other universities. After noticing the declining quality of respondent data, based on standard assessment tools, Mike and frequent co-author Jeremy Sierra wrote an article about mischievous respondents. Such respondents are defined as 'survey participants who provide phoney self-reports meant to bypass researcher skepticism'.
The many reasons for their behavior, such as suspicions that research findings will be applied nefariously, distrust of researchers and their work, general hostility toward business, the opportunity to play a prank, and retaliation for previous privacy invasions and unethical survey practices, ensure mischievous respondents comprise a population sufficient to threaten the reliability of any survey (Hyman and Sierra 2012).
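The screening step described above, excluding respondents who fail at least two attention-check items, can be sketched in a few lines of Python. This is our illustration only: the question ids, answers, and sample data below are invented, not the actual study instrument.

```python
# Hypothetical illustration of attention-check screening. Question ids,
# expected answers, and the "at least two failures" cutoff are invented.

# Each attention-check item maps to its only reasonable answer, e.g. the
# statement "You are currently reading this" should draw "Strongly Agree".
CHECKS = {"att1": "Strongly Agree", "att2": "Strongly Agree"}

def failed_checks(response, checks=CHECKS):
    """Count the attention-check items a single response failed."""
    return sum(response.get(item) != expected for item, expected in checks.items())

def screen(responses, checks=CHECKS, max_failures=1):
    """Split responses into (retained, excluded); a respondent is excluded
    once failures exceed max_failures, i.e. at least two failures here."""
    retained = [r for r in responses if failed_checks(r, checks) <= max_failures]
    excluded = [r for r in responses if failed_checks(r, checks) > max_failures]
    return retained, excluded

responses = [
    {"id": 1, "att1": "Strongly Agree",    "att2": "Strongly Agree"},     # attentive
    {"id": 2, "att1": "Strongly Disagree", "att2": "Strongly Agree"},     # one slip
    {"id": 3, "att1": "Strongly Disagree", "att2": "Strongly Disagree"},  # careless
]
retained, excluded = screen(responses)
print([r["id"] for r in retained])  # → [1, 2]
print([r["id"] for r in excluded])  # → [3]
```

In practice the cutoff is a judgment call; a stricter rule (excluding anyone who fails even one check) trades sample size for data quality.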

Other researchers also have questioned survey data quality (e.g., Buhrmester et al. 2011; Goodman et al. 2013; Sears 1986; Peterson 2001). Inattentive and disengaged respondents, of which mischievous respondents are a 'type', pose a costly problem to both academia and industry. Managerial decisions reliant on misleading statistical findings derived from erroneous data can cause irreparable harm to companies. Even if erroneous data can be identified and then excluded from analysis, time and money are wasted collecting and identifying data of subpar quality.

Could the reason for declining survey data quality be that 'widely used data collection methods are outdated'? After all, the most iconic books on survey research methods (e.g., Dillman 1978; Payne 1951) were written in the pre-Internet, pre-social media, and pre-Millennial era. Psychologists and pedagogy researchers suggest the way people process information has changed dramatically in recent decades. The nouveau moniker 'Net Generation' describes current youths, who, based on some estimates, will spend 10,000 hours playing video games, 20,000 hours watching TV, and 10,000 hours using cell phones by the time they reach 21 years of age (Barnes et al. 2007).

This continual digital stimulation compromises people's ability to focus their attention on a single entity. However, the "brain's neural circuitry…ha[s] the capacity to create new pathways to process…hyperactive, attention-splitting data input. As a result, we are developing alternative ways to learn and think. In order to adapt, our brains are learning to access and process information more rapidly and also to shift attention quickly from one task to the next" (Small and Vorgan 2008, p. 64). Multitasking has become integral to modern life. Most adolescents and young adults now report using multiple media simultaneously, e.g., surfing the Internet while playing video games, reading print media, listening to music, and chatting on the phone (Barnes et al. 2007).
Whether an addiction or mere acclimatization, multitasking may be the primary cause of most college students' self-reported boredom in face-to-face classes, as traditional professorial lectures likely are monotonous for students accustomed to handling multiple stimuli (Barnes et al. 2007). Similarly, younger adults with ever-shrinking attention spans may find responding to traditional questionnaires a dull chore (Small and Vorgan 2008). To address this issue, survey researchers could alter their traditional approach to administering lengthier questionnaires. One solution: embed 'entertainment elements' into questionnaires, especially ones administered online. Surveytainment—a moniker

used by several marketing research companies—captures the interactive (e.g., games, puzzles) and non-interactive (e.g., videos, music) elements of questionnaires meant to enhance respondents' enjoyment and engagement. Preliminary studies indicate surveytainment reduces straightlining (i.e., responding identically to consecutive closed-ended questions), improves response quality to open-ended questions (e.g., more words and valid responses), increases response time (and likely the thoughtfulness of responses), enhances survey satisfaction, and boosts questionnaire completion rates (Schmidt et al. 2012). Exavo GmbH, a German marketing research company, claims "Respondents who enjoy the survey will give you more honest and useful responses than people who just trudge through for the incentive. With Exavo [SurveyStudio] your surveys can be fun" (http://www.exavo.com/surveytainment-conduct-surveys-which-are-fun-for-the-respondent.htm). Its surveytainment software includes the following data collection options:

- Chip Game: Dragging and dropping with a cursor or touch screen, respondents allocate a predefined number of tokens to different items. Respondents prefer this graphical variant of the traditional constant sum scale.

- Attribute Assignment Game: Respondents attribute words or 'tags' to items (e.g., concepts, brands). Each word or tag can be assigned to multiple items.

- Scale Ranking: Respondents drag items across a scale depicted along a line. This method yields ordinally and intervally scaled data.

- Tachistoscope: Respondents are exposed to an image (e.g., an ad) for a limited time and then asked what they remember.

- Shelf Test: Simulates a supermarket shopping trip in which respondents choose from stocked items. Respondents can select a limited number of items or spend a limited amount of money.

- Card Drag and Drop: Respondents sort card images into predefined stacks. The numbers of items and stacks are study-dependent. Respondents may reassign items until the sorting task is completed.
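Straightlining, one of the data-quality symptoms surveytainment reportedly reduces, is simple to quantify. A minimal sketch of one common approach (ours, with invented data; it makes no claim to match the measure used by Schmidt et al. 2012): score each respondent by the share of identical consecutive answers in a grid of closed-ended items.

```python
# Illustrative straightlining score: the fraction of consecutive item
# pairs with identical responses. Data and thresholds are invented.

def straightlining_score(answers):
    """Return a score in [0, 1]:
    1.0 = every answer repeats the previous one (pure straightlining),
    0.0 = no two consecutive answers match."""
    if len(answers) < 2:
        return 0.0
    repeats = sum(a == b for a, b in zip(answers, answers[1:]))
    return repeats / (len(answers) - 1)

engaged       = [5, 3, 4, 2, 5, 1]  # varied responses across six items
straightliner = [4, 4, 4, 4, 4, 4]  # identical response to every item

print(straightlining_score(engaged))        # → 0.0
print(straightlining_score(straightliner))  # → 1.0
```

Respondents scoring above some study-specific cutoff could then be flagged for inspection, much like those who fail attention checks.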

For examples, visit http://www.exavo.com/surveytainment-conduct-surveys-which-are-fun-for-the-respondent.htm. For a presentation

of Insight Innovations' gamification surveytainment, which, for example, relies on timed response contests (i.e., quickest to respond wins), instant rewards (modeled after games like Candy Crush), and delaying reward notification until subsequent participation in a different survey, see http://insightinnovation.org/blog/2014/04/23/surveytainment-how-can-you-make-surveys-more-addictive/.

Interactive surveytainment would enhance respondents' personal involvement through game-like stimuli. Newspapers often use similar techniques to increase reader engagement. For example, the popular Word Jumble game serves as a change of pace from static and toneless news pieces. (This game is a dynamic puzzle that requires readers to unscramble letters to form words. The puzzle then designates certain letters within those words to form the answer to a riddle.) Although interactive surveytainment can take various forms, the Word Jumble example illustrates how a simple 'brain workout' could banish boredom from lengthy questionnaires and restore respondents' attention.

Unlike its interactive counterpart, non-interactive surveytainment would focus on creating sensory (e.g., visual, acoustic) gratification via embedded stimuli rather than participative activities. Embedded videos and animations energize viewers and help them retain memories of media content, which explains the increase in digital stimuli in internet marketing. Examples often appear in 'About Us' sections of corporate websites (e.g., http://www.coca-colacompany.com/our-company/infographic-coca-cola-at-a-glance). Similarly, we believe embedding video and audio elements in questionnaires will refresh respondents' attention and increase their enjoyment. Although such elements extend total response time, they could reduce careless responses.
In addition to yielding more trustworthy data, surveytainment could ease recruiting efforts by making study participation more attractive, reduce costs by lowering the remuneration needed to attract a sufficient number of qualified and careful respondents, and increase respondent retention rates for online or offline consumer panels.

In a highly time-constrained world, it seems counterintuitive that the solution to poor-quality survey data may require an increase in total response time. However, consumer researchers have found interruptions can have positive effects in some consumption contexts (Nelson and Meyvis 2008; Nelson et al. 2009). Generally, people's enjoyment of an activity declines as they adapt to it (Frederick and Loewenstein 1999). Interruptions may disrupt adaptation and partially reset the baseline response to an activity (Lyubomirsky et al. 2005; Nelson and Meyvis 2008; Nelson et al. 2009; Shuchter and Zisook 1993). For example,

most people try to avoid viewing ads embedded in commercial television programs, yet watching those commercials enhanced program enjoyment (Nelson et al. 2009).

Realizing the full potential of surveytainment will require extensive empirical study. First, we must identify the entertainment-to-time function relative to data quality. As total response time increases, people eventually will prefer free time to additional surveytainment. What is the shape of this preference function? Although a data plot likely would show an inverted-U shape, discovering the exact shape (e.g., skewness and kurtosis) by context (e.g., psychologically threatening/highly personal versus mundane/impersonal questions; online versus hard-copy survey administration) would be helpful.

Once we establish basic conditions for using surveytainment to boost data quality, we can fine-tune the entertainment elements for maximal efficacy. For example, we could study the efficacy of surveytainment relative to personal (e.g., generational cohort, personality type) and cross-cultural differences among respondents, the use of elements either related or unrelated to the questionnaire topic, the use of non-interactive (e.g., video) versus interactive (e.g., puzzle or quiz) elements, and the number and length of included elements. We hope to report on several empirical surveytainment studies in the coming months.

References

Barnes, Kassandra; Marateo, Raymond C.; Ferris, S. Pixy (2007): Teaching and learning with the net generation. Innovate: Journal of Online Education 3 (4), 1.

Buhrmester, M.; Kwang, T.; Gosling, S. D. (2011): Amazon's Mechanical Turk: A new source of inexpensive, yet high-quality, data? Perspectives on Psychological Science 6 (1), 3–5.

Dillman, Don A. (1978): Mail and telephone surveys: The total design method. New York, NY: Wiley.

Frederick, Shane; Loewenstein, George (1999): Hedonic adaptation. In Kahneman, Diener, and Schwarz (Eds.), Well-being: The foundations of hedonic psychology. New York, NY: Russell Sage, 302–329.

Goodman, Joseph K.; Cryder, Cynthia E.; Cheema, Amar (2013): Data collection in a flat world: The strengths and weaknesses of Mechanical Turk samples. Journal of Behavioral Decision Making 26 (3), 213–224.


Hyman, Michael R.; Sierra, Jeremy J. (2012): Adjusting self-reported attitudinal data for mischievous respondents. International Journal of Market Research 54 (1), 129–145.

Lyubomirsky, Sonja; Sheldon, Kennon M.; Schkade, David (2005): Pursuing happiness: The architecture of sustainable change. Review of General Psychology 9 (2), 111–131.

Nelson, Leif D.; Meyvis, Tom (2008): Interrupted consumption: Disrupting adaptation to hedonic experiences. Journal of Marketing Research 45 (6), 654–664.

Nelson, Leif D.; Meyvis, Tom; Galak, Jeff (2009): Enhancing the television-viewing experience through commercial interruptions. Journal of Consumer Research 36 (2), 160–172.

Payne, Stanley Le Baron (1951): The art of asking questions. Princeton, NJ: Princeton University Press.

Peterson, Robert A. (2000): Constructing effective questionnaires. Thousand Oaks, CA: Sage Publications.

Peterson, Robert A. (2001): On the use of college students in social science research: Insights from a second-order meta-analysis. Journal of Consumer Research 28 (3), 450–461.

Schmidt, Sebastian; Muhle, Anna; Tress, Florian; Winkler, Till (2012): Surveytainment 2.0: Why investing 10 more minutes in constructing your questionnaire is worth considering. In: Kaczmirek, Irmer, Hellwig et al. (Eds.), 14th General Online Research Conference, Baden-Wuerttemberg Cooperative State University Mannheim.

Sears, David O. (1986): College sophomores in the laboratory: Influences of a narrow data base on social psychology's view of human nature. Journal of Personality and Social Psychology 51 (3), 515–530.

Shuchter, Stephen R.; Zisook, Sidney (1993): The course of normal grief. In Stroebe, Stroebe, and Hansson (Eds.), Handbook of bereavement: Theory, research, and intervention. Cambridge, UK: Cambridge University Press, 23–43.

Small, Gary; Vorgan, Gigi (2008): iBrain: Surviving the technological alteration of the modern mind. New York, NY: Harper Collins.


About the Authors

Dr. Michael R. Hyman is Distinguished Achievement Professor and Ph.D. Coordinator of Marketing at NMSU. He is Executive Editor of NMSU Business Outlook and Marketing Ethics Section Editor for Journal of Business Ethics. Attesting to his writing compulsion are more than 80 academic journal articles, 50 conference papers (10 of which won a 'best paper' award), four co-authored/co-edited books, 30 other academic contributions, and 50 non-academic works. He is known for his collection of Looney Tunes shirts, inability to chip a golf ball correctly, encyclopedic knowledge of classic Hollywood movies, overly neat office, and loyalty to the New York Yankees.

Ms. Alena Kostyk is a doctoral student in marketing at NMSU. She graduated with an M.B.A. from Michigan State University, where she received the Broad Warrior award for being one of the top five students in her class. After cold Michigan winters, she decided to move to a much warmer climate. Now she studies consumer behavior and enjoys mountain hiking with her Welsh corgi.

Mr. Wenkai Zhou is a marketing doctoral student at NMSU. He received a B.A. in Business Administration (Marketing) from Eastern Washington University and an M.B.A. from the University of California, Riverside. He leverages his unique background in marketing and intercultural communication in pursuit of his primary research focus: cross-cultural marketing studies between Asian and Western populations. A travelholic and culture lover, he is eager to start a trip around the world soon.

