Using Crowdsourced Samples in Forensic Psychological Research

Melissa A. Baker, Paul Fox, & Twila Wingrove
Appalachian State University

Abstract

Crowdsourcing refers to the use of the Internet to recruit online contract labor. In psychological research, people are recruited online and paid to participate in research projects. Crowdsourcing is a research tool of particular interest to behavioral forensic psychologists. Behavioral forensic psychology is usually thought of as an applied science with a focus on the behavior of people interacting with components of the legal system. For forensic psychologists, one important advantage of crowdsourcing is that crowdsourced samples might be more representative of people interacting with the legal system than typical college student samples. We discuss this and other advantages and disadvantages of crowdsourced samples in the context of demographic data collected from 1,874 crowdsourced participants involved in a study of eyewitness memory. Substantial demographic information about the participants allowed us to develop a more complete profile of those participants, and of attrition, than is currently available in the published forensic psychology literature. The profile might help researchers plan their crowdsourced studies.

Methods

Participants
Data collection was crowdsourced using Amazon's Mechanical Turk website, where people can complete various tasks for payment. Participation was restricted to U.S. citizens who were 18 years of age or older.

Materials & Procedure
The data came from an eyewitness study comprising two parts. See Table 1 for the procedure in each part of the study.

Table 1
Study Procedure

Part 1                            Part 2
Demographic questionnaire         Demographic questionnaire
Crime video of a home burglary    Recall task
Anagram filler task               Lineup identification task
Recall task                       Distraction rating
Distraction rating                Paid $0.30
Paid $0.20

Note. Participants completed Part 2 of the study 14 days later.

Attrition & Sample Characteristics

Attrition
The 14-day delay interval between the two parts made an important contribution to attrition (Figure 1), which can bias a sample.

Figure 1. Overall sample attrition (percent of sample at each stage): N = 1,874 participants started the study, N = 1,491 remained at the end of Part 1, and N = 588 remained at the end of Part 2.
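
The retention rates implied by these counts follow from simple arithmetic; the short sketch below (Python, included here purely as an illustration, not as analysis code from the study) works them out.

```python
# Retention implied by the Figure 1 counts (simple arithmetic from the
# reported Ns; not analysis code from the study itself).
n_start, n_part1, n_part2 = 1874, 1491, 588

for label, n in [("end of Part 1", n_part1), ("end of Part 2", n_part2)]:
    retained = n / n_start
    print(f"Retained at {label}: {retained:.1%} (attrition {1 - retained:.1%})")
```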

Sample Characteristics & Attrition
To examine whether our final sample differed from our original sample, we compared demographic characteristics of participants in the initial sample (Part 1) with those in the sample remaining after attrition (Part 2). See Figures 2 and 3 and Table 2.

Figure 2. Gender & attrition: there were more female than male participants in the study, and more male participants attrited than female participants.

Figure 3. Age & attrition: younger participants were more likely than older participants to continue to Part 2 of the study.

Table 2
Race/Ethnicity, Income, Employment, and Education (Percent of Sample)

                                  Part 1    Part 2
Race/Ethnicity
  White/Caucasian                 77.8%     75.3%
  Black/African American           9.7%      8.8%
  Asian                            5.4%      6.2%
  Hispanic/Latino                  4.7%      3.6%
  Other                            2.4%      2.4%
Income
  Under $25,000                   41.6%     42.7%
  $25,000 - $74,999               46.8%     46.7%
  $75,000 or over                 11.6%     10.6%
Employment
  Employed full-time              42.6%     41.4%
  Employed part-time              16.8%     14.3%
  Unemployed                      15.5%     15.8%
  Student                         12.2%     14.3%
  Homemaker                        9.2%     10.8%
  Retired                          3.8%      3.4%
Education
  Completed high school or below  15.6%     15.3%
  Some college                    28.7%     27.1%
  Two-year college degree         13.0%     12.0%
  Four-year college degree        26.6%     28.0%
  Some graduate work or more      15.3%     17.6%
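
To make the comparison described above concrete, the sketch below (Python with SciPy; an illustration only, not the analyses reported here) rebuilds approximate counts from the Table 2 race/ethnicity percentages and the Part 1 and Part 2 sample sizes, then runs a chi-square test of independence. Because the Part 2 sample is a subset of the Part 1 sample, the two samples are not independent, so treat this strictly as a rough illustrative check.

```python
# Illustrative only: approximate counts reconstructed from the Table 2
# percentages and the Figure 1 sample sizes; not the study's analysis code.
from scipy import stats

N_PART1, N_PART2 = 1491, 588  # participants completing Part 1 and Part 2

# Race/ethnicity percentages from Table 2 (Part 1 vs. Part 2 columns).
race_pct_part1 = [77.8, 9.7, 5.4, 4.7, 2.4]
race_pct_part2 = [75.3, 8.8, 6.2, 3.6, 2.4]

# Convert percentages to approximate participant counts for each sample.
counts_part1 = [round(p / 100 * N_PART1) for p in race_pct_part1]
counts_part2 = [round(p / 100 * N_PART2) for p in race_pct_part2]

# Chi-square test of independence on the resulting 2 x 5 contingency table.
chi2, p, dof, expected = stats.chi2_contingency([counts_part1, counts_part2])
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3f}")
```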

Attentiveness
One obvious difference between laboratory-based and crowdsourced research is control over variables that might either confound a study or contribute to increased error variance. Because of this concern, we had participants answer a “distraction” question (Figure 4).

Figure 4. Distraction ratings in Part 1 and Part 2. Distraction was rated on an 11-point scale (0 = not distracted at all, 10 = extremely distracted). Most participants rated their distraction as low. Average distraction ratings were lower in Part 2 than in Part 1. Ratings were reliable across the two parts, r(564) = .62, p < .001.
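
The reported reliability, r(564) = .62, appears to be a Pearson correlation between the Part 1 and Part 2 distraction ratings of participants who completed both parts. The sketch below shows how such a test-retest correlation is computed; the ratings are simulated for illustration and are not the study data.

```python
# Sketch of a test-retest correlation like the one reported in Figure 4
# (r(564) = .62). Ratings are simulated; this is not the study data or code.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 566  # sample size chosen so that df = n - 2 = 564, as in Figure 4

# Simulated 0-10 distraction ratings at Part 1 and a correlated Part 2 rating.
part1 = rng.integers(0, 11, size=n).astype(float)
part2 = np.clip(np.round(0.6 * part1 + rng.normal(0, 2.0, size=n)), 0, 10)

# Pearson correlation between the two sets of ratings.
r, p = stats.pearsonr(part1, part2)
print(f"r({n - 2}) = {r:.2f}, p = {p:.3g}")
```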

Discussion

Sample Characteristics
We hope our study encourages forensic researchers to use crowdsourcing as a valuable research tool. Our analyses indicate that crowdsourced samples are likely to be an adequate source of clean, interpretable data.

Attrition
We believe that researchers who design studies that include delay intervals should collect demographic information about participants at the beginning and end of each study.

Additional Information

For more information about our crowdsourced sample and how it compares to other crowdsourced, university, and community samples, please read our article: Baker, M. A., Fox, P., & Wingrove, T. (2016). Crowdsourcing as a forensic psychology research tool. American Journal of Forensic Psychology, 34(1), 37-50. To access the paper (Baker et al., 2016) online, please scan the QR Code Paper.

Our poster is also available online; to access it, please scan the QR Code Poster.

If you have any questions or would like additional information about our presentation, please feel free to email Melissa A. Baker (PI) at [email protected].