Journal of Information Systems Education, Vol 12(2)

The Effectiveness of Computer-Based "Game Show" Formats in Survey Courses: A Quasi-Experiment

Alan A. Brandyberry
[email protected]
Department of Management and Information Systems
Kent State University
Kent, Ohio 44242, USA

J. Harold Pardue
[email protected]
School of Computer & Information Sciences
University of South Alabama
Mobile, Alabama 36688, USA

ABSTRACT

The confluence of computers and integrated projection systems in the classroom has opened new avenues for course content delivery in an active learning format. This paper first discusses the concepts of active learning and play in a pedagogical context. Next, the implementation and subjective results of a generic computer-based game show for delivering course content in introductory survey courses are presented. The paper then describes the methodology employed and statistically tests certain aspects of the course related to the effectiveness of this implementation. The results of this quasi-experiment, using five sections of an upper-division MIS (Management Information Systems) survey course spanning three academic terms, strongly support the research hypotheses that the game show format increases student learning and improves student perceptions of the overall quality of the course. The implications of this research for educators are discussed. The game show application was developed by the authors and is available for download as freeware.

Keywords: Classroom technology, course content delivery, computer-based, game show, survey course, play, education.

1. INTRODUCTION

Introductory survey courses have characteristics that pose special challenges to the instructor in keeping students interested and engaged. Students often enroll in these courses primarily to satisfy graduation requirements rather than to satisfy an inherent interest in the subject. In addition, these courses are often very vocabulary oriented. Finally, because of the vast amount of information that must be covered in a survey course, breadth is often emphasized over depth.

In recent years, a general call has emerged to move from heavy reliance on lecture-based instruction to a richer, more active, technology-enhanced learning environment (Shapiro 1998; Benjamin et al. 1999). Nowhere is this call to action more needed than in introductory survey courses. Collectively these courses occupy a large number of credit hours in the curriculum and often constitute the core. The importance of effective teaching techniques in these courses cannot be overestimated.


As computers and integrated projection systems become more common in the classroom, the opportunities for implementing more interactive methods of content delivery are increasing. This paper describes and tests the effectiveness of a computer-based game show format for course content delivery. The contribution of this paper is twofold. First, it provides a useful example of implementing classroom technology to create an active learning environment in introductory survey courses. Second, the effectiveness of this format is tested. Although the use of advanced technology in the classroom is generally considered positive, care must be taken to document the effectiveness of these implementations. It is tempting to place the hurdle to justify the use of classroom technology at the level of “do no harm,” but there are considerable costs associated with acquiring and maintaining classroom technology that should be weighed against the benefits. The importance of careful and appropriate implementation can be demonstrated with an analogy from industry. Many reported failures during the early years of implementing computer technology in industry have been attributed to “automation for automation’s sake” or implementing computer technology simply because it is available. These failures due to inappropriate implementation can cast unjust doubt on the benefits of these computer technologies (Melnyk & Narasimhan 1992).

2. BACKGROUND

Psychologists, anthropologists, sociobiologists, historians, and educators have thoroughly researched the nature of play and its relationship to learning (Rogers and Sawyer 1988). Play is nearly universal among mammals. Studies show that play stimulates the growth of synapses and, through practice, enables us to stabilize our learning (Carvey 1977; Wilson 1978). One definition of play is "something that is fun but purposeful" (Mann 1996). Because it is fun, play is an intrinsically motivating activity. "Educational" games are fun but purposeful. Games, as a form of play, provide a means of practicing skills with reduced consequences. For example, understanding can be tested and refined without the risk of receiving low marks on an exam. Games can take many forms. This study explores an educational adaptation of the television-based "game show" format.

Pedagogical research stresses the importance of active learning (Association of American Colleges 1986; Astin 1984; Miller 1988). The major premise of this paper is that games are an effective form of active learning because they engage students in the process of content delivery. Students are active participants in the process, not passive vessels receiving knowledge. The element of fun makes games a powerful form of learning because they are intrinsically motivating. Witness the ubiquitous juxtaposition of "fun" and "learning" in the promotional materials for educational software.

A review of the pedagogical literature reveals several examples of using game show formats, taken mainly from American television, to teach particular concepts or as a generic vehicle for course content delivery.

Although many of these shows are widely distributed throughout the international community (Jeopardy is available in 43 countries), the details of these games may not be familiar to all. The majority of the game shows discussed in this paper have World Wide Web sites available to those who may want to investigate the details of these show formats. A popular game show format for general content delivery is Jeopardy. This format has been used for geometry, chemistry, and social studies (Saunders 1987; DeChristopher 1991; Fisher 1996). The game show format To Tell the Truth has been used in teaching literature and medicine (Brown-Guillory 1988; Hafferty 1990). Wood (1992) uses a game show format similar to The Price Is Right to teach the concept of probability. Daigle and Doran (1998) use a college bowl format to teach computer history.


3. THE CLASSROOM GAME SHOW

The concept of television game shows was used as a basis to develop and implement a computer-based learning tool for course content delivery. A discussion of the format, software, implementation, and classroom experiences follows. In addition, the goals of this implementation are used to develop hypotheses to test the effectiveness of this effort.

3.1 Description of Game Show Format
The game show and the design of the software (Brandyberry and Pardue 2001) were based on a generic question/answer format. Among current television game shows it is most closely related to Jeopardy but lacks its peculiarities, such as phrasing the answers in the form of a question.

To begin, the current students, questions, and answers are entered into a database via manual entry or by uploading a comma-delimited text file (easily created from a spreadsheet file). During the game, the instructor clicks the "Pick Question" button to randomly select a question from the selected categories (often text chapters) to be covered that session (see Figure 1; note that the actual program makes extensive use of color to maintain atmosphere). The question is displayed before a student is selected so that everyone in the class can consider the question. When the "Pick Player" button is clicked, a student is randomly selected. The randomizing process weights the probability of a student being selected according to the number of questions they have previously been asked. As a result, a student who has received fewer questions has a greater chance of receiving the next question, but all students have some chance of selection. If the selected student is absent, this is recorded by clicking the absent button and another student is selected. The student then attempts to answer the question before time expires. The instructor then displays the answer and determines whether the student's answer is correct, partially correct, or incorrect. The judgment is recorded in the database, and the next question is displayed.
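
The paper does not give the exact weighting formula behind the "Pick Player" button. A minimal sketch in Python, assuming a simple "one plus shortfall" weighting and a hypothetical single-column student file (neither is taken from the actual Classroom GameShow software), might look like this:

import csv
import random

def load_students(path):
    # Read student names from a comma-delimited file (hypothetical layout: a
    # single "name" column); the real Classroom GameShow upload format is not
    # reproduced here.
    with open(path, newline="") as f:
        return [row["name"] for row in csv.DictReader(f)]

def pick_student(questions_asked):
    # questions_asked maps each student to the number of questions already
    # received this term. Students with fewer prior questions get larger
    # weights, but every student keeps a nonzero chance of selection.
    max_asked = max(questions_asked.values())
    weights = [1 + (max_asked - n) for n in questions_asked.values()]
    return random.choices(list(questions_asked), weights=weights, k=1)[0]

# Example: a student with no prior questions is three times as likely to be
# picked as one who has already answered two.
print(pick_student({"Avery": 2, "Blake": 0, "Casey": 1}))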


To add to the feeling of "play," a light-hearted atmosphere is maintained. Although the instructor is central to this, certain features of the software help maintain the atmosphere. For instance, when the "Correct!", "Partial Correct", incorrect ("Sorry"), or "Absent" buttons are clicked, the program randomly plays sound files selected for each button. These could be as simple as buzzer and chime sounds, but the use of sound clips from popular television shows and movies is especially effective.

The software allows substantial customization. In addition to changing students and questions, the instructor can change between multiple courses, course sections, and texts. Different categories or chapters may be selected, with multiple selections used simultaneously in one session. Questions can be asked sequentially by question number or in random order. This allows either a structured format, in which topics are introduced in a logical sequence, or a randomized review format. The score values for correct, partially correct, incorrect, and absent responses can be modified, as can the time allowed for each response. Sound files to be played for certain actions can be added simply by storing them in a specified file folder.
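
As a purely illustrative sketch of this folder-based customization (the folder layout and function name below are assumptions, not the Classroom GameShow implementation), selecting a random clip for a button press can be as simple as:

import random
from pathlib import Path

def random_sound(action, sound_dir="sounds"):
    # Assumes clips are dropped into per-action subfolders, for example
    # sounds/correct/*.wav or sounds/sorry/*.wav; returns None if none exist.
    clips = list(Path(sound_dir, action).glob("*.wav"))
    return random.choice(clips) if clips else None

print(random_sound("correct"))  # e.g. sounds/correct/applause3.wav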

3.2 Implementation and Classroom Experiences
Certain aspects of the implementation of this format in the classroom do not involve the software and are at the instructor's discretion. For instance, may the students consult their books or notes? For the MIS survey course used for the pilot implementation of this format, it was decided that students were permitted to consult only handwritten notes of their own creation. The text, printed lecture notes, photocopies of notes, and other related materials were not permitted. In addition, the instructor may decide to include "mini-lectures" or discussions between questions. A question often introduces a topic but does not fully explain it. Where this occurred, the instructor provided additional information before the next question was asked. Questions were both definitional and conceptual and could be answered in a sentence or two in an open-ended format. Multiple-choice and other objective formats were not used but would be simple to implement with the software.

The instructor's subjective assessment of this implementation is entirely positive. The students appeared to be very engaged in the process and most appeared to be entertained. In addition, it was also a greatly improved experience for the instructor, both from an entertainment perspective and from the satisfaction derived from seeing students enjoying the learning process. This assessment is supported by dramatically increased attendance and by student comments on course evaluations. Attendance was undoubtedly partially stimulated by the use of extra credit points to reward the top performers (being absent when your name comes up equated to an incorrect response). However, the entertainment aspect of the format was also perceived to be a major influence. The comments on course evaluations (objective course evaluation results are analyzed in a later section) were very positive. Of the students who chose to make optional comments concerning the game show format, all but one were positive. Most comments noted that the format was a fun alternative to traditional activities and that the game show helped motivate them to keep up with the assigned material.

3.3 Hypotheses
In addition to describing the game show format and its implementation, an objective of this research was to measure and test its effectiveness. The goals of the implementation were to improve the level of student learning and to improve the students' perception of the course. These goals were used to determine the effectiveness of the implementation. The measurement of student learning was operationalized as student performance on exams, and the measurement of students' perceptions was operationalized as the results of course evaluations. It is important to note that the concept of "learning" is complex, and exam scores provide a limited measure of it. This resulted in the formulation of two hypotheses.

H1: The treatment group will display greater learning than the control group.

H2: The treatment group will have a more positive perception of the course than the control group.

The next section describes the methodology used to test these hypotheses. This is followed by a discussion of the data analysis and results.

4. METHODOLOGY

This study used a quasi-experimental design. A self-critical use of this design is recommended where the experiment is conducted in the field and "randomized treatments are not possible" (Campbell & Russo 1999, p. 81). Although great care was taken to isolate the treatment effect, it must be recognized that in the classroom numerous subtle factors can influence outcomes. Because actual classes were used, students self-selected into one group or the other through normal registration procedures and were not randomly assigned. To reduce any effects of this potential source of sample bias, students were not informed of the treatment prior to or during registration. This information was not disclosed until the first day of class in the context of the syllabus. A total of five course sections of an upper-division MIS survey course spanning three academic terms (to minimize any effects of the non-random selection process) were included in the study. Because the MIS field changes so rapidly, the course content and materials could not be held constant for more than three terms. One approach to controlling possible sample bias is to compare the sample grade point average (GPA) mean for each group to the population GPA mean. Because of privacy issues, this information was not available for the subjects, and this control could not be implemented. However, the instructors perceived no reason to suspect that the groups were not academically representative of the population. Additionally, it is believed that the variety of scheduled course offerings further reduced any possible effects of the non-random selection process.

4.1 Subjects
The subjects in this study were undergraduate business students who had enrolled in the upper-division MIS survey course at a regional campus of a state university. The course is required of all business students.

4.2 Experimental Design
This experiment involved two instructors (both at the Assistant Professor rank with substantial experience teaching the course) and five course sections spanning three consecutive academic terms. Course sections were divided into three control groups and two treatment groups. To minimize effects due to differences in teaching style, both instructors used a common set of course materials, including the same texts, lecture notes (PowerPoint slides), syllabus, supplementary materials, and exams. The only difference was the addition of a description of the game show in the treatment group syllabus.

4.3 Dependent Variables
The dependent variables were student scores on three standardized multiple-choice exams and student responses on standardized course evaluation forms. The objective format of the exams eliminated grading bias. Students were not informed that identical exams were used across sections, and care was taken to maintain a secure exam environment. There was no overlap between the exact game show questions and the exam questions. The course evaluation instrument was developed at the university level to measure students' overall perceptions of a course, course content, instruction methods, and the instructor.

5. DATA ANALYSIS AND RESULTS

In general, we expected the treatment group to display significantly greater learning and have a significantly more positive perception of the course than the control group. Hypothesis 1 was tested using a one-tailed t-test and a factorial Type III sum of squares test. Hypothesis 2 was tested using a one-tailed t-test. The remainder of this section describes the data analysis and summarizes the results.

5.1 Student Learning
Arithmetic means were calculated for exam scores by treatment group and instructor (see Table 1). The means reveal that the treatment group consistently scored higher than the control group on all three exams (by 5.9%, 4.5%, and 4.7%, respectively). Before examining the statistical significance of the increase, the effect of instructor bias was tested. A simple t-test conducted on the three exam scores between instructors for the control groups showed no significant difference (minimum P>t = 0.6610).

The significance of the treatment effect was tested in two ways: a one-tailed t-test between groups and a factorial Type III sum of squares test. The results of the t-tests (shown in Table 2) confirm that the treatment group scored significantly higher than the control group on all three exams.

To further test the possible influence of instructor bias, a factorial design was employed and an F-statistic computed for the instructor treatment. A Type III sum of squares test, SS(Game Show | Instructor), is appropriate since it measures the "extra" effect of the treatment with the instructor effect accounted for (Montgomery 1997, p. 164). The results of these tests are shown in Table 3 and confirm that there are significant differences in the exam scores attributable to the game show treatment with the instructor treatment controlled for. In addition, it is confirmed that the instructor did not significantly affect exam scores.

5.2 Student Perceptions of the Course
To measure the change in student perception due to the treatment effect, the differences between the item means on end-of-term course evaluations for the treatment and control groups were computed for the instructor who was involved with both treatment and control groups. All 26 items reflected a nominal positive change, and 17 were statistically significant (see Table 4). The Likert scale questions employed are actually ordinal measures. However, it is common, and has been shown to be fairly robust, to treat these as interval in analysis (Emory & Cooper 1991, p. 222). Therefore, this approach is taken so that means and t-tests may be employed and the clarity of the results maintained. The results strongly support the hypothesis that the treatment groups had a more positive perception of the course than the control group. Although not all questions address issues logically linked to the game show format, the increase seen in all areas can be attributed to an overall improved perception of the course. Questions that dealt primarily with classroom delivery universally displayed strong positive change. Most encouragingly, the questions addressing the overall perception of the effectiveness of the course and instructor all showed significant positive changes (e.g., questions 15, 16, and 17).

Table 1. Exam scores for treatment and control groups.

Instructor 1, Control:    Exam 1: 74.4% (n = 102); Exam 2: 76.5% (n = 99); Exam 3: 70.0% (n = 102)
Instructor 1, Treatment:  (none)
Instructor 2, Control:    Exam 1: 75.1% (n = 50); Exam 2: 76.1% (n = 50); Exam 3: 69.2% (n = 49)
Instructor 2, Treatment:  Exam 1: 80.5% (n = 55); Exam 2: 80.9% (n = 55); Exam 3: 74.4% (n = 55)

Table 2. T-test results for differences on exam scores. (a)

          Treatment Mean   Control Mean   Treatment n   Control n   T-value   P>T
Exam 1    80.45            74.62          55            152         4.2303    0.0000
Exam 2    80.92            76.39          55            149         2.6906    0.0039
Exam 3    74.44            69.73          55            151         2.8638    0.0023

(a) α is set at 0.05 for all tests in this study.

Table 3. Factorial analysis results (Type III SS) for differences on exam scores. (a)

                                  F-value   P>F
Exam 1   Game Show Treatment      9.87      0.0019
         Instructor Treatment     0.19      0.6639
Exam 2   Game Show Treatment      5.32      0.0221
         Instructor Treatment     0.06      0.8142
Exam 3   Game Show Treatment      6.51      0.0114
         Instructor Treatment     0.19      0.6633

(a) The F-statistic, analogous to a two-tailed t-test in the hypothesis tested, is less powerful than the one-tailed t-test in Table 2. The resulting p-values are expected to be approximately twice the p-values found in the t-tests above without any effect from the instructor treatment.

Table 4. Significant results (α = 0.05) for tests for changes in mean evaluation scores. Each item is followed by (treatment mean minus control mean; T-value).

1. The instructor presented challenging and stimulating material (0.43; 1.78)
2. The instructor inspired interest in the subject (0.76; 2.72)
3. The instructor displayed enthusiasm in teaching the subject (0.94; 3.69)
4. The instructor motivated me to do my best work (0.55; 2.10)
5. The instructor used examples and illustrations effectively (0.54; 1.80)
6. The instructor explained what is expected of students (0.48; 1.80)
7. The instructor was an effective speaker (0.61; 2.04)
8. The instructor maintained an atmosphere in the class that encouraged learning (0.44; 1.72)
9. The instructor made clear how my work was to be evaluated (0.52; 1.82)
10. The instructor gave helpful feedback on my performance (0.58; 2.24)
11. The instructor provided students with the opportunity to answer questions (0.61; 2.07)
12. The material was summarized in a manner that helped me learn (0.50; 1.71)
13. The instructor encouraged students to participate in class discussions and/or activities (1.15; 4.03)
14. The work assigned to be completed outside of class contributed to my understanding of the subject matter (0.61; 2.06)
15. The course as a whole was good (0.72; 2.83)
16. Overall, the instructor presented the subject effectively (0.82; 3.41)
17. Overall, I learned a lot in this course (0.92; 3.64)
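
For readers who wish to run the same kind of analysis on their own class data, the sketch below illustrates the two tests used here: a one-tailed two-sample t-test and a Type III sum of squares test with the instructor factor included. It is not the authors' code; the data are synthetic stand-ins, and the column names, contrast coding, and SciPy/statsmodels calls are illustrative assumptions.

import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(0)
# Synthetic stand-in data (NOT the study's data): 55 treatment students and
# 152 control students, split across two instructors as in Table 1.
df = pd.DataFrame({
    "exam1": np.concatenate([rng.normal(80, 12, 55), rng.normal(75, 12, 152)]),
    "game_show": [1] * 55 + [0] * 152,
    "instructor": [2] * 55 + [1] * 102 + [2] * 50,
})

treatment = df.loc[df.game_show == 1, "exam1"]
control = df.loc[df.game_show == 0, "exam1"]

# One-tailed two-sample t-test of H1 (treatment mean > control mean); the
# one-sided p-value is half the two-sided value when t falls in the
# hypothesized direction.
t, p_two_sided = stats.ttest_ind(treatment, control)
p_one_sided = p_two_sided / 2 if t > 0 else 1 - p_two_sided / 2
print(f"t = {t:.4f}, one-sided p = {p_one_sided:.4f}")

# Factorial model with Type III sums of squares, SS(Game Show | Instructor);
# sum-to-zero contrasts keep the Type III decomposition interpretable.
model = ols("exam1 ~ C(game_show, Sum) + C(instructor, Sum)", data=df).fit()
print(sm.stats.anova_lm(model, typ=3))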

The results of the data analysis show that the treatment group (game show format) scored higher on all three exams and evaluated the course more positively than the control group. These results support the hypotheses of greater learning and more positive course perceptions in the treatment group.

6. LIMITATIONS

As with any empirical study, there are limits to the degree to which the results can be generalized to a broader population. This study examined only one game show format in only one MIS introductory survey course. To generalize to all game show formats and introductory survey courses, a cross-format, cross-disciplinary study would be needed. Although exams are among the most common methods used to evaluate the level of learning demonstrated by a student, they are an imperfect and incomplete measure of learning. The higher exam scores achieved by the students exposed to the game show format certainly support the inference that these students learned more; however, a more exhaustive study including a detailed analysis of all aspects of learning would be necessary to make a definitive statement. Finally, the quasi-experimental format did not permit a blind study from the perspective of the instructors. Further, only one instructor used the treatment in class. These design limitations are moderated by the strong career-oriented motivation of the instructors to have any class (treatment or control) show positive results in both student outcomes and course evaluations.

7. CONCLUSION

The findings of this study support the hypotheses that students who participate in a game show format will perform significantly better on exams and will have a more positive perception of the course. This suggests that active learning in the form of well-structured, purposeful, computer-based games can be an effective vehicle for the delivery of survey course content. One implication of this study is the observation that the increase of computing technology in the classroom provides more technically capable educators with an opportunity to add to the tools available to their profession by developing and making available applications unlikely to be developed by commercial enterprises.

8. REFERENCES

Association of American Colleges, 1986, "A New Vitality in General Education." Task Group on General Education.
Astin, A. W., 1984, "Student Involvement: A Developmental Theory for Higher Education." Journal of College Student Personnel, 22, pp. 297-308.
Benjamin, L. T., B. Nodine, R. Ernst and C. Blair-Broeker (Eds.), 1999, Activities Handbook for the Teaching of Psychology, Vol. 4. American Psychological Association, Washington, DC, USA.
Brandyberry, A. A. and J. H. Pardue, 2001, "Classroom GameShow" (Software). [http://babbage.bsa.kent.edu/Gameshow.htm].
Brown-Guillory, E., 1988, "Integrating Television Game Shows and Reader-Response Criticism." Exercise Exchange, 34(1), pp. 42-43.
Campbell, D. T. and M. J. Russo, 1999, Social Experimentation. Sage Publications, Thousand Oaks.
Carvey, C., 1977, "Play and Learning." In B. Tizard and D. Harvey (Eds.), The Biology of Play. Lippincott, Philadelphia, pp. 74-99.
Daigle, R. and M. V. Doran, 1998, "Facilitating Bloom's Level 1 Through Active Learning and Collaboration." Journal of Information Systems Education, 9(3), pp. 3-6.
DeChristopher, M., 1991, "Scientific Jeopardy: Recall Made Palatable." Science Activities, 28(3), pp. 35-37.
Emory, C. W. and D. R. Cooper, 1991, Business Research Methods, 4th Ed. Irwin, Homewood, IL.
Fisher, M. E., 1996, "Let's Play Jeopardy! Today's Topic: The Electoral College." Update on Law-Related Education, 20(3), pp. 37-39.
Hafferty, F. W., 1990, "To Tell The Truth: An In-Class Learning Exercise for Medical Students." Teaching Sociology, 18(3), pp. 329-336.
Mann, B. L., 1996, "Serious Play." Teachers College Record, 97(3), pp. 447-469.
Melnyk, S. A. and R. Narasimhan, 1992, Computer Integrated Manufacturing: Guidelines and Applications from Industrial Leaders. Business One Irwin, Homewood, IL.


Miller, G. E., 1988, The Meaning of General Education: The Emergence of a Curriculum Paradigm. Teachers College Press, Columbia University, New York.
Montgomery, D. C., 1997, Design and Analysis of Experiments, 4th Ed. John Wiley & Sons, New York.
Rogers, C. S. and J. K. Sawyer (Eds.), 1988, Play in the Lives of Children. National Association for the Education of Young Children, Washington, DC.
Saunders, H. M., 1987, "Place Your Geometry Class in 'Geopardy.'" Mathematics Teacher, 80(9), pp. 722-725.
Shapiro, A. M., 1998, "Promoting Active Learning: The Role of System Structure in Learning from Hypertext." Human Computer Interaction, 13(1), pp. 1-35.
Wilson, E. O., 1978, On Human Nature. Harvard University Press, Cambridge.
Wood, E., 1992, "Probability, Problem Solving, and 'The Price Is Right.'" Mathematics Teacher, 85(2), pp. 103-109.

Dr. Alan Brandyberry is an Assistant Professor of Information Systems in the Department of Management and Information Systems at Kent State University. His current research interests include the adoption of information technology, electronic commerce facilitated supply-chain linkages, and the justification of information technology investments. His research has been published in journals such as Decision Sciences, The International Journal of Technology Management, Industrial Mathematics, and the Malaysian Journal of Management Science. His teaching interests include database design and development, application and systems development, and electronic commerce.

Dr. Harold Pardue is an Associate Professor of Computer and Information Sciences in the School of Computer and Information Sciences at the University of South Alabama. He received his Ph.D. in MIS from Florida State University in 1996 (support area in Statistics). His current research interests include trust in e-commerce environments, web-based HCI, open source software development, and CIS education and curriculum development. His articles have been published in the Journal of Information Systems Education, the Journal of Computer Information Systems, the Journal of Psychological Type, System Dynamics Review, the Engineering Economist, and the Journal of Engineering Education. His work has also been presented at the Americas Conference on Information Systems, the Decision Sciences Institute, the Winter Simulation Conference, and the Association for Computing Machinery Conference. His general teaching areas are e-commerce, database, and thin-client development.


