

Supino, Richardson • ASSESSING RESEARCH METHODOLOGY TRAINING NEEDS

Assessing Research Methodology Training Needs in Emergency Medicine PHYLLIS G. SUPINO, EDD, LYNNE D. RICHARDSON, MD

Abstract. Objective: To determine the perceptions of emergency medicine (EM) academic faculty leaders and other academic emergency physicians regarding importance and knowledge of specific research methodology content areas and training priorities. Methods: The authors conducted a confidential mail survey of 52 EM academic chairs, 112 residency directors, 116 research directors, and 400 randomly selected other members of the Society for Academic Emergency Medicine (SAEM). Respondents rated the importance of knowledge about each of 12 content areas for enhancing research productivity, rated their own knowledge of these areas, and identified training priorities. Standard descriptive statistics were used to characterize the study population; subgroup differences were examined by nonparametric statistics. Results: 551 (81%) of those sampled returned surveys. Most (90%) respondents thought that knowledge about all selected content areas was important for enhancing research productivity; however, 7–37% (depending on the topic) reported little knowledge or experience in specific areas. Research directors reported the highest overall knowledge levels (p < 0.001), followed by chairs, residency directors, and other SAEM members. Top training priorities (identified by all subgroups) included study planning (70%), problem identification/hypothesis construction (41%), and proposal writing (38%). Conclusions: These data support the continued need to offer broad training in research methodology, but suggest that greater emphasis be given to concepts involved in initiating and planning a study and to strengthening research proposal writing skills. These results should be of interest to academic departments that must address their own training needs, and help support the development of research methodology curricula on regional and national levels to advance the state of research in the specialty of EM. Key words: emergency medicine; curriculum; research methodology; needs assessment. ACADEMIC EMERGENCY MEDICINE 1999; 6:280–285

From the Department of Emergency Medicine, Mount Sinai School of Medicine, New York, NY (PGS, LDR). Current affiliation: Division of Cardiovascular Pathophysiology, Joan and Sanford I. Weill Medical College of Cornell University, New York, NY (PGS). Received September 1, 1998; revision received December 8, 1998; accepted December 10, 1998. Presented in part at the ACEP Scientific Forum, San Francisco, CA, October 1997. Address for correspondence and reprints: Phyllis G. Supino, EdD, Division of Cardiovascular Pathophysiology, The New York Presbyterian Hospital–Cornell Medical Center, 525 East 68th Street, Room F 467, New York, NY 10021; fax 212-746-8448; e-mail: [email protected]

The need to identify and implement a relevant research agenda in emergency medicine (EM) has been widely recognized within the specialty.1–4 The strengthening of the research skills of EM faculty and residents also is an acknowledged priority of both individual academic departments and national organizations.5–9 Accordingly, efforts have been made to define a curriculum in research methodology for emergency physicians (EPs),6,7 and various educational programs and other initiatives have been undertaken by experts in the field.10–17 However, as previous authors have pointed out,17 these programs have differed significantly in content and design and have met with varying success.

In recent years, medical educators have begun to emulate faculty from nonmedical graduate institutions by revising their approach to curriculum development and assessment.18 One area of change has been to increasingly supplement expert opinion with formal input from the learner when defining specific content areas for instruction.19 – 23 This is because understanding the learners’ perspectives is crucial for optimizing the match between their perceived needs and program offerings and, thus, increasing the likelihood of achieving educational objectives. To our knowledge, no published data currently are available about the views of EPs regarding their own knowledge of various aspects of research methodology, the value that they place on understanding specific topics, or their educational priorities in this area. We believe such information could help to identify curriculum elements that merit greatest emphasis, and thus inform future research training efforts in our specialty. To address these needs, we conducted a survey of academic leaders and other academic EPs. Its purpose was to determine their perceptions regarding the importance of specific research methodology concepts and skill areas, their views about their knowledge of these areas, and their opinions regarding key training priorities.

ACADEMIC EMERGENCY MEDICINE • April 1999, Volume 6, Number 4


METHODS

Study Design and Population. A confidential mail survey was conducted of all academic chairs (n = 52), residency directors (n = 112), and research directors (n = 116) named in the 1996 published lists of the Association of Academic Chairs of Emergency Medicine, Council of Emergency Medicine Residency Directors, and SAEM Research Directors' Special Interest Group. We also concurrently surveyed a 10% random sample (n = 400) of 4,000 other academic EPs identified from the 1996 general SAEM membership list. All academic leaders were eliminated from the general SAEM membership lists prior to randomization, and individuals who held multiple leadership positions within their departments (n = 3) were assigned to a single subgroup. Sample size (n = 677) reflected an assumed 80% response rate and was chosen to permit estimation of parameters across subgroups with 95% confidence and high precision (i.e., an error of at most ±4%). The project was approved by the Mount Sinai School of Medicine's Institutional Review Board. Consent for participation was implied by the receipt of a completed survey instrument.

Figure 1. Importance ratings for specific content areas. Distributions for the whole population. 1 = problem identification/hypothesis construction, 2 = study planning, 3 = recognizing sources of bias in research designs, 4 = application of alternative study designs, 5 = rationale/procedures for generating reliable data, 6 = data documentation techniques, 7 = use of statistics for estimation and hypothesis testing, 8 = decision analysis, 9 = research ethics, 10 = research proposal writing, 11 = research manuscript writing, 12 = literature analysis.

Figure 2. Knowledge ratings of specific content areas. Distributions for the whole population. Content areas 1–12, same as for Figure 1.

Survey Instrument and Administration. The questionnaire was designed by the coauthors and piloted on a group of academic EPs. The purpose of the pilot was to obtain feedback regarding clarity of wording and ease of completion of questions from individuals who were potential recipients of the survey, but who had not been involved in developing the items. The instrument, which was limited to two pages to maximize response rate, assessed respondents' perceptions of 1) the "importance" of knowledge about each of 12 selected content areas of research methodology to an academic faculty member's ability to successfully conduct research, and 2) the extent of their own "knowledge" of these areas. The 12 content areas, listed in Figures 1 and 2, reflect concepts and skill areas previously proposed for EM research training.6,7 Three-point scales were used for both assessments, where 1 = the lowest possible rating and 3 = the highest possible rating. The respondents also were asked to select from the listed areas the three most important training priorities for enhancing their (or their department's) research productivity, or to indicate other areas in order of priority. In addition, the respondents were queried about their prior and current research activity and training and other selected profile characteristics, using structured closed-ended questions. To encourage candor, questions about the respondent's name and institutional affiliation were omitted. All forms were coded to document receipt of questionnaires and permit identification of nonrespondents. The survey was mailed to the study population during December 1996, followed by a second mailing to all nonrespondents three months later. Data were collected until September 30, 1997, when the study


TABLE 1. Profile Characteristics of the Respondents

Cells show frequency (%) / NR.

| Characteristic | Academic Chair (n = 40) | Residency Director (n = 90) | Research Director (n = 114) | Other SAEM Member (n = 307) | Total Respondents (n = 551) |
|---|---|---|---|---|---|
| Had mentor§ | 19 (48.7%) / 39 | 36 (40.0%) / 90§ | 69 (60.5%) / 114‡c | 184 (59.9%) / 307 | 308 (56.0%) / 550 |
| Current mentor∥a | 21 (53.8%) / 39∥e | 62 (69.7%) / 89∥e | 93 (82.3%) / 113∥b,c,e | 74 (24.7%) / 299 | 250 (46.3%) / 540 |
| Current PI†,∥a | 26 (66.7%) / 39‡e | 58 (64.4%) / 90∥e | 102 (89.5%) / 114§b,∥c,e | 130 (42.3%) / 307 | 316 (57.5%) / 550 |
| Current co-PI∥a | 35 (89.7%) / 39∥e | 73 (82.0%) / 89∥e | 106 (93.8%) / 113∥e | 170 (55.7%) / 305 | 384 (70.3%) / 546 |
| History of funding (PI)∥a | 24 (61.5%) / 39∥e | 46 (51.1%) / 90∥e | 87 (76.3%) / 114∥c,e | 77 (25.1%) / 307 | 234 (42.5%) / 550 |
| History of funding (co-PI)§a | 28 (71.8%) / 39∥e | 53 (58.9%) / 90∥e | 82 (73.2%) / 112∥e | 110 (35.9%) / 306 | 273 (49.9%) / 547 |
| Clinical doctorate only∥a | 33 (82.5%) / 40 | 80 (88.9%) / 90‡d | 83 (72.8%) / 114 | 262 (86.2%) / 304§d | 458 (83.6%) / 548 |
| Postdoctoral education∥a | 16 (42.1%) / 38§e | 25 (29.1%) / 86 | 45 (42.5%) / 106∥e | 58 (20.1%) / 289 | 144 (27.7%) / 519 |

NR = number of responses. †PI = principal investigator. ‡p < 0.01; §p < 0.005; ∥p < 0.001. Superscript letters denote the comparison: a = global; b = vs chair; c = vs residency director; d = vs research director; e = vs other SAEM member.

was closed for analysis, at which time subject identifiers were expunged.

Data Analysis. All data were analyzed using SPSS for Windows V.6.1.4 (Chicago, IL). Aggregate "importance" and "knowledge" ratings (minimum: 12, maximum: 36) were calculated by summing across individual content area ratings. Descriptive statistics [median and interquartile range (IQR) or frequency and percentage] were generated to characterize ratings, profile characteristics, and training priorities. Inferential statistics (χ², Fisher's exact, Kruskal-Wallis, and Mann-Whitney U tests, as appropriate) were used to explore subgroup differences. All pairwise comparisons were two-sided; a critical alpha level of 0.01 was used to reduce the likelihood of a type I error in the comparison of the four subgroups.
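The pairwise subgroup comparison the authors describe can be illustrated with a two-sided Mann-Whitney U test. The sketch below is a minimal reimplementation using the normal approximation (no tie correction, unlike full statistical packages), and the rating vectors are invented for illustration; they are not the study's data.

```python
# Illustrative two-sided Mann-Whitney U test (normal approximation),
# of the kind used to compare aggregate ratings between subgroups.
import math

def mann_whitney_u(x, y):
    """Return (U, two-sided p) for samples x and y via the normal approximation."""
    n1, n2 = len(x), len(y)
    pooled = sorted(x + y)
    # midranks: tied values share the average of their positions
    rank_of = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        rank_of[pooled[i]] = (i + 1 + j) / 2  # average of ranks i+1 .. j
        i = j
    r1 = sum(rank_of[v] for v in x)           # rank sum of the first sample
    u1 = r1 - n1 * (n1 + 1) / 2
    mu = n1 * n2 / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u1 - mu) / sigma if sigma else 0.0
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return u1, p

# hypothetical aggregate knowledge ratings (scale 12-36) for two subgroups
research_directors = [30, 32, 27, 34, 29, 33, 31, 28]
other_saem         = [22, 25, 20, 27, 23, 21, 24, 26]
u, p = mann_whitney_u(research_directors, other_saem)
print(f"U = {u:.1f}, two-sided p = {p:.4f}, significant at 0.01: {p < 0.01}")
```

The 0.01 threshold in the final line mirrors the critical alpha level the authors adopted for their four-subgroup comparisons.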

RESULTS

Respondent Profile Characteristics. Five hundred fifty-one (81.4%) of the 677 individuals sampled returned completed surveys. Three hundred ninety-four (71.5% of respondents) answered the first mailing ("early" respondents), and an additional 157 (28.5% of respondents) answered the second mailing ("late" respondents). Analysis of profile characteristics, ratings, and top training priorities among the "early" and "late" respondents disclosed no significant difference; these subgroups were therefore aggregated for the purposes of statistical analysis. Of the 551 respondents, 40 (7.3%) were academic chairs, 90 (16.3%) were residency directors, 114 (20.7%) were research directors, and 307 (55.7%) were other members of SAEM, including attending physicians, residents, and other EM specialists. Table 1 summarizes the profile characteristics of the study group. The large majority of respondents previously had a research mentor; however, fewer than a third reported having undergone formal postdoctoral training other than residency. Most respondents reported that they currently were involved as a principal investigator or coinvestigator on a research project. Academic leaders in general, and research directors in particular, were more likely than other SAEM members to report current research activity, mentorial responsibility, or a history of funding.

Perceived Importance of Research Methodology Content Areas. Figure 1 shows that the large majority of respondents [494/551 (89.7%)] thought that knowledge of all 12 areas was "somewhat" or "very" important for enabling an academic faculty member to successfully perform research. The median aggregate importance rating was 33 for the population as a whole; the Q1 (25th percentile) and Q3 (75th percentile) values were 30 and 35, respectively (IQR = 5). No significant difference was identified among subgroups. The areas most frequently rated as "very" important by the respondents were "identifying a problem or constructing a hypothesis" [n = 512 (92.9%)] and "study planning" [n = 494 (89.7%)].

Reported Knowledge of Content Areas. Figure 2 illustrates that many respondents reported "little knowledge or experience" in specific areas of research methodology. Low knowledge levels most frequently were reported for "research proposal writing" [n = 205 (37.2%)], "decision analysis" [n = 204 (37.0%)], and "statistics" [n = 203 (36.8%)]. The median aggregate knowledge rating was 24 for the entire population; Q1 and Q3 values were 20 and 30, respectively (IQR = 10). Not surprisingly, when the subgroups were compared pairwise on this variable according to academic position, academic chairs, residency directors, and research directors


(p < 0.005, p < 0.001, and p < 0.001, respectively), each reported significantly higher knowledge levels across the 12 content areas as compared with other SAEM members (Fig. 3). While aggregate knowledge ratings were highest for research directors, only 19 (16.7%) of these respondents indicated that they were "very knowledgeable" in each of the content areas surveyed; moreover, only 70 (61.4%) reported this level of knowledge in half or more of these areas. Low knowledge levels among this subgroup were reported most frequently for "decision analysis," "statistics," and "application of alternative study designs to control bias," with 32 (28.1%), 11 (9.6%), and ten (8.8%) of these respondents indicating that they had "little knowledge or experience" in these respective areas.

Research Methodology Training Priorities. Five hundred nineteen of the 551 (94.2%) respondents identified research training priorities in 19 areas, for a total of 1,537 responses. As indicated in Table 2, the vast majority (69.7%) thought that training in study planning was most important for enhancing research productivity, and more than a third of those responding identified problem identification/hypothesis construction and research proposal development as additional key areas for training. These areas, in the same order, also were listed most frequently as "first priorities." Very few respondents (n = 7) identified content area training priorities other than those listed in the survey. Subgroup comparisons revealed no significant difference among academic chairs, residency directors, research directors, and other SAEM members regarding identified priorities or the relative importance (i.e., ranking) of identified training priorities.
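The aggregate rating construction described in Methods (the sum of 12 area ratings, each scored 1-3, giving a 12-36 range) and the median/IQR summaries reported above can be sketched with the standard library. The respondent vectors below are invented for illustration; they are not the study's responses.

```python
# Sketch of the paper's aggregate scoring: each respondent rates 12
# content areas on a 1-3 scale; the per-respondent sum (12-36) is
# summarized by median and interquartile range. Data are hypothetical.
from statistics import quantiles

respondents = [
    [3, 2, 3, 1, 2, 3, 2, 1, 3, 2, 2, 3],  # one respondent's 12 area ratings
    [2, 2, 2, 2, 1, 1, 3, 2, 2, 1, 2, 2],
    [3, 3, 3, 3, 2, 3, 3, 2, 3, 3, 3, 3],
    [1, 1, 2, 1, 1, 2, 1, 1, 2, 1, 1, 2],
    [2, 3, 2, 2, 3, 2, 2, 3, 2, 2, 3, 2],
]

aggregates = [sum(r) for r in respondents]
assert all(12 <= a <= 36 for a in aggregates)  # theoretical bounds

q1, med, q3 = quantiles(aggregates, n=4)  # quartile cut points
print("aggregate scores:", sorted(aggregates))
print(f"median = {med}, Q1 = {q1}, Q3 = {q3}, IQR = {q3 - q1}")
```

Note that `statistics.quantiles` defaults to the "exclusive" quartile method; SPSS (which the authors used) computes percentiles slightly differently, so cut points on small samples may not match exactly.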

DISCUSSION

To our knowledge, these findings constitute the first report of the opinions of a national sample of academic EPs and their leaders regarding the importance of selected aspects of research methodology, their knowledge of these areas, and their associated training priorities. The views expressed by the respondents affirm the previous assertions of experts in the field5–9 concerning the need to further educate members of our specialty in fundamental aspects of the research process. The high level of perceived importance attributed to all of the concept and skill areas included in this survey is striking, as is the uniformity of opinion among academic chairs, residency directors, research directors, and other EPs. While the content areas surveyed are not exhaustive, they reflect a broad range of concepts that, our respondents believe, should be understood by those who wish to be able to perform research successfully. Notably, all of these topics are included in the "model research curriculum for emergency medicine," reported in 1992 by the SAEM Research Committee,6 and nine of the 12 also are included within the list of objectives identified for EM resident research in that document.

Figure 3. Aggregate knowledge ratings by subgroup. Aggregate ratings are calculated as the sum of individual content ratings across content areas 1–12 (see Fig. 1). Distributions are represented as box plots. Tops and bottoms of "boxes" are 75th and 25th percentile values, respectively; horizontal bars are median values; lengths reflect interquartile ranges. Uppermost and lowermost crossbars, respectively, denote maximum and minimum values. Global p-value reflects differences across all subgroups.

TABLE 2. Research Methodology Training Priorities

| Content Area | Response Frequency | % Total Responses* |
|---|---|---|
| Study planning | 362 | 69.7 |
| Identifying a problem/constructing a hypothesis | 214 | 41.2 |
| Writing a successful research proposal | 198 | 38.2 |
| Writing a successful research manuscript | 139 | 26.8 |
| Procedures for generating reliable research data | 127 | 24.5 |
| Performing a critical analysis of the literature | 118 | 22.7 |
| Use of statistics for estimation and hypothesis testing | 108 | 20.8 |
| Recognizing sources of bias in research designs | 105 | 20.2 |
| Application of valid alternative study designs | 71 | 13.7 |
| Decision analysis | 50 | 9.6 |
| Rationale for/techniques of data documentation | 21 | 4.0 |
| Research ethics | 17 | 3.3 |
| Other | 7 | 1.3 |

*Based on 519 respondents; priorities 1–3 are analyzed as a single response set.


The majority of respondents reported previous access to a mentor and current involvement in research as either a principal investigator or a coinvestigator. Nonetheless, more than one third viewed their own knowledge as suboptimal in a number of content areas that were regarded as important for success in research. This finding supports the often-voiced claim that EPs, like their counterparts in other specialties,24 – 27 have inadequate training in critical areas of research methodology.28 Not surprisingly, research directors reported the highest knowledge ratings. Even so, many indicated that they had limited understanding of several analytic areas. The fact that knowledge was self-reported makes these findings even more noteworthy, since an individual’s self-reported ability often is inflated.29 Two of the three training priorities most frequently identified by respondents (problem identification/hypothesis construction and study planning) are among the most fundamental skills required to properly launch a project.30 – 33 The ability to write a successful research proposal (the third top priority) clearly is critical both to generating meaningful results and to obtaining funding. Other topics (e.g., those involving statistical reasoning, application of study designs, and data management) also have intrinsic importance to the research process16,30 – 32; however, these were identified less frequently as training priorities, even though many respondents reported limited knowledge of these areas. Perhaps this is because they are relatively abstract and/or technical subjects,34,35 whose practical value is best appreciated after the investigator has succeeded in defining the research problem and has determined the inquiry to be feasible and/or potentially fundable. 
However, most respondents reported that they were not very knowledgeable about the essentials of study initiation and planning; thus, many may have some difficulty in progressing beyond the early critical phases of research. These limitations, which also have been noted in other specialties,30 may explain why the top three training priorities identified in this study were selected more frequently than other areas in which reported knowledge was even lower. Nonetheless, research methodology curricula in EM,11,12,14 like those developed for other disciplines,36–40 have heavily emphasized measurement theory, statistical inference, experimental design, and related topics. As a result of this emphasis, relatively few contact hours typically have been allocated to the principles involved in problem development, study planning, or writing a successful research proposal. Our findings support the continued need to teach a broad range of topics. However, they also suggest that research productivity may be further enhanced if current curricula are modified to give greater weight to the initial stages of an investigation and to grantsmanship skills.

LIMITATIONS AND FUTURE QUESTIONS

Our response rate was high, but not all of those who were surveyed returned completed instruments. To evaluate the potential impact of response bias, we compared "late" respondents (who have been found to resemble nonrespondents)41 with "early" respondents on profile characteristics, ratings, and identified priorities. No difference was found, strengthening the inference that our findings, in all likelihood, are based on a representative sample. Though piloted for clarity and ease of completion, the survey instrument was not standardized, and test–retest reliability could not be evaluated due to resource constraints. Use of a three-point scale may have limited response options; however, restrictive scaling has been found to enhance reliability and minimize task difficulty when respondents must address questions that they may not have previously considered.42 Additionally, knowledge was self-reported and may not accurately reflect true understanding or correlate well with more objective performance indicators; nonetheless, such perceptions can influence educational choices, and thus merit consideration. Finally, the survey was designed to be brief to optimize response rates. This limited the number of topics that could be examined and restricted our ability to distinguish between the respondents' own training priorities and those deemed to be important for their departments. Some respondents, particularly those entrusted with directing research, may well have learning priorities that differ from those of others in their departments; these, however, cannot be identified from the current data. Also, due to space limitations, we did not include questions about the value of various pedagogical approaches that could be used to teach specific concepts, the optimal timing for such instruction, or the most appropriate environment for learning these concepts. These issues remain open to future study.
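The precision figure quoted in Methods (an error of at most ±4% with 95% confidence, assuming an 80% return on 677 mailed surveys) can be sanity-checked with the standard normal-approximation margin of error for a proportion. This is an illustrative back-of-the-envelope sketch, not the authors' actual design calculation, and it uses the worst-case p = 0.5.

```python
# Margin of error (95% CI half-width) for an estimated proportion,
# using the normal approximation with worst-case p = 0.5.
import math

Z_95 = 1.96  # two-sided 95% critical value of the standard normal

def margin_of_error(n, p=0.5, z=Z_95):
    """Half-width of the z-based confidence interval for a proportion."""
    return z * math.sqrt(p * (1 - p) / n)

mailed = 677
expected_n = round(mailed * 0.80)  # assumed 80% response rate
print(f"expected respondents: {expected_n}")
print(f"worst-case margin of error at n={expected_n}: ±{margin_of_error(expected_n):.1%}")
print(f"achieved sample (n=551): ±{margin_of_error(551):.1%}")
```

The result lands near the ±4% the authors cite for whole-population estimates; subgroup estimates, with smaller n, would of course carry wider intervals.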

CONCLUSIONS

This study demonstrates that the overwhelming majority of academic physicians we sampled believe that understanding diverse concepts in research methodology is important to their ability to successfully conduct research. Despite the fact that academic leaders in general, and research directors in particular, report greater knowledge of these concepts than other academic EPs, unmet learning needs appear to exist among all subgroups. Although research education efforts should continue to be comprehensive, there is marked agreement that problem identification, hypothesis construction, study planning, and research proposal development require special emphasis. These findings should be of interest to academic departments that must address their own training needs and may assist those who are responsible for developing research methodology curricula on regional and national levels.

The authors express their sincere gratitude to the many academic chairs, residency directors, research directors, and other SAEM members who completed this survey; thank Nina Field for her tireless efforts in data collection and computer data entry; and gratefully acknowledge Sheldon Jacobson, MD, Chair, for allocating departmental resources in support of this study.

References

1. Josiah Macy, Jr Foundation. The role of emergency medicine in the future of American medical care. Ann Emerg Med. 1995; 25:230–3.
2. Bowles LT. Macy Foundation Report on emergency medicine: further comment more than one year later. Acad Emerg Med. 1995; 2:1103–8.
3. Research Directions Conference. Research directions in emergency medicine. Ann Emerg Med. 1996; 27:339–42.
4. Biros MH. Our basic need for basic science research [commentary]. Acad Emerg Med. 1997; 4:652–3.
5. University Association for Emergency Medicine. Report of the Task Force on Academic Affairs in Emergency Medicine. Ann Emerg Med. 1988; 17:746–9.
6. Cline D, Henneman P, Van Ligten P. A model research curriculum for emergency medicine. Ann Emerg Med. 1992; 21:184–92.
7. Olson JE, Hamilton GC, Angelos MG, Singer JI, Ellers ME, Gaddis M. Objectives to direct the training of emergency medicine residents on off-service rotations: research. J Emerg Med. 1992; 10:631–6.
8. Accreditation Council for Graduate Medical Education. Special requirements for residency training in emergency medicine: Section II. In: Directory of Residency Training Programs, 1993–94. Chicago, IL: American Medical Association, 1994.
9. Manning JE. Research in emergency medicine. N C Med J. 1997; 58:296–9.
10. Brautigan MW. A systematic approach to research curricula for emergency medicine residencies. J Emerg Med. 1984; 1:459–64.
11. Jones J, Dougherty J, Cannon L, Schelble D. Teaching research in the emergency medicine residency curriculum. Ann Emerg Med. 1987; 16:347–53.
12. Gold I, Jayne HA. Development and evaluation of a one-month research track in emergency medicine for medical students. Ann Emerg Med. 1987; 16:686–8.
13. Whitley TW, Spivey WH, Abramson NS, et al. A basic resource guide for emergency medicine research. Ann Emerg Med. 1990; 19:1306–9.
14. Rydman RJ, Zalenski RJ, Fagan JK. An evaluation of research training in a large residency program. Acad Emerg Med. 1994; 1:448–53.
15. Jouriles NJ, Cordell WH, Martin DR, Wolfe R, Emerman CL, Avery A. Emergency medicine journal clubs. Acad Emerg Med. 1996; 3:872–8.
16. Lewis RJ. Statistical methodology and effective emergency medicine: what is the connection? [commentary]. Acad Emerg Med. 1996; 3:824.
17. Fraker LD, Orsay EM, Sloan EP, Bunney EB, Holden JA, Hart RG. A novel curriculum for teaching research methodology. J Emerg Med. 1996; 14:503–8.
18. Halperin EC, Byyny RL, Moore S, Monahan PS. What medical schools and universities can learn from each other. Acad Med. 1995; 70:879–83.
19. Holloway RL, Bland CJ, Schmitz CC, Withington AM. An advanced research seminar series for family medicine faculty members. Fam Med. 1988; 20:338–42.
20. Henley JE, Anema MG. Curriculum assessment using the essentials of college and university education for professional nursing. Nurse Educ. 1989; 14:18–20.
21. Astin A. Assessment for Excellence: The Philosophy and Practice of Assessment and Evaluation in Higher Education. New York: American Council on Education and Macmillan Publishing Co., 1991.
22. El-Guebaly N, Atkinson M. Research training and productivity among faculty: The Canadian Association of Professors of Psychiatry and the Canadian Psychiatric Association Survey. Can J Psychiatry. 1996; 41:144–9.
23. Orlander JD, Callahan CM. Fellowship training in academic general internal medicine: a curriculum survey. J Gen Intern Med. 1991; 6:460–5.
24. Wulff HR, Andersen B, Brandenhoff P, Guttler F. What do doctors know about biostatistics? Stat Med. 1987; 6:3–10.
25. Leyland AH, Pritchard CW. What do doctors know of statistics? [letter; commentary]. Lancet. 1991; 337:679.
26. Weiss ST, Samet JM. An assessment of physician knowledge of epidemiology and biostatistics. J Med Educ. 1980; 56:692–7.
27. Goering P, Strauss JS. Teaching clinical research: what clinicians need to know. Am J Orthopsychiatry. 1987; 57:418–23.
28. Cooke M, Wilson S. Obstacles to research in A&E [letter]. J Accid Emerg Med. 1997; 14:269.
29. Howard GS, Schmeck RR, Bray JH. Internal invalidity in studies employing self-report instruments: a suggested remedy. J Educ Meas. 1979; 10:305–15.
30. Marks RG. Designing a Research Project. Belmont, CA: Lifetime Learning Publications, 1982.
31. Hulley SB, Cummings SR. Designing Clinical Research: An Epidemiological Approach. Baltimore, MD: Williams & Wilkins, 1988.
32. Andersen B. Methodological Errors in Medical Research. Oxford: Blackwell Scientific Publications, 1990.
33. Leedy PD. Practical Research: Planning and Design. New York: Macmillan Publishing Co., 1980.
34. Sahai H. Some comments on teaching biostatistics in medical and health sciences. Methods Inf Med. 1990; 29:41–3.
35. Appleton DR. What statistics should we teach medical undergraduates and graduates? Stat Med. 1990; 9:1013–21.
36. Raphael B, Dunne M, Byrne G. A research seminar for doctoral candidates in psychiatry. Aust N Z J Psychiatry. 1990; 24:207–13.
37. Penchansky R, Landis JR, Brown MB. Education for clinical research: an experiment at the University of Michigan. Clin Res. 1988; 36:21–32.
38. Lehoytay DC, Dugas M, Levey GS. A program for training physician-investigators. J Med Educ. 1982; 57:602–8.
39. Fisher S, Bender S. A program of research training in psychiatry: ten-year evaluation and follow-up. Am J Psychiatry. 1975; 132:821–4.
40. Fraser RC. Research methods in general practice. J R Coll Gen Pract. 1969; 17:385–7.
41. Oppenheim NA. Questionnaire Design and Attitude Measurement. New York: Basic Books, 1966.
42. Rodeghier M. Surveys with Confidence. Chicago, IL: SPSS Inc., 1996.