Aerospace Engineering Report and Appendices

Program Self-Study Report for the Degree of Bachelor of Science in

AEROSPACE ENGINEERING Submitted by the

COLLEGE OF ENGINEERING

UNIVERSITY OF ARIZONA to the

ENGINEERING ACCREDITATION COMMISSION Accreditation Board for Engineering and Technology, Inc. 111 Market Place, Suite 1050 Baltimore, MD 21202-4012

TABLE OF CONTENTS

A. BACKGROUND INFORMATION
1. Degree Title ................................................................................ 1
2. Program Modes ............................................................................... 1
3. Actions to Correct Previous Deficiencies .................................................... 1

B. ACCREDITATION SUMMARY
1. Students .................................................................................... 3
   1.1. Admission .............................................................................. 3
   1.2. Evaluation of Students: Mathematics, Writing Placement, and Writing Assessment ........ 3
        1.2.1. Mathematics ..................................................................... 3
        1.2.2. Writing Placement ............................................................... 4
        1.2.3. Writing Assessment .............................................................. 4
   1.3. Advising and Monitoring/Mentoring ...................................................... 4
        1.3.1. Freshmen Registration/Orientation ............................................... 7
        1.3.2. Transfer Student Orientation: Transfer Credit ................................... 8
        1.3.3. AME Advising Procedure .......................................................... 8
        1.3.4. Student Evaluation, Monitoring/Mentoring ........................................ 9
        1.3.5. Requirements for Graduation ..................................................... 9
   1.4. Career Services ........................................................................ 10
   1.5. Quality of Students .................................................................... 10
2. Program Educational Objectives .............................................................. 13
   2.1. Missions and Objectives as Enunciated by the University, the College,
        and the Department ..................................................................... 13
        2.1.1. The Mission of The University of Arizona ........................................ 13
        2.1.2. The Mission of the College of Engineering ....................................... 13
        2.1.3. The Strategies/Educational Objectives of the College of Engineering ............. 14
        2.1.4. The Mission of the Department of Aerospace and
               Mechanical Engineering .......................................................... 14
        2.1.5. Overview of the Educational Objectives of the Department
               and Program ..................................................................... 14
        2.1.6. Detailed and Specific Learning Outcomes of the AE Program ....................... 15
        2.1.7. Development of Educational Objectives and Learning Outcomes ..................... 15
   2.2. Constituencies and Stakeholders of the Program ......................................... 16
        2.2.1. Internal Constituencies ......................................................... 16
        2.2.2. External Constituencies ......................................................... 17

3. Program Outcomes and Assessment ............................................................. 19
   3.1. Program Outcomes ....................................................................... 19
   3.2. Assessment Process ..................................................................... 21
        3.2.1. Senior Exit Survey .............................................................. 32
        3.2.2. AME Letter of Solicitation ...................................................... 34
        3.2.3. Alumni Survey (by College of Engineering—COE) ................................... 34
        3.2.4. Evaluation of Senior Design Projects by Judges from Industry .................... 49
        3.2.5. Student Course/Instructor Evaluations ........................................... 49
        3.2.6. Fundamentals of Engineering Examination ......................................... 52
        3.2.7. Review and Assessment by Industrial Advisory Council
               (AME Advisory Board) ............................................................ 55
        3.2.8. Performance Assessment from Industry ............................................ 55
        3.2.9. Faculty (Undergraduate Studies Committee) Assessment of Curriculum .............. 58
        3.2.10. Academic Program Review ........................................................ 58
        3.2.11. Job Placement Data ............................................................. 60
        3.2.12. Life-Long Learning ............................................................. 61
        3.2.13. Summary of Assessment Results .................................................. 61
        3.2.14. Improvements as a Result of Assessment Activities .............................. 66
4. Professional Component ...................................................................... 71
   4.1. Basic-Level Curriculum for Aerospace Engineering ....................................... 71
   4.2. Preparation of Students for the Practice of Aerospace Engineering ...................... 71
   4.3. The Capstone Design Program ............................................................ 71
   4.4. The Technical Elective Program ......................................................... 72
   4.5. General Education ...................................................................... 72
   4.6. Distribution of Units .................................................................. 72
   4.7. Professional Organizations ............................................................. 72

5. Faculty ..................................................................................... 77
   5.1. Preliminary Comments ................................................................... 77
   5.2. Academic Credentials of the Faculty .................................................... 77
   5.3. Faculty Workload Summary ............................................................... 77
   5.4. Size of the Aerospace Engineering Faculty .............................................. 77
   5.5. Support of Aerospace Engineering by the Mechanical Engineering Faculty ................. 77
   5.6. Adequacy of the AME Faculty in the Technical Areas of the
        Aerospace Engineering Program .......................................................... 78
   5.7. Adequacy of the Faculty in Service to Undergraduates in the
        Aerospace Engineering Program .......................................................... 78

6. Facilities .................................................................................. 81
   6.1. The AME Building ....................................................................... 81
   6.2. Classrooms ............................................................................. 81
   6.3. Undergraduate Computer Laboratories .................................................... 81
   6.4. Instrumentation Laboratory ............................................................. 82
   6.5. Mechanical Engineering Laboratory (possible Technical Elective for AE) ................. 82
   6.6. The Machine Shop ....................................................................... 82
   6.7. The Design Studio ...................................................................... 82
   6.8. Campus Computer Facilities and Services Available to Undergraduate Students ............ 82
   6.9. Aerospace Engineering Laboratory ....................................................... 83

7. Institutional Support and Financial Resources ............................................... 85
   7.1. How the Budget is Determined ........................................................... 85
   7.2. Institutional Support in Achieving Objectives .......................................... 85
   7.3. Faculty Professional Development ....................................................... 85
   7.4. Facilities and Equipment (Acquisition, Maintenance, and Operation) ..................... 85
   7.5. Support Personnel ...................................................................... 85

8. Program Criteria ............................................................................ 87
   8.1. Curriculum ............................................................................. 87
   8.2. Faculty ................................................................................ 88

APPENDIX I—ADDITIONAL PROGRAM INFORMATION
A. TABULAR DATA FOR PROGRAM .................................................................... 90
   Table I-1. Basic-Level Curriculum ........................................................... 91
   Table I-2. Course and Section Size Summary .................................................. 93
   Table I-3. Faculty Workload Summary ......................................................... 95
   Table I-4. Faculty Analysis ................................................................. 96
   Table I-5. Support Expenditures ............................................................. 97

B. COURSE SYLLABI .............................................................................. 98
Required AME Courses
   AME 230   Thermodynamics .................................................................... 99
   AME 250   Dynamics .......................................................................... 100
   AME 300   Instrumentation Laboratory ........................................................ 101
   AME 301   Engineering Analysis .............................................................. 102
   AME 302   Numerical Methods ................................................................. 103
   AME 320   Aerodynamics ...................................................................... 104
   AME 321   Aircraft Performance .............................................................. 105
   AME 323   Gasdynamics ....................................................................... 106
   AME 324a  Mechanical Behavior of Engineering Materials ...................................... 107
   AME 324b  Engineering Component Design ...................................................... 108

   AME 331   Introduction to Fluid Mechanics ................................................... 109
   AME 401   Senior Aerospace Laboratory ....................................................... 110
   AME 420   Aircraft Conceptual Design ........................................................ 111
   AME 422   Aerospace Engineering Design ...................................................... 112
   AME 424   Introduction to Space Technologies ................................................ 113
   AME 425   Aerospace Propulsion .............................................................. 114
   AME 427   Stability and Control of Aerospace Vehicles ....................................... 115
   AME 428   Space Mission Conceptual Design ................................................... 116
   AME 455   Control System Design ............................................................. 117
   AME 463   Finite Element Analysis with ANSYS ................................................ 118
   AME 495s  Senior Colloquium ................................................................. 119
Required Non-AME Courses
   CE 214     Statics .......................................................................... 120
   CHEM 103a  Fundamentals of Chemistry I ...................................................... 121
   CHEM 103b  Fundamentals of Chemistry II ..................................................... 122
   CHEM 104a  General Chemistry I Laboratory ................................................... 123
   ECE 207    Elements of Electrical Engineering ............................................... 124
   ENGR 102   Introduction to Engineering ...................................................... 125
   MATH 125   Calculus I ....................................................................... 126
   MATH 129   Calculus II ...................................................................... 127
   MATH 223   Vector Calculus .................................................................. 128
   MATH 254   Differential Equations ........................................................... 129
   MSE 331L   Engineering Materials Laboratory ................................................. 130
   MSE 331R   Fundamentals of Materials for Engineers .......................................... 131
   PHYS 141   Introductory Mechanics ........................................................... 132
   PHYS 241   Introductory Electricity and Magnetism ........................................... 133
Technical Electives
   AME 195D  Our Future in Space and Space in Our Future ....................................... 134
   AME 412a  Mechanical Engineering Design ..................................................... 135
   AME 413a  Mechanical Engineering Design Laboratory I ........................................ 136
   AME 412b  Mechanical Engineering Design ..................................................... 137
   AME 413b  Mechanical Engineering Design Laboratory II ....................................... 138
   AME 416   Material Selection ................................................................ 139
   AME 430   Intermediate Thermodynamics ....................................................... 140
   AME 431   Numerical Methods in Fluid Mechanics and Heat Transfer ............................ 141
   AME 432   Heat Transfer ..................................................................... 142
   AME 433   Intermediate Fluid Mechanics ...................................................... 143
   AME 440   Energy Utilization and Management ................................................. 144
   AME 443   Power Systems Analysis ............................................................ 145
   AME 445   Renewable Energy Systems .......................................................... 146
   AME 452   Computer Aided Analysis and Design of Mechanical Systems .......................... 147
   AME 454   Optimal Control of Parametric Systems ............................................. 148
   AME 460   Mechanical Vibrations ............................................................. 149
   AME 462   Composite Materials ............................................................... 150
   AME 466   Biomedical Engineering ............................................................ 151
   AME 472   Reliability Engineering ........................................................... 152
   AME 473   Probabilistic Mechanical Design ................................................... 153
   AME 474   Reliability and Quality Analysis .................................................. 154

   AME 489    Engineering Properties and Micro/Nano Technologies for
              Biological Systems ............................................................... 155
   BME 410    Biology for Biomedical Engineering ............................................... 156
   BME 411    Physiology for Biomedical Engineering ............................................ 157
   BME 416    Principles of Biomedical Engineering ............................................. 158
   BME 417    Measurement and Data Analysis in Biomedical Engineering .......................... 159
   ECE 442    Digital Control Systems .......................................................... 160
   ENGR 498a  Cross-Disciplinary Design ........................................................ 161
   ENGR 498b  Cross-Disciplinary Design ........................................................ 162
   MSE 110    Introduction to Solid State Chemistry ............................................ 163
   PTYS 403   Physics of the Solar System ...................................................... 164

C. FACULTY CURRICULUM VITAE .................................................................... 165
   Ara Arabyan ................................................................................. 167
   R. Reid Bailey .............................................................................. 169
   Thomas F. Balsa ............................................................................. 171
   Francis H. Champagne ........................................................................ 173
   Cho Lik Chan ................................................................................ 175
   Weinong (Wayne) Chen ........................................................................ 177
   Eniko T. Enikov ............................................................................. 179
   Hermann F. Fasel ............................................................................ 181
   Barry D. Ganapol ............................................................................ 183
   Juan C. Heinrich ............................................................................ 185
   Jeffrey W. Jacobs ........................................................................... 187
   Dimitri B. Kececioglu ....................................................................... 189
   Edward J. Kerschen .......................................................................... 191
   Oleg A. Likhachev ........................................................................... 193
   Erdogan Madenci ............................................................................. 195
   John J. McGrath ............................................................................. 197
   Parviz E. Nikravesh ......................................................................... 199
   Alfonso Ortega .............................................................................. 201
   Kumar Ramohalli ............................................................................. 203
   Sergey V. Shkarayev ......................................................................... 205
   Bruce R. Simon .............................................................................. 207
   Anatoli Tumin ............................................................................... 209
   John G. Williams ............................................................................ 211
   Israel J. Wygnanski ......................................................................... 213
   Yitshak Zohar ............................................................................... 215

APPENDIX II—INSTITUTIONAL PROFILE (from Dean’s office)
   Background Information Relative to the Institution .......................................... 218
   Background Information Relative to the Engineering Unit ..................................... 221
   Tabular Data for Engineering Unit ........................................................... 238

A. BACKGROUND INFORMATION

1. Degree Title
Bachelor of Science in Aerospace Engineering.

2. Program Modes
The Department of Aerospace and Mechanical Engineering (AME) offers all classes and laboratory sessions between 8:00 am and 5:00 pm, Monday through Friday. The University is on a semester system, with short winter and somewhat longer summer sessions. Graduation is possible at the end of the spring (May), second summer (August), and fall (December) semesters.

3. Actions to Correct Previous Deficiencies
No deficiencies were identified during the 1998-99 ABET visit, which used the ABET 2000 accreditation criteria. The AME department was one of the first to embrace the (then) new criteria. The Aerospace Program (AE) was fully accredited. The Final Statement of the reviewer observed the following:

•  “The students in the aerospace engineering program appear interested and supportive of the program.”

•  “. . . the aerospace program appears to have appropriate educational objectives.”

•  “The faculty members . . . are very well qualified. There appears to be a good balance between those faculty members devoted primarily to teaching and those with a strong research incentive.”

•  “The new building and associated equipment resources provide outstanding support for program objectives and they provide an atmosphere conducive to learning.”

•  “The [AME] leadership appears to have been able to manage limited resources effectively.”

The reviewer also identified some concerns under ABET 2000. As the AME department was one of the first to undergo accreditation under these new and radically different guidelines, not all aspects of the criteria with respect to educational outcomes, assessment, and constituencies were fully in place. We briefly review the considerable progress made since the last review to address these issues:

•  During the 1998-99 visit, the educational outcomes were the same as ABET Criteria 3, a-k. Although this was considered to be a “good first step,” focus and refinement were needed. We believe this has been accomplished, as documented in this Self-Study Report. The most recently adopted educational objectives and learning outcomes were discussed at the college and department levels, and were approved by the faculty and the Industrial Advisory Council of the College of Engineering.


•  The constituencies of the AME department have been clearly identified and contacted via surveys. These activities are described in Section B.3. A new AME Advisory Board has been appointed to replace the previous Industrial Advisory Council. Information is documented in Section B.2.2.2.

•  The assessment loop is fully closed. Each constituency provides timely and periodic feedback on the academic program. The information is discussed at faculty meetings, Industrial Advisory Council meetings (and with the new AME Advisory Board), and at departmental committee meetings (primarily the Undergraduate Studies Committee and the ABET Committee) in order to evaluate and incorporate the recommendations into the educational objectives and curriculum.

The reviewer also noted potential concerns regarding Student Advising and a “missing” course, Stability and Control of Aerospace Vehicles. The latter is easy to address:

•  AME 427, Stability and Control, has been a required course in the AE program since fall 1998. It is offered once per year, and the textbook is that by Etkin and Reid. The history of why this course was dropped from the requirements is long and convoluted; the reasons are documented in the “due process response” to the 1998-99 ABET visit. The issue is completely moot at this time.

•  Prior to 2003, student advising had been performed using several models. These included using a Ph.D. student as the principal advisor and, more recently, a part-time (20 hours per week) professional (Adjunct Professor) as the principal undergraduate advisor. These student advisors reported directly to the Department Head. They were assisted by the undergraduate secretary, another staff member (who handled Advanced Standing), and faculty members (who typically provided technical advice and career guidance), one each for Freshmen, Sophomores, Juniors, and Seniors. As a result of feedback in the assessment process (from students, alumni, and faculty), it was determined that the effectiveness of student advising required improvement. To address this, the new Department Head (McGrath) created a new administrative structure with two Associate Heads, including an Associate Head for Undergraduate Studies. This person (Frank Champagne) is a Full Professor who has won undergraduate teaching awards. His appointment represents a department commitment emphasizing the importance of the undergraduate program in general and student advising in particular. He heads the undergraduate advising team, consisting of a part-time (20 hours per week) Ph.D. student trained as an advisor, an undergraduate secretary, another staff member (who handles Advanced Standing), and four other faculty members who serve as class advisors. Dr. Champagne is Chair of the Undergraduate Studies Committee. This new administrative structure also responds to feedback from the University of Arizona Academic Program Review report (December 2001), which recommended that “. . . it may be more effective to structure the department with two Associate Heads, . . . one can focus on the undergraduate programs while the other is responsible for the graduate programs,” and which observed that previously “. . . the limited scope of responsibility for the Associate Head of AME has not led to effective use of such a position.” The new administrative structure addresses both issues.


B. ACCREDITATION SUMMARY

1. Students
Excerpt from ABET Criteria: “The quality and performance of the students and graduates are important considerations in the evaluation of an engineering program. The institution must evaluate, advise and monitor students to determine its success in meeting program objectives.”

The College of Engineering is taking many strides to promote the field of engineering. Through visits to elementary, junior high, and high schools, as well as numerous events on campus, Engineering Ambassadors, faculty, and staff educate young people on the many opportunities in engineering. In addition, the College partners with other University recruitment programs, such as the Multicultural Engineering Program (MEP), the Arizona Mathematics, Engineering, and Science Achievement (MESA) program, and the Women in Science and Engineering (WISE) program (see Attachment 1.a for details). An AME faculty member (Ortega) was awarded the inaugural Southern Arizona MESA Distinguished Service Award for “excellence in outreach to the students and staff of MESA” in April 2004.

1.1. Admission

Admission to The University of Arizona (and hence to AME) may proceed along three different paths: direct admission from high school (in- or out-of-state) as a freshman, admission from a community college (in-state), and transfer from another university or community college (in- or out-of-state). The admission standards are dictated by the University; they may be found on the web (www.admissions.arizona.edu) and are summarized in Attachment 1.a. Each of these paths has its own advising challenge, the most difficult being the evaluation of transfer credits from an out-of-state institution. Every student is given a course grid (see Table 3.1.d) during the initial meeting with the AME advisor. This information is also available electronically at http://www.engr.arizona.edu/acadaff/curricula/034/ae.html. The general education, technical elective, and prerequisite requirements are also explained.

1.2. Evaluation of Students: Mathematics, Writing Placement, and Writing Assessment

1.2.1. Mathematics

Prior to taking any mathematics course below the level of Calculus II (in AE this is normally MATH 129), all students must take the UA Mathematics Readiness Test (MRT), offered by the Testing Office throughout the semester, usually three times per week, including during Freshmen and Transfer Orientations. This requirement applies to transfer students with or without college-level mathematics credit, and to students with credit by examination such as AP or CLEP. There are (almost) no exceptions. The MRT score determines the highest-level MATH course in which the student is eligible to register. The details of an MRT “waiver” are available on the web site of the Mathematics Department (http://math.arizona.edu/~krawczyk/freshmen/satrules.html).

Aerospace Engineering

Page 3

1.2.2. Writing Placement

First-year Composition courses are primarily concerned with writing at the University level. For that reason, for many years we administered a short essay test, which we combined with standardized test scores to make decisions about English placement. However, many students felt that a single writing sample was not a very reliable measure of their writing ability. Starting in summer 2000, therefore, a new procedure was developed that allows a placement to be made on the basis of data from high school records. The following information is considered in determining placement in English:
• UA admissions GPA (which includes those courses required for admission)
• GPA in English courses (freshman through junior year)
• The number of AP English and/or Honors English classes taken through the junior year
• SAT verbal and/or ACT English scores

If both SAT and ACT scores are available, the placement is determined using each score, and the higher of the two placements is given if there is a difference.

The English composition requirements are ENGL 101 and 102, and freshmen can immediately register for ENGL 101 if their placement, as determined by the English Department, is high enough. Students for whom English is a second language (ESL) have alternative courses available. Students who are exceptionally well prepared (as measured by the criteria above) may take honors classes.

1.2.3. Writing Assessment

Every undergraduate degree program requires satisfaction of the Mid-Career Writing Assessment (upon completion of 40 units toward the degree). This assessment, which replaced the Undergraduate Writing Proficiency Exam as of Summer 2002, is based on students’ performance in their second-semester English composition course. A grade of A or B in one of the following courses satisfies the University’s writing proficiency requirement:
• English 102, or
• English 108 (for ESL students), or
• English 104H (for Honors students), or
• English 109H (for students earning a 4 or 5 on the AP exam)

Transfer courses in composition are evaluated by the English Writing Program composition coordinators. More information is available at http://w3.arizona.edu/~writprog/students.htm. Aerospace Engineering students earning less than a B in their second-semester composition course must complete an additional English course with a grade of C or better: English 207 or English 308 (the latter may be used as a technical elective).
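The writing-requirement rules above amount to a short decision procedure. The following sketch is illustrative only; the function names and structure are ours, not part of any University system:

```python
# Sketch of the Mid-Career Writing Assessment rule described above.
# The course list and the A/B threshold come from the text; the function
# names are illustrative, not part of any UA system.
QUALIFYING_COURSES = {"ENGL 102", "ENGL 108", "ENGL 104H", "ENGL 109H"}

def satisfies_writing_requirement(course: str, grade: str) -> bool:
    """A grade of A or B in a qualifying composition course satisfies it."""
    return course in QUALIFYING_COURSES and grade in {"A", "B"}

def needs_extra_course(course: str, grade: str) -> bool:
    """Below a B, an additional course (ENGL 207 or ENGL 308, grade C
    or better) is required."""
    return not satisfies_writing_requirement(course, grade)
```

For example, a B in ENGL 102 satisfies the requirement, while a C in the same course triggers the additional-course rule.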

1.3. Advising and Monitoring/Mentoring

In this information age, much of the data, whether for long-term planning or day-to-day activities, are available on various web sites of the University, College, or Department. Students are encouraged and reminded to consult these sources of information frequently. The three things that students must keep their eyes (and minds) on are:
• Satisfying all prerequisites before enrolling in a course, i.e., following the curriculum grid.
• The advanced standing requirement (details below): minimum GPA, 2.5/4.0.
• The UA graduation requirement: minimum GPA, 2.0/4.0.


Both the advanced standing requirement and graduation GPAs serve as standard evaluations of students (see Table 1.3.a for a sample of GPAs at graduation). About 52% of the students graduate with a GPA > 3.000. The attrition rates for the College of Engineering are shown in Figure 1.3.a. We believe that these data are representative of AE. In general, the retention rate is about 83%; the average number of years to graduate is about 4.65 (Figure 1.3.b). All trends are encouraging in the sense that the retention rate has increased by 6% (Figure 1.3.a) and the number of years to graduate has decreased by 0.1 year (Figure 1.3.b). Enrollment statistics for Fall 1998-Spring 2004 are shown in Figure 1.3.c. Total enrollment has increased by 31% from 1998 to 2004.
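The enrollment-growth figure quoted above follows directly from the totals in Figure 1.3.c; a quick arithmetic check:

```python
# Total AE enrollment from Figure 1.3.c: Fall 1998 and Spring 2004.
fall_1998_total = 174
spring_2004_total = 228

growth_pct = (spring_2004_total - fall_1998_total) / fall_1998_total * 100
print(f"Enrollment growth, 1998 to 2004: {growth_pct:.0f}%")  # 31%
```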

Table 1.3.a Number of graduates in each GPA range for Aerospace Engineering majors. The last column gives the percentage of that calendar year’s graduates with GPA > 3.000.

Month/Year of  2.000-  2.501-  3.001-  3.501-  % with
Graduation     2.500   3.000   3.500   4.000   GPA > 3.000
5/98             0       5       0       3
8/98             0       1       0       0
12/98            0       1       4       0        50
5/99             0       5       1       0
8/99             2       0       0       0
12/99            1       1       2       1        31
5/00             3       4       1       2
8/00             0       0       0       0
12/00            1       2       2       1        38
5/01             1       3       5       4
8/01             0       0       1       0
12/01            0       2       1       0        65
5/02             0       6      10       1
8/02             0       1       1       0
12/02            3       2       0       1        52
5/03             0       7       4       2
8/03             0       0       3       0
12/03            0       1       3       3        65


[Figure: percentage of undergraduate degree-track students retained (vertical axis 74-88%), by fiscal year, 1998-2003.]

Figure 1.3.a Undergraduate degree-track students—annual percentage retained—College of Engineering.

[Figure: average number of years to graduate (vertical axis 4.58-4.72), by fiscal year, 1998-2003.]

Figure 1.3.b Undergraduate degree-track students—number of years to graduate—College of Engineering.


UNDERGRADUATE ENROLLMENT, AEROSPACE ENGINEERING

         F'98  S'99  F'99  S'00  F'00  S'01  F'01  S'02  F'02  S'03  F'03  S'04
Fresh      84    54    82    56    80    48    91    28    73    26    78    69
Soph       34    39    46    44    50    54    50    65    67    73    72    64
Jr         26    33    40    36    31    33    33    35    40    49    43    41
Sr         30    32    35    40    46    53    45    66    60    62    63    54
Totals    174   158   203   176   207   188   219   194   240   210   256   228

Figure 1.3.c Enrollments: Aerospace Engineering (Fall 1998-Spring 2004).

1.3.1. Freshmen Registration/Orientation

The Registration/Orientation program, organized by The University of Arizona, is designed to assist new students and their guest(s) by providing interaction with student leaders, other students, faculty advisors, and staff. A few pages from the booklet for the 2004 Freshmen Orientation sessions distributed by the University are included as Attachment 1.3.a.* The times and locations for sessions hosted by the College of Engineering are included as Attachment 1.3.b. Registration/Orientation programs last two days and offer new students an opportunity to learn about aspects of campus life so that they can enjoy a smooth transition into their new surroundings at The University of Arizona. Emphasis is placed on preparing students for academic success by providing:
• Placement examinations
• Description of curricula
• Meetings with advisors
• Registration for courses
• Description of available resources (financial, academic)
• Information on student chapters of professional societies, residence life, management of time and study skills, and recreation

A required course, ENGR 102, also provides a means for advising freshmen via departmental visits (tours) and open houses. The main (Monday) lecture of this course also deals with career planning. The last assignment, emphasizing report writing and communication, is a “design project” to create a career plan. For details, see the syllabus for this course in Appendix I.B.

* All attachments appear in a separate volume.


1.3.2. Transfer Student Orientation: Transfer Credit

These one-day orientation programs, organized by the College of Engineering, provide academic advising, registration, and general information on the AE program and the College of Engineering as transfer students begin their education in a new setting (see Attachment 1.3.b). Upon admission of a transfer student, the UA Registrar will transfer the courses from the sister institution. Once this transfer is made, it is the responsibility of the AME academic advisors to identify an equivalent course in the AE program, if warranted. In this endeavor, the Undergraduate Studies Committee or individual faculty members may be consulted. The situation is much simpler for transfers from Arizona community colleges or universities, for which course equivalency guides are available (Attachment 1.3.c). This assessment by the advisor(s) provides the information necessary to enable the student to register for courses in the AE program while still satisfying the prerequisites.

1.3.3. AME Advising Procedure

The goal of the advising team is to provide AE students with accurate and timely advising for their programs and needs. Between 1998 and 2003, an academic professional (Adjunct Professor, 20 hrs/week) was the principal contact for undergraduate advising. He reported directly to the Department Head during weekly meetings. The advisor was assisted by the undergraduate program assistant and the advanced standing coordinator. The Department Head took a proactive role in the issues related to advising. The faculty at large served as advisors on career and research questions. The undergraduate advisor often taught courses so he was very much a part of the educational scene in AME. In the Fall of 2003, the Department established an undergraduate advising team (Attachment 1.3.d; http://www.ame.arizona.edu/advising/advising.php) consisting of the Associate Head for Undergraduate Studies, a graduate student trained to be an undergraduate advisor, an assistant for the undergraduate program, and an Advanced Standing coordinator. A faculty mentor has also been assigned to each class: Freshmen (S. Shkarayev), Sophomores (E. Kerschen), Juniors (A. Tumin), and Seniors (W. Chen).

The advising team is organized as follows:

Prof. F. Champagne, Associate Head, Undergraduate Program
• Arvind Raman, Undergraduate Advisor
• Connie Spencer, Program Coordinator, Advanced Standing
• Becky Ruth, Student Academic Specialist
• Faculty Advisors: Prof. S. Shkarayev (AE Freshmen), Prof. E. Kerschen (AE Sophomores), Prof. A. Tumin (AE Juniors), Prof. W. Chen (AE Seniors)


The College of Engineering is also an important partner in advising (http://www.engr.arizona.edu/3_allhtm/stuadvise.htm). The Associate Dean for Academic Affairs fills a special role in this capacity, often dealing with students who have unique difficulties (see also Attachment 1.3.e). The College also provides assistance with senior degree checks. In addition, the College of Engineering provides many support services for students through the Engineering Academic Center and Probation Workshops (see Attachment 1.3.f). The milestones in advising are (http://www.engr.arizona.edu/2_AC/polici.htm):
• Fulfillment of First-Year Composition requirements (6-9 units, depending on grade)
• Fulfillment of General Education requirements (18 units)
• Fulfillment of AE required courses (98 units)
• Fulfillment of AE technical electives (6 units)
• Advanced standing
• Senior degree check
• Special problems (transfer courses, substitutions, etc.)
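The unit counts in these milestones are consistent with the 128-unit graduation total; a quick check, taking First-Year Composition at its 6-unit minimum:

```python
# Unit counts taken from the advising milestones above
# (First-Year Composition counted at its 6-unit minimum).
units = {
    "First-Year Composition": 6,
    "General Education": 18,
    "AE required courses": 98,
    "AE technical electives": 6,
}
print(sum(units.values()))  # 128, the total required for graduation
```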

The total number of units needed for graduation is 128. The College of Engineering requires that students be granted Advanced Standing to enroll in 300- and 400-level courses in the College. The College’s criteria for Advanced Standing are presented on-line at http://www.engr.arizona.edu/2_AC/polici.htm#advancedstanding. Students must complete all the required courses listed for the freshman and sophomore years of the Aerospace Engineering Program (see Attachment 1.3.g). For students in Aerospace Engineering, the minimum GPA needed to obtain Advanced Standing is 2.5.
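The two GPA gates (2.5/4.0 for Advanced Standing, 2.0/4.0 for graduation) can be illustrated with a small unit-weighted GPA calculation. The grade-point values and the sample record below are our assumptions for illustration, not official UA data:

```python
# Unit-weighted GPA on the 4.0 scale, with letter grades A-E.
# Assumed point values (A=4 ... E=0) and a hypothetical course record.
GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "E": 0.0}

def gpa(record):
    """record: list of (units, letter_grade) pairs."""
    total_units = sum(units for units, _ in record)
    points = sum(units * GRADE_POINTS[grade] for units, grade in record)
    return points / total_units

sample = [(4, "A"), (3, "B"), (3, "B"), (4, "C"), (3, "A")]
g = gpa(sample)
print(f"GPA {g:.2f}; advanced standing: {g >= 2.5}; graduation GPA met: {g >= 2.0}")
```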

1.3.4. Student Evaluation, Monitoring/Mentoring

• First and foremost, students are assigned a grade in every course. This grade is typically based on homework assignments, mid-term test(s), and a final examination (and possibly design projects). The course grade can take on values from A (excellent) to E (failure).
• Students are monitored and mentored throughout their careers, but especially prior to advanced standing and the senior degree check.
• Academic advising is available at several levels and can range from questions about specific courses to broad career and research issues.
• Students are encouraged to engage in research or independent study, which provide excellent venues for mentoring and monitoring.
• Students are encouraged to take an active role in professional societies (e.g., the student chapter of AIAA). These have faculty advisors (mentors) and typically involve some socialization and hands-on projects (http://www.ame.arizona.edu/student/student.php).

1.3.5. Requirements for Graduation

The requirements for each engineering degree are described by the university Student Information System in the form of a Student Academic Progress Report (SAPR) (http://www.arizona.edu/academic/oncourse/data/interface/uainfo.shtml), so that at any time students can see how the courses they have completed apply to degree requirements and what remains. Based on the SAPR, an automatic degree audit checks that all course requirements are met prior
to graduation. Any adjustments to the curricula are made under the direction of a faculty advisor, who verifies that all ABET criteria are still fulfilled when an adjustment is made. The SAPR is interpreted by each departmental faculty in the form of a matrix of required courses (http://www.engr.arizona.edu/2_AC/curric.htm) to guide students and advisors on the recommended sequencing, so that all prerequisites can be met and students can still graduate in a four-year period. The University requires a GPA of at least 2.0 for graduation. In addition, the College of Engineering requires that graduates have a major-average GPA of at least 2.0.
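At its core, the automatic degree audit described above is a comparison of required courses against completed ones. A minimal sketch follows; the course sets are illustrative only, not the actual AE requirements matrix:

```python
# Toy degree audit: which required courses remain?
# Course sets are illustrative, not the real AE requirements.
required = {"MATH 129", "ENGR 102", "ENGL 102", "AME 230", "AME 301"}
completed = {"MATH 129", "ENGR 102", "ENGL 102"}

remaining = sorted(required - completed)
if remaining:
    print("Requirements still outstanding:", remaining)
else:
    print("All course requirements met")
```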

1.4. Career Services

Students obtain degrees so that they can be hired for challenging and rewarding jobs. In this regard, the University Career Services Office (http://www.career.arizona.edu/) provides the following important services:
• Career fair—employers visit the UA and have “display booths” available for their companies so that students can learn about them
• Campus job interview programs
• Electronic career-search programs; career and corporate information
• Co-op and intern programs
• Resume critiques and referrals
• How-to-sell-yourself workshops (resume and interview skills, the do’s and don’ts)
• Special needs—services for students with disabilities
• Career Information Center (computing information, reference books, job listings, magazines, etc.)

In response to the Alumni Survey, the College of Engineering has formalized a student support system, “Launching Your Engineering Career.” Details can be found in Attachment 1.4.a.

1.5. Quality of Students

The AME department is very proud of its students. Some compete at the national level for prestigious honors and recognition. A short list of such students is given in Table 1.3.b. A list of elite graduate schools to which some of the recent graduates have been accepted is also provided.


Table 1.3.b Student accomplishments.

Damian Athey: SRC/URE (Semi-Conductor Research Corp./Undergraduate Research Experience) Fellowship (2002-03); Summer Internship at Raytheon (2003)
Thomas Bronson: Graduated 12/00; received a graduate assistantship in combustion at UC, Berkeley
Jeff Cheek: NASA Fellowship (UG)
Scott Clark: NSF/REU (Research Experience for Undergraduates) Fellowship; Boeing Fellowship; Summer Internship at Sandia National Labs (2004)
Sheila Czech: Expected graduation 5/04; GEM Fellowship (minority students) for graduate school; Summer Internship at Argonne National Lab
John Dirner: Graduated magna cum laude 5/01; received a graduate assistantship in combustion at UC, Berkeley
John Keffler: Graduated magna cum laude 12/01; NSF CATTS Fellowship (2004-05)
Elizabeth McBride: Graduated magna cum laude 5/97; Department of Defense Graduate Fellowship
Tiffany Miller: NSF/REU Fellowship
Michael Oddy: Graduated summa cum laude 5/98; Department of Defense Graduate Fellowship (1998-2000)
Gregg Radtke: Graduated summa cum laude 12/01; NSF Graduate Fellowship (2002-05)
Joshua Scott: NSF/REU Fellowship (2004)
Luis Tapia: NASA Fellowship (UG); graduated 5/01; Sloan Foundation Minority Graduate Fellowship (2002-04)
Joshua Tor: SRC/URE Fellowship (2002-04); Summer Internship at Intel (2003)

Elite graduate schools where AME graduates have been accepted:
• University of California, Berkeley
• California Institute of Technology
• University of Chicago
• Georgia Institute of Technology
• University of Illinois, Urbana-Champaign
• University of Pennsylvania
• Pennsylvania State University
• Purdue University
• Stanford University


2. Program Educational Objectives

2.1. Missions and Objectives as Enunciated by the University, the College, and the Department

Excerpt from the ABET Criteria: “Each engineering program for which an institution seeks accreditation or re-accreditation must have in place: (a) detailed published educational objectives that are consistent with the mission of the institution and these criteria (b) a process based on the needs of the program's various constituencies in which the objectives are determined and periodically evaluated (c) a curriculum and processes that prepare students for the achievement of these objectives (d) a system of ongoing evaluation that demonstrates achievement of these objectives and uses the results to improve the effectiveness of the program.”

The missions and general educational objectives of the University, the College of Engineering, and the Department of Aerospace and Mechanical Engineering are published on the Web (http://www.engr.arizona.edu/2_AC/acproassess.htm). They are listed here for ease of reference.

2.1.1. The Mission of The University of Arizona

To discover, educate, serve, and inspire. As a public land-grant institution, the University of Arizona provides an accessible environment for discovery where distinguished undergraduate, graduate, and professional education are integrated with world-class basic and applied research and creative achievement. The University prepares students for a diverse and technological world while improving the quality of life for the people of Arizona, the nation, and the world. The University of Arizona is among America's top research universities (based on NSF total research expenditure data) and is one of about 60 select institutions recognized by membership in the Association of American Universities.

University vision: To be a preeminent student-centered research university.

2.1.2. The Mission of the College of Engineering

Through excellence in education and research, and in partnership with industry, government, and the citizens of Arizona, we will:
• focus on improving service to our students and other customers
• emphasize fundamentals for lifelong learning
• lead in improving the nation's strategically important engineering technologies


2.1.3. The Strategies/Educational Objectives of the College of Engineering

Provide a world-class education for our students.
• Strive to provide a high-quality, broad-based education that will prepare students for productive careers in an increasingly diverse and technological society by ensuring that graduates have:
  - An ability to function on multi-disciplinary teams.
  - An understanding of professional and ethical responsibility.
  - An ability to communicate effectively.
  - The broad education necessary to understand the impact of engineering solutions in a global/societal context.
  - A knowledge of contemporary issues.
• Provide a foundation for lifelong learning to nurture personal and professional growth by ensuring that graduates have:
  - A recognition of the need for and an ability to engage in life-long learning.
• Base students' education on a knowledge of engineering and science tools appropriate to their disciplines by ensuring that graduates have:
  - An ability to apply knowledge of mathematics, science, and engineering.
  - An ability to design and conduct experiments, as well as to analyze and interpret data.
  - An ability to design a system, component, or process to meet desired needs.
  - An ability to identify, formulate, and solve engineering problems.
  - An ability to use the techniques, skills, and modern engineering tools necessary for engineering practice.
• Continuously improve the undergraduate academic programs in partnership with industry, alumni, and government by:
  - Attracting and enrolling an excellent and diverse student body.
  - Assisting students in gaining career experience and placement.
  - Maintaining an Industry Advisory Council that is actively involved in educational improvement.
  - Seeking the opinions of industrial and governmental partners and alumni relative to student outcomes and other educational matters.

2.1.4. The Mission of the Department of Aerospace and Mechanical Engineering

The mission of the Department of Aerospace and Mechanical Engineering of the University of Arizona is: to provide rigorous and challenging educational experiences at both the undergraduate and graduate levels; to conduct research of national and international repute while, in so doing, contributing to the economic development of the state; and to provide service to the University, the state, and the profession of engineering. In meeting this mission, we emphasize standards of the highest quality in teaching, research, and service.

2.1.5. Overview of the Educational Objectives of the Department and Program

The Educational Objectives of the Aerospace Engineering program are:
• Prepare a diverse student body for a professional career in Aerospace Engineering.
• Prepare students to enter graduate school in Aerospace Engineering and closely related fields.
• Provide a broad educational background and analytical problem-solving skills for successful transition to careers in other fields.
• Develop skills for clear communication and responsible teamwork, foster professional and ethical attitudes and a sense of social responsibility, and instill a passion for life-long learning.

2.1.6. Detailed and Specific Learning Outcomes of the AE Program

Recognizing the need for learning outcomes that can be measured, the following specific Learning Outcomes are defined in accordance with ABET Criteria 3(a) through 3(k), as indicated by the square brackets.
• Can integrate knowledge of mathematics, science, and engineering to model and analyze problems. [3a-c, 3e]
• Can use state-of-the-art resources to solve engineering problems. [3a-c, 3e, 3i, 3k]
• Can apply engineering knowledge to design and build processes and systems. [3a-c, 3e, 3k]
• Can plan experiments, analyze data, and interpret results. [3a-c, 3e, 3k]
• Can communicate effectively (oral and written). [3c, 3d, 3g, 3h]
• Can function in multidisciplinary teams. [3c, 3d, 3f-h, 3j]
• Can exercise professional, ethical, and social responsibilities, and engage in life-long learning. [3f, 3h-k]

These outcomes will be discussed in detail in Section B.3.1. The AME AE curriculum is also designed to satisfy the ABET Program Criteria for Aerospace Engineering as defined by the professional societies:
• Aeronautical engineering graduates must have demonstrated knowledge of aerodynamics, aerospace materials, structures, propulsion, flight mechanics, and stability and control.
• Astronautical engineering graduates must have demonstrated knowledge of orbital mechanics, space environment, attitude determination and control, telecommunications, space structures, and rocket propulsion.
• Aerospace engineering graduates, or graduates of other engineering programs combining aeronautical engineering and astronautical engineering, must have demonstrated knowledge covering one of the areas—aeronautical engineering or astronautical engineering as described above—and in addition must demonstrate knowledge of some topics from the area not emphasized.
• Graduates must also have demonstrated design competence, which includes integration of aeronautical or astronautical topics.

2.1.7. Development of Educational Objectives and Learning Outcomes

The educational objectives and learning outcomes were established consistent with the ABET 2000 criteria. In the previous ABET report (1998-1999 accreditation cycle), which was also based on the ABET 2000 EAC criteria, there were eleven educational objectives, and they were identical to the learning outcomes. These educational objectives generally followed ABET Criteria 3(a)-(k). During the spring of 2003, these objectives were reviewed by the College of Engineering and AME ABET
Committees and reduced to a much smaller set that defines the overarching educational objectives. In essence, the new objectives explicitly state the four main educational objectives; the details are still contained in the learning (program) outcomes. In contrast to the previous ABET review, we now much more clearly distinguish the overall educational objectives from the learning outcomes. The departmental ABET Committee and the Department Head carried out several iterations of the modified educational objectives and learning outcomes in order to refine them for the AE program. The final version was discussed at two faculty meetings and approved by the members. Input from industry was sought through a survey, and the new objectives and outcomes will be discussed by the new AME Advisory Board.

2.2. Constituencies and Stakeholders of the Program

The Department, with input from the faculty, the Department Head, and the College, has determined that the constituencies of the AE program shall be: students (and their families) and alumni of the program, faculty members, employers of our graduates, the Arizona taxpayers, and benefactors of the Department. All of these stakeholders subscribe to the basic premise of any good educational program, namely, that the graduates shall be mature and responsible citizens of the highest ethical standards who are able to advance the engineering profession and to contribute meaningfully to the technical and economic growth of our society. These contributions shall be made in accordance with the specific educational objectives stated above. A significant part (about 38%) of the financial support for the University is provided by the State; the rest comes from research grants, tuition, and gifts. The educational objectives and the corresponding learning outcomes were developed in conjunction with the constituents. The curriculum is designed to achieve the learning outcomes. Assessment of the program is performed regularly (see Section B.3.2 for the assessment process) in order to review the educational objectives and learning outcomes. The Undergraduate Studies Committee is responsible for reviewing all assessment results, interacting with all the constituents, and initiating an improvement plan. Communication with the constituencies is maintained as follows:

2.2.1. Internal Constituencies

Students: They are heard informally throughout the academic year. Such input is communicated to the faculty, administrators, and staff. Specifically, they can file petitions, request funding for projects, suggest changes in the operation of the machine shop or computer laboratories (such as keeping these open during off-hours or on weekends), or communicate with the faculty via their student organizations, such as the AIAA Student Chapter. It is believed that the Department listens to its students and has an excellent rapport with them. The new Bylaws call for two students to serve on the Undergraduate Studies Committee. A new Undergraduate Advisory Committee will be created in Fall 2004. This committee will be composed of 12 undergraduate students. The committee will meet with the AME Advisory Committee and Department Head at least once per year.

Faculty: A list of faculty committees and their current membership is given in Attachment 2.2.a. The committees are formed according to the Department Bylaws (see Attachment 8.2.a). Faculty meetings are held about once per month, but over the past two years they have been held more frequently. A typical agenda is given in Attachment 2.2.b, along with the minutes for the
meeting. The issues discussed are multi-faceted, and the chairs of most committees give brief reports on the issues facing them and timelines for dealing with them. Major curriculum or program changes must be approved by the faculty, per the voting procedure described in the Bylaws. A typical example of a major change would be a significant refocus of the content of a required course or a realignment of the curriculum grid. It is believed that the operation of the AME department along these lines is the norm with respect to our academic peers.

2.2.2. External Constituencies

Employers of our Graduates: One of the principal avenues for communication is the Industrial Advisory Councils of the College of Engineering and of the Department; these are two different committees, each meeting at least twice per year. The College meetings are chaired by the Dean; broad issues relating to departmental reorganization, future research thrusts, the needs of industry, and the like are discussed there. These meetings are attended by the Department Heads and the Associate Deans. Members of this committee are, for the most part, from industry, although one or two are from academia (Attachment 2.2.c). The agenda for the last meeting, held during Fall 2003, is given in Attachment 2.2.d. The Industrial Advisory Council (IAC) for the Department had been composed of 11 members (representing 9 companies), appointed by the Department Head. The Department Head, Associate Head, and selected faculty members attend these meetings (depending on the topics discussed). The principal administrative assignment of the former Associate Head (Erdogan Madenci) was the development of the Industrial Partnership Program and communication with our important industry constituency. The Associate Head chaired the meetings, which had been held at least twice per year. Dr. Madenci’s success since the last ABET review (1998-99) can be measured by the development of a ProE Laboratory course, by CNC machining capability (which benefited not only the curriculum but also the entire department through state-of-the-art utilization of the AME machine shop), and by raising in excess of $30K from our industrial partners; this is significant in view of the small AME resources. A list of corporate members as of May 2001 and their affiliations is given in Attachment 2.2.e. The agenda from a meeting is given in Attachment 2.2.f.
Feedback from the Academic Program Review report (December 2001) recommended: “A strengthening of the Industry Advisory Council and an expansion of its role in support of the department can be of great benefit.” It was nearly a year after the APR report was issued that the present Department Head arrived on campus. At that time unprecedented budget cuts were implemented. Since then a great deal of time has been focused on dealing with issues related to those cuts and other recommendations provided by the Academic Program Review report. The feedback included the recommendation that the AME department take advantage of linkages and move in new directions: “The opportunity provided by these inter-departmental and inter-college activities can be a key in expanding this participation in research to levels comparable to other departments at the College of Engineering and Mines, and other research universities.” These opportunities have been pursued aggressively as the new Department Head has spent considerable time meeting people at the University of Arizona, throughout Arizona, and nationally
with the purpose of creating new opportunities for AME faculty and students. These contacts have led to the creation of a new AME Advisory Board (see Attachment 2.2.e for membership details). The members of this board form a prominent and diverse body representing industry, academia, and alumni. The new AME Advisory Board will meet during the summer or early fall of 2004. This new board has an expanded role—beyond that of industrial relations alone. Among other things, it will be instrumental in working with the AME department on the creation of a shared vision (this addresses another recommendation of the Academic Program Review report) and will provide multi-dimensional feedback for program assessment. Although the creation of this board has taken longer than expected, informal contact with five of the eleven previous IAC members has been maintained, as well as contact with numerous industry representatives other than those on the previous IAC. Specific details describing the nature of these interactions are included in Section B.3.2.7 (assessment section). During this past year we also conducted an additional industry survey as a means of seeking further assessment feedback from industry during the transition to the new board. Specific details describing the feedback obtained are included in Section B.3.2.8.

Alumni: The AME Department receives input from its alumni in a number of ways. These include conducting formal alumni surveys, inviting alumni to serve as judges of capstone design projects each semester, and informal visits by alumni to campus. We have regular contact with those who work in local companies (e.g., Advanced Ceramics Research, Ventana Medical Systems, Raytheon, Sargent Controls, etc.).
We depend on these dedicated alumni to participate in the capstone design projects by supporting and supervising design-team activities, serving as judges at design reviews, and by providing formal and informal feedback about our graduates, the needs of corporations, and curriculum improvements. Some of them have served on the Industrial Advisory Council. They are also represented on the new AME Advisory Board.


3. Program Outcomes and Assessment

3.1. Program Outcomes

Table 3.1.a describes the mapping of the University of Arizona Aerospace Engineering program Educational Objectives into the Learning Outcomes.

Table 3.1.a Mapping of educational objectives into learning outcomes.

LEARNING OUTCOMES (table columns):
1. Can integrate knowledge of mathematics, science, and engineering to model and analyze problems.
2. Can use state-of-the-art resources to solve engineering problems.
3. Can apply engineering knowledge to design and build processes and systems.
4. Can plan experiments, analyze data, and interpret results.
5. Can communicate effectively (oral and written).
6. Can function in multidisciplinary teams.
7. Can exercise professional, ethical, and social responsibilities, and engage in life-long learning.

EDUCATIONAL OBJECTIVES (table rows):
1. Prepare a diverse student body for a professional career in Aerospace Engineering.
2. Prepare students to enter graduate school in Aerospace Engineering and closely related fields.
3. Provide a broad educational background and analytical problem-solving skills for successful transition to careers in other fields.
4. Develop skills for clear communication and responsible teamwork, foster professional and ethical attitudes and a sense of social responsibility, and instill a passion for life-long learning.

(In the original table, checkmarks indicate which learning outcomes support each educational objective; nearly every objective is supported by most of the outcomes.)

In a similar spirit, Table 3.1.b presents the mapping of ABET Criteria 3 into the learning outcomes. The outcomes are well correlated with these criteria, as seen from the wide swath of checkmarks along the main diagonal of the table; thus the learning outcomes are entirely consistent with those suggested by ABET. The AME Department is responding to and supporting the accreditation Criteria 3 set by ABET and its constituents. The learning outcomes are addressed by a demanding curriculum (128 units), solidly based in the engineering sciences and engineering design, and by an excellent faculty of uncompromising standards and dedication. Some of the principal features of the curriculum are listed below.


Table 3.1.b The relationship between ABET Criteria 3 and learning outcomes.

ABET CRITERIA 3 (General) (table rows):
a) Apply knowledge of math, science, and engineering
b) Design experiments and analyze data
c) Design a system or process to meet a need
d) Function on multi-discipline teams
e) Identify, formulate, and solve problems
f) Understand professional and ethical responsibilities
g) Communicate effectively
h) Be broadly educated
i) Recognize need for continuing education
j) Demonstrate awareness of contemporary societal issues
k) Use modern engineering tools

LEARNING OUTCOMES (table columns): the seven learning outcomes listed in Table 3.1.a.

(In the original table, checkmarks indicate which learning outcomes address each criterion; the checkmarks cluster along the main diagonal.)

 Required General Education component, set by the University (http://catalog.arizona.edu/2003-04/gened.html), emphasizing breadth of knowledge of society and human behavior, including race, gender and ethnicity, and achievement:
Tier 1 Individuals and Societies: 6 units
Tier 2 Individuals and Societies: 3 units
Tier 1 Traditions and Cultures: 6 units
Tier 2 Arts or Humanities: 3 units

 Required fundamental foundation courses in mathematics, physics and chemistry, including courses on advanced engineering analysis and numerical methods:*
MATH 125, 129, 223, and 254 (13 units)
PHYS 141 and 241 (8 units)
CHEM 103a/104a and 103b (7 units)
AME 301 and 302 (7 units)
* See Table 3.1.c for course titles.

 Required basic engineering courses that emphasize the fundamentals of engineering science and engineering design, including courses in civil engineering (CE), electrical and computer engineering (ECE), and material science and engineering (MSE):
ENGR 102 (3 units)
CE 214 (3 units)
ECE 207 (3 units)
AME 230, 250, 324a, and 331 (12 units)
MSE 331r/l (4 units)

 Advanced engineering courses in the AE discipline, emphasizing aerodynamics, gasdynamics, propulsion, control systems, airplane performance, airplane dynamics and control, space systems, and their associated designs:
AME 320, 321, 323, 324b, 420, 424, 425, 427, 422 or 428, 455, 463 (33 units)

 Laboratory courses in which students learn to design experiments, use modern instrumentation, analyze data and interpret results:
AME 300 and 401 (4 units)

 Approved technical electives, chosen in consultation with an academic and/or faculty advisor, with a clear engineering focus:
6 units

 Multidisciplinary and open-ended design projects, often relevant to and sponsored by industry, subject to realistic constraints:
AME 420 and 422 (6 units) and Independent Study (AME 299, 399, 499)

 Communication-intensive courses in which oral and written presentation skills are developed and
practiced:

Engl 101 and 102 (6 units) and AME 401, 420 and 422 (7 units)

Table 3.1.c maps the required AE courses into the learning outcomes, using a relevance score determined by the faculty teaching the courses. The preponderance of H’s (high) and M’s (medium) reinforces the fact that the courses are well structured in support of the program outcomes. Detailed course syllabi (for both required and elective courses) are given in Appendix I.B, and the sequence of required courses is given in Appendix I.A, Table I.1. The recommended timetable for taking the courses is given in Table 3.1.d.

3.2. Assessment Process

The Department of Aerospace and Mechanical Engineering has established several specific and regularly scheduled assessment cycles for measuring outcomes, reviewing its educational objectives and outcomes, and identifying the needs of its constituencies. The assessment has “internal” and “external” components, which are evident from the context of the descriptions below. Figure 3.2.a is a flowchart of the AME continuous improvement process, describing three basic assessment cycles. As described below, with one exception (the Undergraduate Advisory Committee), all elements of this flowchart have been used throughout the past six years, and the overall structure, flow of information, and functional elements were in place and operational. It is only within the past year, however, that this structure has been formally identified and used to further refine the continuous improvement process within the department. This has been a valuable result of the department’s commitment to improvement inherent in the EC2000 process.


Table 3.1.c AE courses and learning outcomes.*

The table rates the relevance of each course to each of the seven learning outcomes listed in Table 3.1.a, using the scale given in the footnote.

Required Courses:
AME 230, Thermodynamics
AME 250, Dynamics
AME 300, Instrumentation Laboratory
AME 301, Engineering Analysis
AME 302, Numerical Methods
AME 320, Aerodynamics
AME 321, Aircraft Performance
AME 323, Gasdynamics
AME 324a, Mechanical Behavior of Engineering Materials
AME 324b, Engineering Component Design
AME 331, Introduction to Fluid Mechanics
AME 401, Senior Aerospace Laboratory
AME 420, Aircraft Conceptual Design
AME 422, Aerospace Engineering Design
AME 424, Introduction to Space Technologies
AME 425, Aerospace Propulsion
AME 427, Stability and Control of Aerospace Vehicles
AME 428, Space Mission Conceptual Design
AME 455, Control System Design
AME 463, Finite Element Analysis with ANSYS
AME 495s, Senior Colloquium
CE 214, Statics
CHEM 103a, Fundamental Techniques of Chemistry
CHEM 104a, Fundamental Techniques of Chemistry Laboratory
CHEM 103b, Fundamentals of Chemistry, or MSE 110, Solid State Chemistry
ECE 207, Elements of Electrical Engineering
ENGL 101, First Year Composition
ENGL 102, First Year Composition
ENGR 102, Problem Solving and Engineering Design
MATH 125, Calculus I
MATH 129, Calculus II
MATH 223, Vector Calculus
MATH 254, Introduction to Ordinary Differential Equations
MSE 331L, Fundamentals of Materials for Engineers
MSE 331R, Engineering Materials Laboratory
PHYS 141, Introductory Mechanics
PHYS 241, Introductory Electricity and Magnetism

Technical Electives:
Technical Elective courses are chosen by the student in consultation with a faculty advisor. At least 3 units must be at the 400 level in AME. English 308 may be taken to satisfy the MCWA requirement, if necessary, or 3 units of AME 499, Independent Study, may be taken to complete the technical elective requirements. Students are strongly encouraged to take at least one course with design content.

*H = high; M = medium; L = low; NA = not applicable.


Table 3.1.d Recommended timetable for completion of courses.*

*Over the past 6-year period, several minor changes have been made: courses have been added to the list of Technical Electives; Math 125a is now Math 125 and Math 125b is now Math 129 (same courses, different numbers); AME 324 and CE 217 have been replaced by AME 324a and AME 324b; AME 461 has been replaced by AME 463; and HSS Electives are now referred to as Tier 1 and Tier 2 courses.
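As an informal cross-check of the curriculum features enumerated in Section 3.1, the non-overlapping unit counts can be tallied. This is a sketch under our own grouping, not the official degree audit: the category labels are ours, and the design-project and communication-intensive bullets are excluded (apart from ENGL 101/102) because their AME courses are already counted in other categories.

```python
# Informal tally of the unit counts enumerated in Section 3.1.
# Category labels are ours; AME 401, 420, and 422 appear only once
# (inside the laboratory and advanced-AE categories) to avoid double counting.
units = {
    "General Education (Tier 1 and Tier 2)": 6 + 3 + 6 + 3,             # 18
    "Math, physics, chemistry, analysis/numerics": 13 + 8 + 7 + 7,      # 35
    "Basic engineering (ENGR, CE, ECE, AME, MSE)": 3 + 3 + 3 + 12 + 4,  # 25
    "Advanced AE courses": 33,
    "Laboratory courses (AME 300, 401)": 4,
    "Technical electives": 6,
    "English composition (ENGL 101, 102)": 6,
}
total = sum(units.values())
print(total)  # 127; the remaining unit of the 128-unit total presumably
              # comes from the senior colloquium (AME 495s)
```

The tally reaches 127 of the 128 required units; we assume the 1-unit difference is the Senior Colloquium, which is not assigned a unit count in the text above.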


Figure 3.2.a. Flowchart of the AME continuous improvement process.

The most frequent cycle (bottom of Figure 3.2.a) occurs every semester, and every course is involved. Students are assessed using such vehicles as exams, projects, and written and oral reporting (see Figure 3.2.a); examples of such student work will be made available during the ABET visit. Faculty members are assessed by the students every semester using a University of Arizona course/faculty evaluation survey, and comprehensive records are available for review during the ABET visit. The results are provided to the faculty for each course they teach. The Department Head also receives the results and uses them as part of each faculty member’s annual evaluation. Students also contact the Department Head directly in some cases. Individual faculty members are also assessed by their peers or by university staff, although this does not occur every semester.

Discipline-specific faculty subgroups may provide feedback to the Undergraduate Studies Committee, or a subgroup may proactively initiate action on one or more courses. While the Undergraduate Studies Committee is typically quite active (meeting once or twice per month), the discipline-specific faculty subgroups are not normally very active at this level of assessment. The Undergraduate Studies Committee brings course and curricular matters to the attention of the entire faculty for review and approval, in principle by way of Faculty Meetings or Annual Retreats; in the recent past Annual Retreats have not been held, so such matters are typically handled in Faculty Meetings. Feedback on the design aspects of the curriculum is obtained each semester in the form of Senior Design project evaluations performed by industry judges.

The next most frequent cycle (middle of Figure 3.2.a) occurs every year. Changes to individual courses approved during the most frequent assessment cycle are carried into the annual curriculum assessment process.
The curriculum is assessed annually using a variety of vehicles targeting various constituencies (see Figure 3.2.a). These include vehicles to solicit input from our students (AME Senior Exit Survey), our alumni (Alumni Survey), and industry (Industrial Advisory Council). These aspects of the assessment process are well established and exercised routinely. The Undergraduate Laboratory Committee has responsibility for the vitality of the undergraduate teaching laboratories and has developed a plan for maintaining and upgrading them. In the past, the data obtained from these vehicles were channeled to the Department Head, who then brought them to the attention of the Undergraduate Studies Committee and/or the entire faculty at Faculty Meetings for discussion and approval.

Two elements of this cycle in the “Other Constituents” column warrant comment: the first represents a transition and the second an addition. First, in response to the most recent Academic Program Review (2001), the role of the former Industrial Advisory Council (IAC) has been broadened by the formation of an AME Advisory Board. While industry is still strongly represented, the new board also includes academics and alumni. Second, a new Undergraduate Advisory Committee will be appointed in the fall semester of 2004. This committee will be composed of undergraduates who represent the diverse activities and interests of all four years of the student body. It is expected to meet with the Department Head and the Advisory Committee (faculty representatives and Associate Department Heads) at least once per year; meetings with other constituents (Undergraduate Studies Committee, Faculty Meetings, etc.) are likely in order to address particular objectives.
The committee will strengthen the assessment process by providing a forum for ongoing discussion with students that complements the Senior Exit Survey and the course evaluation input. In particular, it will enable “continuous” discussion between a wider spectrum of students (not just seniors) and the faculty on a wide spectrum of issues, not limited to individual course evaluations or items specified on the Senior Exit Survey.

The least frequent cycle (top of Figure 3.2.a) occurs every three years (every seven years for Academic Program Review). During the last two of these three-year cycles (the period of this ABET review), the elements of the assessment process were as described in Figure 3.2.a. Curriculum changes approved during the annual curriculum assessment processes are carried into this cycle of assessment, and data from Performance Evaluation surveys (every 3 years), which provide input from employers concerning the performance of our alumni, are included. In the past, an ABET Committee was formed one to two years before an ABET visit. This committee worked with the Undergraduate Studies Committee, discipline-specific faculty subgroups (as necessary), the Industrial Advisory Council, and the most recent Academic Program Review report to develop recommendations for


program changes. These changes were discussed and approved by the faculty. This process has led to modifications of the Learning Outcomes and Educational Objectives of the department.

In conclusion, the assessment process described in Figure 3.2.a has been in place and functional during the past six years, although aspects of it continue to evolve, as would be expected in a process of continuous improvement. In the past, the Undergraduate Studies Committee has played a key role in course and curriculum assessment, particularly since this committee is continuously involved in undergraduate curriculum matters throughout the year. However, this committee has not been explicitly defined as the focal faculty committee overseeing the continuous assessment process within the department. Beginning in the fall semester of 2004, several key changes will be made to strengthen the continuous improvement process further. The Undergraduate Studies Committee will be charged as the focal faculty committee overseeing the assessment process within the department, and a member of the committee will be designated as the coordinator of assessment activities. The committee will be asked to conduct its business so as to explicitly optimize the impact of its activities on the Educational Objectives and Learning Outcomes of the department. All data obtained from the range of assessment vehicles will be directed to the Undergraduate Studies Committee, which will be asked to review and interpret the data and bring recommendations to the faculty for consideration at a faculty meeting at least once each year.

The principal instruments (both quantitative and qualitative) used in assessment are described below, along with an assessment process matrix that maps the assessment tools to the Learning Outcomes. The instruments are:

 Senior Exit Survey: conducted once each year.
 AME Letter of Solicitation: every three years; requests input from alumni.
 Alumni Survey: conducted by the College of Engineering every year.
 Evaluation of Senior Design Projects by Judges from Industry: every semester.
 Student Course/Instructor Evaluations: conducted every semester in every course according to University policy.
 Fundamentals of Engineering Examination: periodic tabulation of the results.
 Review and Assessment by Industrial Advisory Council (and new AME Advisory Board): annually, and as needed in response to particular issues.
 Performance Assessment from Industry: every three years; provides information on the performance of our graduates.
 Faculty (Undergraduate Studies Committee) Assessment of Curriculum: conducted periodically as needed, but parts of the curriculum are reviewed every year (see Attachment 3.2.a for minutes from a recent committee meeting).
 Academic Program Review: at least once every seven years, mandated by the University; the report is transmitted to the Arizona Board of Regents.
 Job Placement Data (Academic Services, College of Engineering): data are collected from pre-commencement programs, departments, and faculty, from employers who advertise through Academic Services, and from the CareersEng listserv survey conducted by Academic Services; the data are examined and interpreted periodically.


Table 3.2.a presents the mapping of the assessment tools into the learning outcomes; the assessment tools completely cover all the learning outcomes. Table 3.2.b summarizes the assessment of the program outcomes via each of the mechanisms above, together with the relative importance of each instrument (H = high, M = medium, L = low) in support of assessment, as defined by the AME ABET Committee. The instruments are described in the following, and interpretations of the data are given in the referenced sections. The sample size (n) is also given for each instrument.

Senior Exit Survey: In the AE program, this survey has been administered every other semester in the capstone design class. Starting Fall 2004, it will be administered every semester in the Senior Colloquium (AME 495s). A sample survey is given in Attachment 3.2.b. It is our belief that this survey is a very valuable assessment tool. Student experiences with faculty, other personnel, and the infrastructure are especially important for closing the assessment feedback loop. These data are made available to the Undergraduate Studies Committee and the Department Head for their actions to improve the program.

Table 3.2.a Assessment process matrix: assessment tools to learning outcomes.

LEARNING OUTCOMES (table columns): the seven learning outcomes listed in Table 3.1.a, with outcome 7 worded here as “Actions guided by professional and ethical attitudes, social responsibility, and life-long learning.”

MEASUREMENT TOOLS USED TO ASSESS OUTCOMES (table rows):
Senior Exit Survey (every semester)*
AME Letter of Solicitation (every 3 years)
Alumni Survey (every year)
Evaluation of Senior Design Projects by Judges from Industry (every semester)
Student Course/Instructor Evaluations (every semester)
Fundamentals of Engineering Examination (periodically)
Review and Assessment by Industry Advisory Council (annually)
Performance Assessment from Industry (every 3 years)
Faculty (Undergraduate Studies Committee) Assessment of Curriculum (continuously—annual review)
Academic Program Review (every 7 years)
Job Placement Data (annually)

*Anecdotal information about educational experience.

(In the original matrix, X marks indicate which learning outcomes each tool assesses; collectively the tools cover all seven outcomes.)


Table 3.2.b Relative importance of assessment instruments.

Senior Exit Survey (every semester): N = 85, importance H
AME Letter of Solicitation (every 3 years): N = 64, importance M
Alumni Survey (every year): N = 97, importance M
Evaluation of Senior Design Projects by Judges from Industry (every semester): N = 8*, importance H
Student Course/Instructor Evaluations (every semester): N = 7789, importance M
Fundamentals of Engineering Examination (periodically): N = 33, importance M
Review and Assessment by Industrial Advisory Committee (annually): N = 9, importance M
Performance Assessment from Industry (every 3 years): N = 7, importance H
Faculty (Undergraduate Studies Committee) Assessment of Curriculum (continuously—annual review): N = na, importance H
Academic Program Review (every 7 years): N = na, importance H
Job Placement Data (annually): N = 34, importance M

*8 projects/12 judges.

The Senior Exit Survey has three parts. The first part rates the overall quality of the facilities and of the faculty, teaching assistants, and staff; two additional activities, advising and the engineering design experience, are also rated. A summary of the data for the past 6 years is presented in Section B.3.2.1.1. The second part of the survey focuses directly on ABET Criteria 3(a)-(k); students rate the program on a scale from 7 (extremely well) to 1 (not at all), with NA = not applicable also available, for each of the criteria. A summary of the data for the past 6 years is presented in Section B.3.2.1.2. In the third and final part, students are asked to select the three ABET criteria that they perceive will be most important in their careers. A summary of the data for the past 6 years is presented in Section B.3.2.1.3.

AME Letter of Solicitation: AME alumni were contacted in 2001 (~4000 letters) to request feedback regarding their educational experience and their current professional or educational activities (64 responded). Although their written comments are anecdotal in nature, they were shared with the IAC, and they formed the basis for changes in the curriculum when approved by the Undergraduate Studies Committee. A sample letter is given in Attachment 3.2.c, and a discussion of the feedback received from the alumni is presented in Section 3.2.2. The actual comments are also included in Attachment 3.2.c.

Alumni Survey: These surveys are conducted by the College of Engineering every year (individual alumni are contacted every other year). The surveys are directed at alumni with three and five years of experience following graduation. A sample survey is given in Attachment 3.2.d. The data from the 2003 survey are presented in Section B.3.2.3.
The survey not only provides a perspective on the students’ educational experience; its first page also yields important information on their professional accomplishments and career development activities. The alumni survey asks how well the department has prepared them for industrial settings: multidisciplinary teamwork, communication skills, lifelong learning, ethical responsibilities, and problem and design formulation. This feedback is compiled and tabulated, and the Department Head and the members of the Undergraduate Studies Committee review and discuss the results. The College also makes use of this survey in the College ABET Committee and the College Undergraduate Studies Committee to identify patterns among departments and variations among alumni perceptions and input.


Evaluation of Senior Design Projects by Judges from Industry: In their final year, undergraduate students are required to take a two-course capstone design sequence: AME 420, followed by AME 422 or 428. Many of the student teams interact directly with local companies, and design and deliver a product at the end of the year. Representatives from industry act as judges as part of the final evaluation process. Since Spring 1999, their input has been gathered through an evaluation form addressing difficulty of design, creativity, quality of design and hardware, level of analysis, and effectiveness of presentation. A sample of the Senior Design Project evaluation form is presented as Attachment 3.2.e, and the responses are discussed in Section B.3.2.4.

Student Course/Instructor Evaluations: These are conducted via standard questionnaires available from the University’s office of Assessment and Enrollment Research. The mandatory survey is filled out by students near the end of each semester for each course, and the information is processed by the University. Options are available for modifying the survey to meet the specific educational objectives of a course; most commonly, however, the short form is used. A sample (short) form is given in Attachment 3.2.f. The information provided by the course evaluations is used in two ways. First, the numerical scores provide important comparative information regarding course content, method of delivery, adequacy of the course materials, etc., which are to be carefully examined by the professor in the semester following the course offering. These data are also useful for evaluating each faculty member teaching a given course. Second, the written comments provided by students offer important feedback on method and style of delivery, adequacy of materials, etc., which cannot be determined solely from the tabulated numerical scores.
Admittedly, these written comments are anecdotal in nature and are carefully screened to isolate extreme views (in either direction). For this reason, such information is useful to the faculty member in a formative sense, but it is not appropriate for formal evaluation of the faculty member’s adequacy as a teacher. The instructors and the Department Head receive computerized interpretations of the results on various metrics of teaching effectiveness. A summary of the data for the past 5½ years is given in Section B.3.2.5.

Fundamentals of Engineering Examination (FE): Some of our graduates take this examination in anticipation of PE certification in later years. The FE examination is not a graduation requirement, but it is a nationally normed examination that provides useful information on both the quality of the program and the students who take it. The database is small but favorable; because there is no specific examination for the aerospace discipline, quantitative feedback is limited to the common engineering disciplines. The data are presented in Section B.3.2.6.

Review and Assessment by Industrial Advisory Council (and new AME Advisory Board): The Industrial Advisory Council (IAC), consisting of members from national and local companies, provided a vehicle for industry input to the periodic evaluation of program educational objectives and to the efforts to continuously improve the program. The IAC met twice a year on a regular basis to discuss the educational objectives of the department, the curriculum, and strategies to improve the department. When an important issue arose, a subcommittee was formed to examine it and recommend a solution to the IAC; such recommendations were forwarded to the Department Head and the Undergraduate Studies Committee. The new AME Advisory Board will perform the same functions as the previous IAC, conducting itself in a similar but broader manner.
It will provide program review from the industry perspective, as well as program review in a broader context, and it will meet annually (more often if necessary).

Performance Assessment from Industry: Another measure of the success of the Aerospace Engineering Program is the on-the-job performance of graduates working in industry. Representative employers of our graduates are asked to fill out a performance survey every three years. The survey specifically requests information on our graduates in the areas of technical ability, communication and professional growth, and eagerness to engage in life-long learning. Both the Department Head and the Undergraduate


Studies Committee review the feedback. A copy of the Performance Assessment Survey is presented as Attachment 3.2.g. Results of the 2001 and 2004 surveys are presented in Section B.3.2.8.

Faculty (Undergraduate Studies Committee) Assessment of Curriculum: It is believed that the faculty have the clearest insights into the program. Curriculum (i.e., program) issues are discussed by the Undergraduate Studies Committee, and recommendations are brought to the faculty for discussion at faculty meetings; examples include the introduction of MATLAB in AME 302 and the addition of Finite Element Analysis with ANSYS to the curriculum. With respect to assessment at the course level, it is the faculty, via their day-to-day contact with students, who have the best understanding of how well the learning outcomes are met. This assessment is done in conventionally accepted ways, using prerequisite quizzes, examinations, projects, and homework. Information on the first of these (prerequisite quizzes) is provided to the Undergraduate Studies Committee and the Department Head; the use of the prerequisite quiz is not universal, while the other instruments are standard. The University also provides grade statistics for each course at semester’s end for review by the Department Head, who can request input or action from the Undergraduate Studies Committee if he or she feels that the program outcomes are not being met. A future objective is to construct a short survey that instructors can use in each course to provide an overall assessment of how well the students met the applicable program outcomes; grades alone can be misleading because of differing levels of course difficulty and instructor standards (see Section B.3.2.9).

Academic Program Review: The review consists of a self-study report and the report/recommendations of the review committee. The last review was carried out during the Spring of 2001.
The committee consisted of three distinguished academicians (NAE members), members from industry, alumni, and a College representative (seven total). The final report of the committee is available upon request; the self-study report is available from the Department. This is a comprehensive review of the department, including the graduate program and research activities. The report was very favorable; it was submitted to the Provost for his action. See Section B.3.2.10 for a discussion.

Job Placement Data: Data are collected from pre-commencement programs, departments, and faculty, from employers who advertise through Academic Services, and from the CareersEng listserv survey conducted by Academic Services. Table 3.2.c presents employment data for students graduating from August 1997 through May 2002. The sampling size for AE is 34 students.

Table 3.2.c Number of degreed AME students who voluntarily reported employment (undergraduate students and those authorized to work in US).a

Class Year   Major   Degrees Awarded   Employment Reported   % Feedback
2002         AE      21                8                     52
             ME      75                25                    40
2001         AE      19                6                     47
             ME      57                22                    54
2000         AE      17                4                     47
             ME      74                43                    66
1999         AE      12                6                     50
             ME      73                41                    58
1998         AE      18                10                    72
             ME      86                55                    84

a Class year includes Aug. and Dec. of the preceding year and May of the year given. Employment reported does not include graduate school, returning to home country, etc.; feedback does.

3.2.1. Senior Exit Survey

A copy of the Department of Aerospace and Mechanical Engineering’s Senior Exit Survey for students near graduation* is provided as Attachment 3.2.b. Data for the AE major are available for Fall 1998, Fall 1999, Spring 2000, Spring 2001, Spring 2003, and Spring 2004, with a total sample size of n = 85. Data are not available for Spring 2002. Analyses were performed and are reported in the following sections along with some interpretation.

3.2.1.1. Overall Educational Experience

A complete set of responses to the question “Please rate your overall educational experience in AME” for eight groupings is provided as Attachment 3.2.h for the period Fall 1998-Spring 2004. A summary of the statistics in the form of a bar chart is provided in Figure 3.2.b. The questions concentrated on the Engineering Design Experience, Computer Labs (hardware and software), Physical Labs, AME Faculty, AME Teaching Assistants, AME Office Staff, AME Shop Staff, and AME Advising.

General Comments: The responses indicate that the majority of students rate their educational experience as “good” or “excellent” in every category. Indeed, 75% to 82% of students considered Design, Computer Labs, Faculty, Office Staff, and Machine Shop Staff as “good” or “excellent.” Approximately 70% rate Advising and Physical Labs as “good” or “excellent,” and approximately 60% rate TAs as “good” or “excellent.” Another way to identify areas where improvement can be made is to examine areas rated as “poor” by students. Overall, only about 5% of students rated the overall program as “poor.” The area with the highest “poor” rating (about 11%) is Computer Labs; this is balanced by a larger percentage of students who consider it “excellent.” The areas of Advising and TAs warrant attention on the basis of “poor” ratings.

Figure 3.2.b. Ratings of overall educational experience in Aerospace Engineering (totals for Fall 1998, Fall 1999, Spring 2000, Spring 2001, Spring 2003, and Spring 2004). Sample size n = 84.

* Starting Fall 2004, the survey will be handed out in the Senior Colloquium (AME 495s), where both Aerospace and Mechanical students will be surveyed each semester. The results identify strengths and weaknesses in the program. They have also been used to determine awards to faculty and staff for their contributions to the educational mission of the department.


3.2.1.2. How the Program Has Taught You

A complete set of the ratings of Criteria a-k is provided as Attachment 3.2.i. A summary of the statistics in the form of a bar chart is provided in Figure 3.2.c. The ABET criteria are:

a. Apply mathematics, science, and engineering principles
b. Design and conduct experiments and interpret data
c. Design a system, component, or process to meet desired needs
d. Function on multidisciplinary teams
e. Identify, formulate, and solve engineering problems
f. Understand professional and ethical responsibility
g. Communicate effectively
h. Understand the impact of engineering solutions in a global context
i. Recognize the need for, and the ability to engage in, life-long learning
j. Know contemporary issues
k. Use the techniques, skills, and modern engineering tools necessary for engineering practice

General Comments: Most of the responses are at or above 4.7 (out of 7), which suggests that the students perceive the program performance as being good. The qualitative aspects of the program, such as impact of engineering in a global context (h), life-long learning (i), and contemporary issues (j), could be improved, though the rating is still strong (around 4.5 on average). There are no other indications suggesting that serious remedial action is necessary at this time.

3.2.1.3. The Three Most Important Criteria

Students were asked to identify three criteria, from a-k, that they considered would be the most important in their careers. The complete set of responses is provided as Attachment 3.2.j. A summary of the statistics in the form of a bar chart is provided in Figure 3.2.d. General Comments: Those items that students judged to be most important to their success, in order of importance, are: (g) communicate effectively; (d) function on multidisciplinary teams; and (a) apply mathematics, science and engineering principles. The evaluations for these three items are shown in Figure 3.2.c and are in the relatively high range, 4.6-5.4.

(Scale for Figure 3.2.c: 7 = extremely well; 1 = not at all; NA = not applicable.)

Figure 3.2.c. Evaluations of how aerospace students were taught (averages for Fall 1998, Fall 1999, Spring 2000, Spring 2001, Spring 2003, and Spring 2004).



Figure 3.2.d. Criteria chosen as most important by aerospace students (averages for Fall 1998, Fall 1999, Spring 2000, Spring 2001, Spring 2003, and Spring 2004).

3.2.2. AME Letter of Solicitation

The majority of our alumni practice in the engineering profession, and their overall satisfaction with the engineering education they received is, in general, rather positive. Their primary suggestion was to add more application-oriented courses providing hands-on experience in the physical or computer laboratories. The majority of them mentioned their senior design projects as one of their most memorable experiences and as good preparation for the real world. As to their responsibilities and achievements as engineers, they cover a large spectrum of functions, varying from entry-level to management-level positions. The majority of those who responded to our solicitation were generous with their financial contributions to AME; although not a formal measure, this too is indicative of their satisfaction.

3.2.3. Alumni Survey (by College of Engineering—COE*)

Aerospace Engineering B.S. graduates with up to five years of experience were sent a letter and asked to complete a comprehensive survey relating to the performance of the AME program. A copy of the survey forms is provided in Attachment 3.2.d. A total of 95 AME alumni responded; these include both Aerospace and Mechanical Engineering graduates (we have not separated them because the number of Aerospace respondents is small). Analyses were performed and are reported in the following sections, along with some interpretation.

3.2.3.1. How Satisfied Were You With Your Education? (program performance relative to Criterion 3; histograms of responses)

Histograms of the numerical scores for the responses to the specific question “How satisfied were you with your education in the COEM* at The University of Arizona in helping your ability to . . .” are given in Figure 3.2.e. A summary of the statistics for 2001-2003 is provided in Attachment 3.2.k, including the verbal comments.

General Comments: The mean of the responses is comfortably above the mid-range value of 3.0 for all items, which suggests that the performance of the program is strong across the board, especially in the application of mathematics and physics. It is concluded that no remedial action is necessary at this time.

* Formerly the College of Engineering and Mines (COEM).

3.2.3.2. How Did the Program Enhance Your Abilities

Histograms for the question “To what degree did your engineering education enhance your ability to . . . ?” are given in Figure 3.2.f. A summary of the statistics for 2001-2003 is provided in Attachment 3.2.k.

General Comments: The mean of the responses is comfortably above the mid-range value of 3.0 for all items, which suggests that the performance of the program is strong for all items. Some weakness, as in the AME Exit Survey, is detected regarding “impact of engineering solutions in a global context.” The program is rated highly in the analysis and interpretation of data, in the formulation and solution of engineering problems, and in communication and design.

3.2.3.3. Design Experience

Histograms of the numerical scores for the responses to the specific question “To what degree did your design experience at the University . . .” are given in Figure 3.2.g. A summary of the statistics for 2001-2003 is provided in Attachment 3.2.k.

General Comments: The mean of the responses is comfortably above the mid-range value of 3.0 for the technical items, which suggests that the performance of the program is reasonable. Some weakness is detected on “economic,” “environmental,” “safety,” and “socio-political” issues. This is a valid observation by the students; we do not address these very broad issues well in our technical courses. We look to the new AME Advisory Board for suggestions, possibly speakers from industry who would highlight some of these issues using case studies. A similar strategy will be used to address the impact of engineering solutions in a global context (see Section B.3.2.3.2).

3.2.3.4. Laboratory Experience

Histograms of the numerical scores for the responses to the specific question “To what degree did laboratory experiences at the University . . .” are given in Figure 3.2.h. A summary of the statistics for 2001-2003 is provided in Attachment 3.2.k.

General Comments: The mean of the responses is comfortably above the mid-range value of 3.0 for all items, which suggests that the performance of the program is strong. The laboratories correlate well with the lecture courses.


[Figure 3.2.e panels. Apply Mathematics: mean 4.04, SD 1.00, n = 95; Apply Physics: mean 4.13, SD 0.92, n = 95; Apply Chemistry: mean 3.18, SD 0.95, n = 95; Understand Contemporary Issues: mean 3.44, SD 1.06, n = 95.]

Figure 3.2.e Histograms for the question “How satisfied were you with your education in the COEM at The University of Arizona in helping your ability to . . . ?” (5 = high, 3 = medium, 1 = low).
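Each histogram in the alumni-survey figures is summarized by an average, a standard deviation, and a count of applicable responses. These statistics can be recomputed directly from the per-score response counts; a minimal sketch in Python, using hypothetical counts for the 1-5 scale (the actual per-score counts appear only in the attachments):

```python
# Recompute summary statistics from Likert-scale histogram counts.
# The counts below are hypothetical; only the "Average / Standard
# Deviation / Applicable Responses" output format mirrors the report.
import math

def summarize(counts):
    """counts maps score (1-5) to the number of responses at that score."""
    n = sum(counts.values())
    mean = sum(score * c for score, c in counts.items()) / n
    var = sum(c * (score - mean) ** 2 for score, c in counts.items()) / n
    return round(mean, 2), round(math.sqrt(var), 2), n

# Hypothetical distribution for a question such as "Apply Mathematics"
counts = {5: 35, 4: 39, 3: 13, 2: 6, 1: 2}
mean, sd, n = summarize(counts)
```

Whether the report used the population or sample form of the standard deviation is not stated; the sketch uses the population form.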


[Figure 3.2.f panels. Analyze and Interpret Data: mean 4.28, SD 1.11, n = 94; Design Experiments: mean 3.69, SD 1.21, n = 94; Conduct Experiments: mean 3.78, SD 1.24, n = 93; Function on Multidisciplinary Teams: mean 3.92, SD 1.29, n = 95.]

Figure 3.2.f Histograms for the question “To what degree did your engineering education enhance your ability to . . . ?” (5 = high, 3 = medium, 1 = low).


[Figure 3.2.f—Continued panels. Formulate Engineering Problems: mean 4.06, SD 1.12, n = 94; Solve Engineering Problems: mean 4.20, SD 1.05, n = 94; Understand Ethical Responsibilities: mean 3.78, SD 1.12, n = 95; Understand Impact of Eng. Solutions: mean 3.31, SD 1.30, n = 93.]

[Figure 3.2.f—Continued panels. Communicate via Oral Reports: mean 4.15, SD 1.18, n = 94; Communicate via Written Reports: mean 4.10, SD 1.15, n = 94; Need for Life-Long Learning: mean 4.01, SD 1.29, n = 93; Design to Meet a Need: mean 4.25, SD 0.91, n = 92.]


[Figure 3.2.g panels. Build on Knowledge: mean 3.89, SD 1.20, n = 92; Incorporate Engineering Standards: mean 3.35, SD 1.24, n = 94; Address Economic Issues: mean 2.90, SD 1.19, n = 93; Address Environmental Issues: mean 2.73, SD 1.13, n = 94.]

Figure 3.2.g Histograms for the question “To what degree did your design experience at the university . . . ?” (5 = high, 3 = medium, 1 = low).


[Figure 3.2.g—Continued panels. Address Health and Safety Issues: mean 2.74, SD 1.17, n = 95; Address Socio/Political Issues: mean 2.52, SD 1.13, n = 94; Use Techniques, Skills, and Tools: mean 3.74, SD 1.26, n = 93.]

[Figure 3.2.h panels. Correlate with Lecture Courses: mean 3.82, SD 1.13, n = 93; Learn to Use Modern Tools: mean 3.44, SD 1.30, n = 93; Enhance Basic Understanding: mean 3.72, SD 1.20, n = 93.]

Figure 3.2.h Histograms for the question “To what degree did laboratory experiences at the university . . . ?” (5 = high, 3 = medium, 1 = low).

3.2.3.5. Career Needs

A histogram of the numerical scores for the responses to the question “Did your college experience meet your career needs?” is given in Figure 3.2.i. A summary of the statistics for 2001-2003 is provided in Attachment 3.2.k.

General Comments: The mean of the responses is comfortably above the mid-range value of 3.0, which suggests that the overall performance of the program is strong. It is possible to conclude that AME students are very satisfied with their educational experience (see also Section B.3.2.3.6).


[Figure 3.2.i panel. College Met Career Needs: mean 4.02, SD 1.46, n = 89.]

Figure 3.2.i Histogram for the question “Did your college experience meet your career needs?” (5 = high, 3 = medium, 1 = low).

Students were asked “What would you have changed about your college experience so as to meet your career needs?” and “What have you learned on the job that should have been included in your formal education?” The verbatim comments are tabulated in Attachment 3.2.k.

3.2.3.6. Preparation

Students were asked “Do you feel that at graduation you were adequately prepared for (1) initial career employment and (2) graduate school in your field?” There were a total of 92 and 89 responses, respectively, from alumni. A summary of the statistics for 2001-2003 is provided in Attachment 3.2.k.

General Comments: A large majority of AME students felt that they were adequately prepared for career employment (92%) and graduate school (90%) in their field.

3.2.3.7. Questions Specific to AME on the COEM Survey

Students were asked seven questions specific to the AME department:

1. To what degree did the academic climate in AME encourage or permit you to: participate in research, participate in independent studies, participate in internships, and participate in extra activities (such as ASME or AIAA projects)?
2. How well are you prepared to engage in life-long learning?
3. How important are the general education courses (humanities, etc.) to your education?
4. How important are the general education courses (humanities, etc.) to your career?
5. Rate the academic standards in the AME department.
6. How important would it be to include business, finance, or management courses in the AME curriculum?
7. Please rate your overall educational experience in Aerospace/Mechanical Engineering.

Histograms are given in Figures 3.2.j-p, respectively. A summary of the statistics for 2001-2003 is provided in Attachment 3.2.k. Overall, students were satisfied with the performance of the Department of Aerospace and Mechanical Engineering. It is also clear that students would like to participate more in research, internships, and independent studies and to have access to business courses. Again, the AME Advisory Board should provide input on these issues, although a business minor is currently available. The surveys also suggest that advising could be improved. The AME faculty and staff are rated very good, and the academic standards are rated good.

[Figure 3.2.j panels. Participate in Research: mean 3.34, SD 1.23, n = 97; Participate in Independent Studies: mean 3.32, SD 1.15, n = 97; Participate in Internships: mean 3.33, SD 1.29, n = 97; Participate in Extra Activities: mean 3.68, SD 1.09, n = 97.]

Figure 3.2.j Histograms for the question “To what degree did the academic climate in AME encourage or permit you to . . . ?” (5 = high, 3 = medium, 1 = low).


[Figure 3.2.k panel. Engage in Life-Long Learning: mean 4.16, SD 0.84, n = 97.]

Figure 3.2.k Histogram for the question “How well are you prepared to engage in life-long learning?” (5 = high, 3 = medium, 1 = low).

[Figure 3.2.l panel. Importance of Gen Ed Courses to Education: mean 3.28, SD 1.14, n = 98.]

Figure 3.2.l Histogram for the question “How important are the general education courses (humanities, etc.) to your education?” (5 = high, 3 = medium, 1 = low).

[Figure 3.2.m panel. Importance of Gen Ed Courses to Career: mean 3.16, SD 1.19, n = 97.]

Figure 3.2.m Histogram for the question “How important are the general education courses (humanities, etc.) to your career?” (5 = high, 3 = medium, 1 = low).


[Figure 3.2.n panel. Academic Standards in AME: mean 3.90, SD 0.87, n = 97.]

Figure 3.2.n Histogram for the statement “Rate the academic standards in the AME department” (5 = high, 3 = medium, 1 = low).

[Figure 3.2.o panel. Business, Finance, or Mgt. Courses: mean 3.88, SD 1.28, n = 97.]

Figure 3.2.o Histogram for the question “How important would it be to include business, finance, or management courses in the AME curriculum?” (5 = high, 3 = medium, 1 = low).


[Figure 3.2.p panels. Engineering Design Experience: mean 3.96, SD 0.70, n = 98; Computer Labs: mean 3.48, SD 0.98, n = 98; Physical Labs: mean 3.61, SD 0.91, n = 97; AME Faculty: mean 4.11, SD 0.89, n = 96.]

Figure 3.2.p Histograms for the question “Please rate your overall education in Aerospace/Mechanical Engineering” (5 = high, 3 = medium, 1 = low).


[Figure 3.2.p—Continued panels. AME Teaching Assistants: mean 3.49, SD 1.05, n = 96; AME Office Staff: mean 3.74, SD 1.06, n = 95; AME Shop Staff: mean 4.08, SD 1.18, n = 95; AME Advising: mean 3.28, SD 1.39, n = 95.]

3.2.4. Evaluation of Senior Design Projects by Judges from Industry

The AE senior (capstone) design projects are evaluated at the end of each year by a panel of judges drawn from various engineering companies, as well as by the faculty and instructors involved in teaching and coordinating the course. The design projects themselves are often sponsored by small and large companies and inventors, as well as by AME or other faculty who may want equipment or components designed and built for their research. The success of the design projects was due, in part, to financial support from industry in “Support of Curriculum Enrichment/Introducing Aerospace Engineering Design Projects into Required Courses.” Some examples of design projects are (sponsors in parentheses):

 Restoration of BEDE-5 Airplane (Pima Air and Space Museum)
 Smallest Micro Air Vehicle (Raytheon)
 Deployment of MAV from RC Plane (Raytheon)
 Ornithopter (Prof. Shkarayev)
 Design Fuselage with Active Flow Control (Prof. Wygnanski)
 Design/Build/Fly (AIAA)
 Adaptive Wing Design (Boeing)
 Insect Flight (AME Dept.)
 Control System Simulator (AME Dept.)
 VTOL Platform (Prof. Wygnanski)
 Airplane Utilizing Active Flow Control (Prof. Fasel)
 Deployable UAV (ACR)

The panel of judges, whose number has varied over the years, rates the projects in six categories (Attachment 3.2.e) and eventually selects winners (the categories are given in Table 3.2.d). The winners receive monetary prizes funded by various companies. An example of the ratings by the panel of judges is given in Table 3.2.d. In Spring 2003, eight judges rated the three projects in the six categories. The ratings (averaged over the judges) vary considerably from project to project: the “AIAA Design Build Fly” and “Micro Air Vehicle Small” projects were rated highly in all categories, especially in creativity of design and degree of difficulty. In general, industry feels that the projects are appropriate and that the activities constituting the design process are done well by the students. The corresponding results for Spring 2004 are given in Table 3.2.e. As the summaries show, there is considerable variation from semester to semester. The relevant committees of the AME department will have to come to a better understanding of how much of this variation is systematic so that the capstone design course can be adjusted accordingly.

3.2.5. Student Course/Instructor Evaluations

The Office of Assessment and Enrollment Research of the University (AER) supplies the Teacher-Course Evaluations (TCE) filled out by students near the end of each course. Guidelines for understanding and interpreting the survey are given in the AER report at http://aer.arizona.edu/AER/teaching/Guide/TCEGuide.asp. Faculty may choose either the Short Form or the Long Form of the TCE questionnaire. The Short Form contains a small core of eleven questions suitable for use in a summative evaluation, along with six questions about student demographics. The Long Form contains the same questions plus more specific questions designed to provide detailed feedback. A sample Short Form, which is used by most of the faculty in Aerospace


Table 3.2.d Continuing assessment survey: senior aerospace engineering design projects (Spring 2003). Entries are ratings averaged over the judges who scored each project (out of 5.00); the final column is the overall average reported for each category.

Category                        AIAA Design   Micro Air       MAV Launch      Overall
                                Build Fly     Vehicle Small   from RC Plane
Degree of Difficulty            4.33          3.67            3.00            3.33
Creativity of Design            4.00          4.00            4.00            3.56
Quality of Design               3.67          3.67            3.00            3.11
Quality of Hardware             3.67          3.83            2.50            3.06
Level of Engineering Analysis   3.50          3.83            3.00            3.11
Quality of Presentation*        3.94          3.55            3.33            3.24

*Three categories (presentation, poster, student responses to questions) were averaged.

Judge 1: Nathan Adams, Boeing Helicopter Company
Judge 2: Erik Novak, Veeco Instruments
Judge 3: Gary Spangenberg, Sargent Controls and Aerospace
Judge 4: Brian Perry, Raytheon
Judge 5: Ed Lake, Lockheed Martin
Judge 6: Devon Campbell, Ventana Medical Systems
Judge 7: Bo Faser, Lockheed Martin
Judge 8: Douglas McClellan, University Medical Center
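The per-project averages in Table 3.2.d are means over only the judges who scored that project (not all eight judges rated every entry). A minimal sketch of that computation; the placement of the missing scores (`None`) is hypothetical, but the three numeric ratings reproduce the 13.00 total and 4.33 average shown for Degree of Difficulty of the AIAA Design Build Fly project:

```python
# Average a project's ratings over only the judges who responded.
# Which judges abstained is hypothetical; the scores (5, 4, 4) match
# the Degree of Difficulty row for "AIAA Design Build Fly" above.
def project_average(ratings):
    """ratings: one entry per judge; None means no score submitted."""
    scored = [r for r in ratings if r is not None]
    return sum(scored), round(sum(scored) / len(scored), 2)

ratings = [5.0, 4.0, 4.0, None, None, None, None, None]  # eight judges
total, avg = project_average(ratings)
```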


Table 3.2.e Continuing assessment survey: senior aerospace engineering design projects (Spring 2004).


and Mechanical Engineering, is given in Attachment 3.2.f. The core questions include “overall questions” about the course and about the instructor:

1. What is your overall rating of this instructor’s teaching effectiveness?
2. What is your overall rating of the course? (Q2)
3. How much do you feel you have learned in this course? (Q3)
4. What is your rating of this instructor compared with other instructors you have had?

The evaluation is of moderate importance for the assessment of the department’s educational objectives. AER research shows that there are sources of systematic variation, or even bias, that should be considered (disciplinary differences, course level, and course size). However, the mean over a long period (especially when the course has been taught by few instructors) can highlight problems that are independent of the instructor, such as a poor text. In this respect, the TCE survey may be used to indicate aspects of a course that could be improved. Students answer Q2 and Q3 on a scale of 1 to 5, where 5 = “one of the best” and 1 = “one of the worst” for Q2, and 5 = “an exceptional amount” and 1 = “almost nothing” for Q3. Averages of the responses to questions 2 and 3 (Q2 and Q3, respectively) for all the courses taught in the AME department over the period Fall 1998-Fall 2003 are presented in Figure 3.2.q. The students’ average rating for AME courses (Q2) is above 3.5, and the rating for the amount learned from a course (Q3) correlates well with the measure for Q2.

3.2.6. Fundamentals of Engineering Examination

One standardized measure of the performance of the Mechanical Engineering program is the success rate of its students on the Fundamentals of Engineering (FE) Examination administered by the National Council of Examiners for Engineering and Surveying (NCEES). Table 3.2.f compares the success rate of AME students taking this examination to the success rate of their peers across the nation for the two types of examinations available. The figures shown are averaged over six examinations administered between October 2000 and October 2003. Some additional data and discussion related to the performance of the University of Arizona AME seniors in these examinations are given below.

• A total of 33 students took these examinations over the specified period, and 31 of them passed. This translates into an overall pass rate of 94 percent, compared to an average pass rate of 88 percent nationally for the six examinations.
• The number of AME students taking the exam over this period (October 2000-October 2003) is small, despite the encouragement given to seniors to take the examination. The advising and senior check processes are appropriate times to remind students to take the examination. Also, on average, this number has not varied much over the years (12 students took the examination in 1996-1997).
• The success rate of AME students has remained relatively constant over the years (92 percent in 1996-1997).
• No systematic weaknesses are observed in the performance of AME students in individual subject areas of the FE examination.
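The overall pass rate quoted above follows directly from the counts; a minimal sketch of the arithmetic (the counts of 33 examinees and 31 passes are taken from this report):

```python
# Pass-rate arithmetic for the FE examination results quoted above.
# The counts (33 examinees, 31 passes) are taken from this report.
examinees = 33
passes = 31
pass_rate = 100 * passes / examinees
print(f"AME overall FE pass rate: {pass_rate:.0f}%")  # 94%
```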

Page 52

Aerospace Engineering

[Bar charts: mean TCE responses to Q2 and Q3 (scale 0-5) for AME 195D, 210, 230, and 250, and for AME 300, 301, 302, 320, 321, 323, 324, 324a, 324b, 331, and 352, Fall 1998-Fall 2003.]

Figure 3.2.q Averaged responses to question 2 (5 = one of the best, 1 = one of the worst) and question 3 (5 = an exceptional amount, 1 = almost nothing) on the Teacher-Course Evaluations (Fall 1998-Fall 2003).


Figure 3.2.q—Continued.

[Bar charts (continued): mean TCE responses to Q2 and Q3 (scale 0-5) for AME 400, 401, 410, 412a, 412b, 416, 420, 422, 424, 425, 427, 428, 430, 431, 432, 433, 442, 443, 445, and 452, and for AME 455, 456, 460, 461, 461a, 462, 463, 466, 472, 474, and 495s, Fall 1998-Fall 2003.]

Table 3.2.f Percentage of examinees passing from October 2000 to October 2003 (six examinations).

                    General PM Exam(a)          Mechanical PM Exam
                    AME      Nationally         AME      Nationally
                    94       87                 93       88

(a) The exams given in the afternoon (PM) are discipline-specific or general. Those given in the AM are common to all disciplines. AME students can take a discipline-specific exam or the general exam.

3.2.7. Review and Assessment by Industrial Advisory Council (AME Advisory Board)

The transition from the previous Industrial Advisory Council (IAC) to the new AME Advisory Board has been described elsewhere in this report (Section B.2.2.2). The IAC provided valuable insight that resulted in a number of improvements described in Section B.2.2.2, including developing a ProE laboratory course, adding CNC machining capability, and raising money from industrial partners. The IAC was not convened for an annual meeting in the spring of 2002, when AME had an Interim Department Head (Dr. Ganapol), nor in the spring of 2003, during the first year of the present Head’s tenure. The issue of re-configuring the IAC, as recommended by the Academic Program Review report, has been addressed by the new Head (McGrath), and the new AME Advisory Board will convene during the summer of 2004. While it clearly would have been preferable to include industrial representation in formal committee form (IAC or AME Advisory Board) during 2002 and 2003 for annual curriculum assessment, there has been ongoing contact with 5 of the IAC members during the past 2 years. It should be noted that the planned meeting of the new AME Advisory Board in the summer of 2004 is in full accordance with our defined 3-year cycle for assessment of our educational objectives. The contact with IAC members outside the context of the IAC meetings included: Raytheon (Isadore Davis, Gary Burke, and Brian Perry), Competitive Engineering (Don Martin), Advanced Ceramics Research (Tony Mulligan), and Sargent Controls (Manny Teran).
This contact has resulted in: (a) engineers from industry teaching AME courses and bringing industrial perspectives to bear (Raytheon: Crespo and Sobel); (b) a proposal submitted to Raytheon to enhance the AME machine shop (donation of equipment and tools); (c) development of a Manufacturing Engineering option program within the ME degree program in cooperation with Raytheon, Competitive Engineering, and Pima Community College; and (d) discussion of a partnership with Pima Community College to teach machine shop courses to AME students (Competitive Engineering). The AME assessment committee (i.e., the Undergraduate Studies Committee) has been involved in discussing all of these issues and interactions. Interactions with companies other than those in the former IAC have also been ongoing—in particular with Ventana Medical Systems (Kendall Hendrick) and Sebra (Loren Acker). Among other benefits, this interaction has helped with regard to supporting student design projects and linking students with industry for employment.

3.2.8. Performance Assessment from Industry

Representative employers of our graduates are requested to fill out a performance survey every three years. Presented as Attachment 3.2.g, this survey specifically requests information on our graduates in the areas of technical ability, communication and professional growth, and eagerness to engage in life-long learning. Both the Department Head and the Undergraduate Studies Committee review the feedback. A summary of the survey results for 2001 is presented in Figure 3.2.r and for 2004 in Figure 3.2.s.


TECHNICAL ABILITY: How satisfied are you with the engineer’s: (a) ability to identify, formulate and solve engineering problems using available tools? (b) ability to design and conduct experiments? (c) ability to analyze and interpret data/information? (d) ability to apply knowledge of mathematics, including probability and statistics? (e) ability to apply knowledge of physical science?
[Bar chart; mean ratings: (a) 3.57, (b) 3.71, (c) 3.86, (d) 3.29, (e) 3.57.]

COMMUNICATIONS AND PROFESSIONAL GROWTH: How satisfied are you with the engineer’s: (a) ability to prepare and give oral presentations? (b) ability to write reports? (c) ability to grow professionally? (d) understanding of ethical responsibility? (e) understanding of the value of diversity among employees?
[Bar chart; mean ratings: (a) 3.43, (b) 3.00, (c) 3.57, (d) 3.57, (e) 3.14.]

EAGERNESS TO ENGAGE IN LIFE-LONG LEARNING: How satisfied are you with the engineer’s: (a) ability to function on multi-disciplinary teams? (b) ability to learn new skills or methods? (c) ability to apply knowledge of mathematics, including probability and statistics? (d) ability to solve problems with creativity and innovation? (e) ability to demonstrate technical competency in an appropriate field? (f) ability to change to changing job requirements?
[Bar chart; mean ratings: (a) 3.71, (b) 3.86, (c) 3.29, (d) 3.57, (e) 3.71, (f) 3.43.]

Figure 3.2.r Performance assessment (2001) of AME graduates made by industry representatives: Advanced Ceramics Research, Competitive Engineering, Inc., Honeywell Engines and Systems, Intel Corporation, Raytheon Missile Systems Company, Sargent Controls and Aerospace, and Veeco Process Metrology (4 = Very Satisfied; 1 = Dissatisfied).


TECHNICAL ABILITY: How satisfied are you with the engineer’s: (a) ability to identify, formulate and solve engineering problems using available tools? (b) ability to design and conduct experiments? (c) ability to analyze and interpret data/information? (d) ability to apply knowledge of mathematics, including probability and statistics? (e) ability to apply knowledge of physical science?
[Bar chart; mean ratings: (a) 3.56, (b) 3.00, (c) 3.44, (d) 3.44, (e) 3.44.]

COMMUNICATIONS AND PROFESSIONAL GROWTH: How satisfied are you with the engineer’s: (a) ability to prepare and give oral presentations? (b) ability to write reports? (c) ability to grow professionally? (d) understanding of ethical responsibility? (e) understanding of the value of diversity among employees?
[Bar chart; mean ratings: (a) 3.67, (b) 3.22, (c) 3.22, (d) 3.22, (e) 3.11.]

EAGERNESS TO ENGAGE IN LIFE-LONG LEARNING: How satisfied are you with the engineer’s: (a) ability to function on multi-disciplinary teams? (b) ability to learn new skills or methods? (c) ability to apply knowledge of mathematics, including probability and statistics? (d) ability to solve problems with creativity and innovation? (e) ability to demonstrate technical competency in an appropriate field? (f) ability to change to changing job requirements?
[Bar chart; mean ratings: (a) 3.56, (b) 3.56, (c) 3.33, (d) 3.44, (e) 3.44, (f) 3.44.]

Figure 3.2.s Performance assessment (2004) of AME graduates made by industry representatives: Advanced Ceramics Research, Inc.; GLHN Architects and Engineers; Infrared Labs.; Kahr Bearing Division of Sargent Controls; ReliaSoft; Sandia National Labs.; Ventana Medical Systems; 2 anonymous (4 = Very Satisfied; 1 = Dissatisfied).
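Where the 2001 and 2004 survey means are compared in the discussion that follows, the arithmetic is simple differencing of the per-item ratings. A minimal sketch for the technical-ability items (the values are read off Figures 3.2.r and 3.2.s; the item-to-rating pairing is inferred from the charts and the surrounding discussion):

```python
# Per-item change in the industry Performance Assessment means,
# technical-ability items (a)-(e). Values are read off Figures 3.2.r
# (2001) and 3.2.s (2004); the item-to-rating pairing is inferred.
ratings_2001 = {"a": 3.57, "b": 3.71, "c": 3.86, "d": 3.29, "e": 3.57}
ratings_2004 = {"a": 3.56, "b": 3.00, "c": 3.44, "d": 3.44, "e": 3.44}

for item in sorted(ratings_2001):
    delta = ratings_2004[item] - ratings_2001[item]
    print(f"({item}) {ratings_2001[item]:.2f} -> {ratings_2004[item]:.2f} ({delta:+.2f})")
```

Item (b), the design and conduct of experiments, shows the 3.71 to 3.00 drop highlighted in the text below.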


Interestingly enough, with reference to the 2001 survey, industry feels very positive about our graduates working on multi-disciplinary teams (second highest rating, at 3.7/4.0) while feeling less strongly (3.1/4.0) about their communication ability (just about the reverse of the data in Figure 3.2.d, i.e., students rate communication as being the most important to their careers and working on multi-disciplinary teams as being less important). Clearly, each of the assessment tools provides somewhat different information, and it is important to fully integrate all results into a composite picture in order to identify the most important areas that need improvement. This survey indicates that the preparation of the students in mathematics (and two related areas), while still good (82%), should be examined by the department. It is no secret that the mathematics department does not provide a strong teaching effort in relation to the four service courses that form the foundation of college-level engineering mathematics. Industry feels that AME students are well prepared technically, can function as part of a team, and are able to learn on their own. On the other hand, the 2004 survey indicates marginally better scores for communication in oral and written reports (and lower scores for ethical responsibilities and issues related to diversity). A most striking result is the significantly lower rating (from 3.71 to 3.00) for the design and conduct of experiments. Also, questions related to life-long learning received lower scores. Because the data base is small, it would be unwise to assign considerable weight to these results at this time. Nevertheless, there are indicators that certain aspects of the program must be carefully monitored; this illustrates the usefulness of the assessment and feedback process of accreditation. During the fall semester, the 2004 data will be examined by the Undergraduate Studies Committee in light of the 2001 data.

3.2.9. Faculty (Undergraduate Studies Committee) Assessment of Curriculum

The creation and use of a self-assessment form is under consideration. The instructor for each course would fill out such a form to indicate the “big picture” and to provide some feedback on “what happened” in the course. Students do this via the course evaluation forms, but it is very important to have the complementary information from the faculty. A brief form is proposed that addresses:

• Student preparation in mathematics.
• Student preparation in basic science courses.
• Student preparation in prerequisite engineering courses.
• The extent to which course objectives were achieved.
• What changes should be made in the course.
• General impressions.
• Recommendations (especially in senior-level courses) that would provide a competitive edge to our graduates.

3.2.10. Academic Program Review

The documentation for the Academic Program Review (APR) is contained in two written reports: the Self-Study Report prepared by the AME department, and the Review Report authored by the members of the visiting team. These reports are available to the ABET visitor upon request to the department administration.


The APR, mandated by the Arizona Board of Regents, occurs on a seven-year cycle. The last review took place during March 19-21, 2001. A summary report of the findings is transmitted to the Regents; both of the detailed written reports are transmitted to the various administrative levels within the University, all the way up to the Provost’s office. The visiting team consisted of three distinguished academicians (NAE members), members from industry and alumni (2), and non-AME faculty members from the University (2). The reports deal with the academic programs (Aerospace and Mechanical, undergraduate and graduate), faculty, staff, research, facilities (undergraduate and research), and the administration of the department. The visiting team (committee) is “unanimous in the opinions and conclusions presented” in their Review Report. The issues most relevant to the discussion herein are summarized below in direct quotations:

Strengths

• “We find general dedication to teaching excellence among most of the faculty, including those who are active in research as well as faculty who are not.”
• “The [undergraduate] programs are well regarded by students who were interviewed. The students feel that they benefited from a ‘common sense’ approach in which they had been inculcated by certain highly regarded teachers.”
• “The department is currently housed in excellent facilities in the new building, with ample space for the teaching and research programs.”
• “The support staff is very well qualified, highly motivated, and satisfied.”

Weaknesses

• “The involvement of undergraduates in the active research programs is somewhat limited in scope and numbers.” We note that the students came to a similar conclusion (Figure 3.2.j).
• “…might require increasing the number of regular faculty positions, which is currently rather low relative to the size of the educational programs.”
• “The Aerospace degree curriculum is in need of an introductory course in heat transfer.” We might point out that such a course is available as a technical elective (AME 432) from mechanical engineering.

Finally, the report provides recommendations and comments on strategic issues. These are especially important because “there is not a shared vision of the future by the department faculty.” The departmental By-Laws “appear to be divisive, rather than unifying.” In preparation for the appointment of a new (external) department head, it is important that the AME senior faculty “demonstrate leadership for the good of the whole” and the “Dean needs to spell out exactly what is expected of ‘productive faculty’.” As mentioned elsewhere in this report, the APR report also suggests that the responsibilities of Associate Head(s) should be enlarged, that the IAC responsibilities be broadened, and that new directions and external linkages be created/enhanced. The By-Laws have been modified, the Department Head has been working with senior faculty to demonstrate leadership, the Dean was


invited to an AME Faculty Meeting to discuss the College of Engineering vision, and a new department administrative structure has been implemented that enlarges the responsibilities of the Associate Head for Undergraduate Studies and the Associate Head for Graduate Studies and Research. Finally, a new AME Advisory Board has been appointed that has broader responsibilities than the former IAC, and external linkages have been created with the University Medical Center and re-emphasized with the Optical Science Center and the Lunar and Planetary Laboratory.

3.2.11. Job Placement Data

One measure of the success of the Aerospace and Mechanical Engineering programs is the success of graduates in obtaining employment. Unfortunately, employment success is more a function of the economic situation than it is of the department. We obtained a summary of the employment of our graduates from the alumni survey. This summary is provided in Table 3.2.g. There were a total of 64 responses to the 2002 and 2003 surveys (not all supplied employment data). An analysis of these responses shows where students who graduated over the last 5 years are working. Not surprisingly, the largest single employer of AME graduates is Raytheon Missile Systems Company in Tucson. Several large companies have continued to recruit here over the past several years (Table 3.2.h).

Table 3.2.g. Summary of employment of AME graduates.

Company | Title
Accenture | Consultant
Advanced Controls Corp | Sales Engr
Aerospace Corp | Member Tech Staff
Alcon | Assoc Engr
Arizona Electric Power Co-op | Results Engr I
ASE Technologies | Project Engr
Bechtel Nevada | Engr
Bell Helicopter | Engr
Boeing Helicopters | Hydraulics Engr
Brown and Caldwell | Assoc Engr
BWM Builders LLC | Member
Chicago Bridge and Iron | Design Engr
Cienega High School | Math Teacher
Department of Defense | Aero Engr
Department of Defense | Test Director
Draka Elevator Products Inc | Process Engr
Galgano and Burke | Attorney
Honeywell International | Systems Engr
Industrial Automation | Service Engr
Kaman Aerospace Corp | Opto Mech Engr
Lockheed Martin | Systems Engr
Lockheed Martin | Mech Design Engr
Lockheed Martin | Aero Design Engr
Lockheed Martin | Instrumentation Engr
Motorola BCS | Manufacturing Engr
NASA Langley | Aero Engr
Nissan Technical Center | Durability Engr
NOAO | Mech Engr
Raytheon | Mech Engr
Raytheon | Sr Systems Engr
Raytheon | Engr II
Raytheon | Multidisciplined Engr I
Raytheon | Systems Engr II
Raytheon | Systems Engr
Raytheon | Mfg Engr
Simma-ASD | Applications Engr
Simplex Grinnell | Bldg Systems Sales Rep
Struble-Welsh Engineering | Sr Engr
Talley Defense Systems, Inc. | Sr Weapons Systems Engr
Teradyne, Inc. | Mech Engr
TRW | Member Of Tech Staff
Tucson Rubber Co | Plant Engr
Unemployed (3) | (n/a)
University of Arizona | Operation Manager
University of Pittsburgh | Resident Physician
US Navy | Pilot
Ventana Medical Systems | Sr Mech Design Engr
Visteen Corp | Product Engr
Vroom Eng & Mfg | Mech Engr
W. L. Gore and Assoc | Quality Engr
Western Digital | Sr Engr, Mfg
Western Welding Co | Foreman


Table 3.2.h Sample companies conducting on-campus recruitment of Aerospace Engineering students.

AFG Industries, BNSF Railway, Boeing Company, Dietrich Metal Framing, EXXON/Mobil, Honeywell, Jet Propulsion Labs, KLA-Tencor, Lockheed Martin, Raytheon, TRANE, USG Corporation

3.2.12. Life-Long Learning

A most important message to communicate to students is that they must learn; teaching by itself is not adequate. This is a message that our graduates must take to the workplace—they must become independent and willing learners, on their own, if they want to be successful. In today’s rapidly changing technical environment, “it is learn or perish.” The AME department has taken a number of innovative steps to emphasize the importance of life-long learning:

• The faculty were given an opportunity to have their syllabi reviewed and analyzed by a librarian and received feedback describing how information literacy could be seamlessly incorporated into the course content and then measured.
• A collaborative effort, involving the Instructional Development and Assessment Specialist (Barbara Williams) in the Learning Technologies Center, the coordinator of ENGR 102, and the library liaison, to develop a tutorial designed to teach life-long learning skills is now in the planning stages. The selling point of this tutorial is that every learning objective will be measurable, to make sure we are producing life-long learners. The approach is to suggest a set of basic life-long learning skills common to all 100/200-level courses and another set for 300/400-level courses.

3.2.13. Summary of Assessment Results

A number of assessment tools and results have been presented. Prior to considering how improvements have been made as a result of assessment activities and what remains to be accomplished, it is worth summarizing and providing a succinct interpretation of what has been learned in light of the defined Learning Outcomes and objectives. An important consideration is that the sample size for several of the assessment surveys is relatively small.


Learning Outcomes

Figure 3.2.c represents the results of the Senior Exit Survey describing how well seniors perceive that they have mastered our defined Learning Outcomes and ABET criteria a-k. It is recognized that there are limits to self-perception of what has been learned. As direct evidence that our students have demonstrated achievement of these criteria and Learning Outcomes, examples of student work in specific classes will be available during the ABET visit. Table 3.1.c maps individual courses into the Aerospace Engineering Learning Outcomes. Consequently, examples of student work will be available at the time of the ABET visit to demonstrate student achievement of the ABET criteria and the Learning Outcomes defined below.

• Can integrate knowledge of mathematics, science, and engineering to model and analyze problems [3a-c, 3e]

In the Senior Exit Survey, Aerospace Engineering students graduating from the AME program indicate that they perceive that they have been taught best in the areas of applying math/physics and solving engineering problems. AME alumni agree with graduating seniors that they are best prepared in these areas. AME student performance on the Fundamentals of Engineering Examination exceeds national averages and suggests that AME students are well prepared to integrate knowledge of mathematics, science, and engineering to model and analyze problems. On the other hand, feedback from industry (Performance Assessment Survey) suggests that industry would like to see improvement in AME alumni performance in the areas of math/probability/statistics. The input for this learning objective from all other assessment tools identified in Table 3.2a did not identify additional concerns or issues to address.

Interpretation/Conclusion/Past Actions: This is an area of mixed responses. On the one hand, graduating seniors feel that they are well prepared in this area, and the results obtained from a national examination suggest that they are. Some, but not all, assessment results from industry suggest that AME students could be better prepared in the areas of math/probability/statistics. In contrast, alumni with industrial experience feel that they are strong in this area. No action has been taken to make changes, since it is not clear at this point what specific changes should be implemented.

Future Actions: Future surveys should be refined to identify what the specific shortcomings are. It may be that industry is satisfied with mathematics preparation but would like to see improvement in probability and statistics. It is also important to gather a larger data set; the sample size from industry should be enlarged. The AME department should examine its own delivery of engineering mathematics courses and communicate with the Mathematics Department as appropriate to address specific shortcomings once they have been identified.

• Can use state-of-the-art resources to solve engineering problems [3a-c, 3e, 3i, 3k]

AME alumni agree with graduating seniors that they are best prepared in the areas of applying math/physics and solving engineering problems. It is recognized that the definition of “state-of-the-art” will be different for different industries. However, our database spans a range of industry that includes high-technology employers (e.g., Raytheon), such that any significant deficiencies of our students with respect to “state-of-the-art” resources would be identified. Our surveys (Senior Exit Survey, Alumni Survey, and the industry Performance Assessment Survey) specifically address the use of “modern tools” (ABET item k). Industry input has not identified any concerns suggesting that AME alumni are not capable of using state-of-the-art tools to solve engineering problems. The input


for this learning objective from all other assessment tools identified in Table 3.2a did not identify additional concerns or issues to address.

Interpretation/Conclusion/Past Actions: AME students are well prepared in this area. However, feedback from local industry, via the Industrial Advisory Council, suggested that student abilities in this area could be further strengthened by introducing an elective course combining CNC machining and an introduction to computer-aided design software (ProE). Such a course was created and offered with the help of industry. Due to budget cuts, the CNC machining aspect of the course was dropped; the course is now jointly offered with Agricultural and Biosystems Engineering. Based on student input, a new finite elements course (AME 463) was introduced, a major part of which is the use of ANSYS. The software tool MATLAB was incorporated into the course on numerical methods (AME 302), and department discretionary funds have been invested in this computer software for teaching purposes. In response to the Academic Program Review assessment, a lead Information Technology staff person (Systems Analyst, Principal) and support staff have been hired to improve computer support to students. A significant investment of state budget money (~$100,000) was made two years ago to create the Computer Teaching Center. Course fees have been implemented in ABE 320 (formerly AME 210) that support course software. An additional investment (~$20,000) was made last year in establishing an AME server room. Some $30,000 of discretionary funds was invested during the 2003-04 academic year to develop a laboratory component of AME 455; this investment was in the form of 24 computers and workstation kits to enable “hands-on” controls experience for students. Feedback from local industry, via the Industrial Advisory Council, also revealed the need for a manufacturing/rapid prototyping course. There is now a larger issue that involves the creation of a 2+3 program in manufacturing engineering with Pima Community College; AME, local industry, PCC, and other engineering departments within the college are the stakeholders. The plans for the 2+3 program should be finalized during the summer of 2004.

• Can apply engineering knowledge to design and build processes and systems [3a-c, 3e, 3k]

As Aerospace Engineering students graduate from the AME program, they tell us (Senior Exit Survey) that they perceive that they are very well prepared in applying engineering knowledge to design and build processes and systems. AME alumni agree with graduating seniors in this assessment. The input for this learning objective from all other assessment tools identified in Table 3.2a, including industry, did not identify additional concerns or issues to address. The AME program features a full-year capstone design experience and machine shop training. Unlike some programs at other universities, this experience includes the building phase (using the AME machine shop).

Interpretation/Conclusion/Past Actions: AME students are well prepared in this area, and no corrective action has been necessary. Feedback from local industry, via the Industrial Advisory Council, greatly increased the sources (and resources) for the capstone design projects, most of which now come from, and are sponsored by, industry. The evaluation of the projects is also performed by members of industry (local, Phoenix area, and Flagstaff).

• Can plan experiments, analyze data, and interpret results [3a-c, 3e, 3k]

As Aerospace Engineering students graduate from the AME program, they tell us (Senior Exit Survey) that they perceive that they are very well prepared to plan experiments, analyze data, and interpret results. AME alumni agree with graduating seniors in this assessment. None of the other assessment tools identified in Table 3.2a identified additional concerns or issues to address.


Interpretation/Conclusion/Past Actions: AME students are well prepared in this area. No corrective action has been necessary.

Future Actions: The input for this learning objective from the most recent industry survey (Spring 2004) indicated that alumni performance in planning and conducting experiments deserves attention. The sample size should be increased, specific details defining shortcomings should be sought, and a survey should be undertaken within the next year to address this issue.

• Can communicate effectively (oral and written) [3c, 3d, 3g, 3h]

As Aerospace Engineering students graduate from the AME program, they tell us (Senior Exit Survey) they perceive that they are well prepared to communicate effectively. They also indicate that this skill is one of the most important to their careers. AME alumni express a stronger sense of confidence in their communication skills than graduating seniors. Feedback from industry (Performance Assessment Survey) suggests that industry would like to see improvement in AME alumni performance in communication skills. The input for this learning objective from all other assessment tools identified in Table 3.2a did not identify additional concerns or issues to address.

Interpretation/Conclusion/Past Actions: This is an area with conflicting feedback. Graduating seniors and alumni in the workplace feel that they can communicate effectively; industry representatives would like to see improvement. No action has been taken to date.

Future Actions: Future surveys and interactions should focus on industry, and they should be designed to identify what specific areas of communication skills should be improved. The means of making those improvements should be identified and implemented. The number of companies surveyed should be enlarged to enhance the confidence level in the data sampled.

• Can function in multidisciplinary teams [3c, 3d, 3f-h, 3j]

As Aerospace Engineering students graduate from the AME program, they tell us (Senior Exit Survey) they perceive that they are very well prepared to function in multidisciplinary teams. AME alumni express a stronger sense of confidence in their ability to function in multidisciplinary teams than graduating seniors. Feedback from industry (Performance Assessment Survey) supports the alumni perception that AME graduates perform very well as team members. The input for this learning objective from all other assessment tools identified in Table 3.2a, including industry, did not identify additional concerns or issues to address.

Interpretation/Conclusion/Past Actions: AME students are well prepared in this area. However, the AME faculty recently approved the concept of coordinating the department capstone design class (AME 412a/b) with the College of Engineering design class (ENGR 498a/b). The latter class is specifically a multidisciplinary experience; the former may or may not be explicitly multidisciplinary. This coordination involved significant scheduling changes within the AME curriculum for the purpose of allowing AME students to choose AME 412a/b or ENGR 498a/b design projects. This provides enhanced opportunities for a multidisciplinary experience for those students seeking it.

• Can exercise professional, ethical, and social responsibilities and engage in life-long learning [3f, 3h-k]

As Aerospace Engineering students graduate from the AME program, they tell us (Senior Exit Survey) they perceive that they are well prepared to exercise professional, ethical, and social


responsibilities and engage in life-long learning. AME alumni feel that they are weaker in the “soft” aspects of their engineering education compared to the “hard” aspects. However, none of the assessment tools targeted to industry have identified shortcomings in our graduates in relation to this learning outcome. Indeed, we have industry input that suggests that our graduates do conduct themselves in a professional and responsible manner and continue to be active learners in the workplace. The input for this learning objective from all other assessment tools identified in Table 3.2a did not identify additional concerns or issues to address.

Interpretation/Conclusion/Past Actions: The balance between “soft” and “hard” aspects of the Aerospace Engineering education is reasonable. AME students are well prepared in this area. No corrective action has been necessary.

Educational Objectives

We believe that we are able to measure aspects of our Learning Outcomes using multiple assessment tools and that the data obtained from these tools demonstrate that the students in the Aerospace Engineering program have achieved the Learning Outcomes. Due to the explicit mapping between the Learning Outcomes and the Educational Objectives (Table 3.1.a), we submit that the integrated success of achieving the Learning Outcomes provides evidence that the Educational Objectives are being met successfully.

Additional Feedback from Assessment Tools

Our assessment tools provide us with feedback in addition to that directly tied to the Learning Outcomes. This information relates to the facilities, personnel, and services that affect our objectives and learning outcomes. In particular, our assessment tools provide feedback concerning academic standards, student advising, faculty, staff, teaching assistants, teaching laboratories (physical and computer), and the design experience.

Some 75-82% of graduating Aerospace Engineering seniors rate the quality of their experience with Design, Computer Labs, Faculty, Office Staff, and Machine Shop Staff as “good” or “excellent.” They report less satisfaction with TAs, Advising, and Physical Labs; these areas therefore warrant attention.

The alumni (Alumni Survey; College of Engineering survey) agree with graduating seniors in most respects. The alumni feel that the AME department has very high academic standards, and they agree with the graduating seniors that faculty and staff rate very highly. They also agree with seniors that advising, TAs, and laboratories rate somewhat lower than other areas. The alumni (Alumni Survey) also specified that they want more hands-on and “meaningful” experiences in the physical and computer labs. The alumni report that the AME program provided them with good opportunities for participating in extracurricular activities, but they would welcome more opportunities to participate in research, independent studies, and internships. They also suggest adding business course(s) to the curriculum. The response to this additional feedback is described below.

Focal Areas for Improvement

The data described above suggest that the focal areas for improvement are: communication skills, math/probability/statistics, student advising, TAs, laboratory experiences, “soft” skills, business courses, research experience, independent studies, and internships. In addition to these areas, we identify improvements, both currently being implemented and planned, to the assessment and continuous improvement process itself.


3.2.14. Improvements as a Result of Assessment Activities

Communication Skills

No action has yet been taken to improve the communication skills of our students. While there is some indication from industry that stronger communication skills would be welcomed, it is not clear at this time that this is a significant issue. This matter needs to be discussed with the new AME Advisory Board, and future surveys of industry need to be refined to identify any specific shortcomings. It is also important to gather a larger data set; the sample size from industry needs to be enlarged.

Math/Probability/Statistics

No action has yet been taken to improve the math/probability/statistics skills of our students. While there is some indication from industry that stronger skills in this area would be welcomed, it is not clear at this time that this is a significant issue. This matter needs to be discussed with the new AME Advisory Board, and future surveys of industry need to be refined to identify any specific shortcomings. It is also important to gather a larger data set; the sample size from industry needs to be enlarged. There may also be an important distinction, not yet identified, between mathematics on the one hand and probability and statistics on the other.

Student Advising

In response to the Academic Program Review and student surveys, a new administrative structure has been defined in AME that includes the appointment of an Associate Department Head for Undergraduate Studies and an advising-trained Aerospace Engineering PhD student. This change was initiated to emphasize the importance of undergraduate studies in general and of student advising in particular. It also represents an effort to improve the quality of student advising by teaming a faculty member who is an award-winning teacher with a PhD student advisor who “bridges the gap” between the students and the faculty. Since our past surveys probe only seniors and alumni, it is too early for our current assessment tools to reveal the impact of this change. Anecdotal feedback from a spectrum of current students suggests that the change is well received.

In the spirit of continuous improvement, we will implement two improvements to the current assessment method for evaluating student advising. First, we will collect data from freshmen, sophomores, and juniors, as well as seniors. This will produce more current data and potentially allow us to distinguish problems specific to particular stages of a student’s academic career. Second, we will create a survey designed to identify specific problems and invite students to suggest solutions.

The new advising team is in the process of implementing Web-based advising materials. The first aspects of this development should appear on the AME website during the summer of 2004, with further development planned for the 2004-05 academic year.

Teaching Assistants

The Academic Program Review report recommended: “The assignment of some Teaching Assistants to various courses could benefit from additional attention by an experienced faculty or staff member. This could be part of the responsibility of an Associate Department Head.” A new administrative structure has been defined that includes the appointment of an Associate Department Head for Graduate Studies and Research. This appointment responds to the Academic Program Review assessment feedback that recommended appointment of associate heads
for the undergraduate and graduate programs—with significant responsibilities for both. As a result, more attention has been given to the selection of teaching assistants (TAs), their assignment to specific courses based on their skills, and the review of their evaluations. All international students who want to be TAs must pass a University of Arizona language competency test, and faculty supervisors evaluate their respective TAs each semester. These evaluations are reviewed by the Associate Department Head for Graduate Studies and Research and the Department Head, and TAs are reappointed or not on the basis of the quality of their evaluations.

Since the ratings of TAs in the Senior Exit Survey and Alumni Survey suggest that improvement can be made, there are evidently shortcomings identified by the students that are not being adequately captured by the faculty evaluations and other mechanisms in place. The current assessment process will therefore be improved by implementing a student survey of the TAs in each course to which a TA has been assigned. The TA surveys will be designed to identify what the problems are and to solicit suggestions for addressing them.

Laboratory Experiences

The Alumni Survey also specified that alumni want more hands-on and “meaningful” experiences in the physical and computer labs. In general, the computer and physical lab experiences are designed to be “meaningful” in the sense that they are created by faculty with some combination of industry experience, industry contacts, and active research programs. Two years ago, in response to this input and to input from industry, the AME computer infrastructure (Computer Teaching Center and AME Server Room) was improved using ~$100,000 of salary support associated with open faculty lines to purchase state-of-the-art computers and software. The AME Department also invested approximately $25,000 of discretionary funds to support the development of “hands-on” laboratory exercises to demonstrate DC motor control: twenty-four personal computers with micro-control programmers, breadboards, and micromechanical kits were purchased to provide 24 experimental workstations. Further opportunities to inject funding into the physical laboratories have not arisen.

Professor Enikov (with Professor Cuello) recently received a $100,000 grant from the Nanoscale Science and Engineering (NSE) Program, the Division of Engineering Education and Centers (EEC), and the Directorate for Biological Sciences (BIO) at the National Science Foundation, under a Nanotechnology in Undergraduate Education (NUE) Award, to develop a set of undergraduate laboratory experiments focused on the nano-scale surface science of biosensors. The proposed laboratory development is part of a multi-departmental master plan for the creation of a college-wide undergraduate curriculum on micro- and nano-technologies addressing the needs not only of undergraduate engineering students but also of students from other sciences, such as biosciences, optics, and physics. It is being designed with the mission of inspiring a broad spectrum of students to pursue scientific careers in nanotechnology.
While there is no reason to believe that AME students are inadequately prepared with respect to hands-on and “meaningful” lab experiences, there is always room for improvement. The new AME Advisory Board will be consulted on this issue, as will AME alumni and employers not represented on the Board, in order to define specifically where more hands-on and “meaningful” experiences are needed. Since there are no line-item state funds allocated for improvements in this area, we plan to work with our alumni, the AME Advisory Board, industry partners, and friends of the department to raise the necessary funds. Furthermore, it is expected that special individual course fees and/or more comprehensive differential tuition/fees for engineering students will be implemented to provide the funding required.


“Soft” Skills

While some alumni indicate that they do not feel as strong in the “soft” skills as in the “hard” aspects of their engineering curriculum, the graduating seniors and industry do not report problems in this area. No curriculum changes have been made in response to this alumni assessment input.

Business Courses

Alumni indicated an interest in taking more business courses. A formal opportunity to do so now exists. In March 2002, the Arizona Board of Regents approved a B.S. Engineering Management degree program at the University of Arizona that combines management courses and engineering classes to prepare graduates for positions requiring broader capabilities than those provided by a business or an engineering degree alone. The Engineering Management Program at UA can be combined with traditional engineering programs to yield a double major with an additional 30 credits of work, essentially the 30 credits of management coursework. Alternatively, the 18 credits of technical electives can be used to obtain a formal minor, designated on the degree, in one of the traditional engineering fields.

Research Opportunities

Research opportunities for students to interact with faculty on research grants are actively promoted by the undergraduate student advising team, both during orientation meetings and during meetings between advisors and individual students. The advice of the AME Advisory Board will be sought on means of supporting students who work on research projects. One mechanism could be to request re-instatement of the policy of the previous VP for Research, who returned the overhead associated with undergraduate support on sponsored projects to the Principal Investigator.

Independent Study Opportunities

Independent study opportunities for students to interact with faculty are actively promoted by the undergraduate student advising team. This is done during orientation meetings, during meetings between advisors and individual students, and during meetings between faculty advisors of student activities (such as AIAA) and the member students.

Internships

Internship opportunities are actively promoted by the undergraduate student advising team during orientation meetings and during meetings between advisors and individual students. Formal courses (AME 193, AME 293, AME 393, AME 493) are available in all four years of the curriculum, allowing students to receive academic credit for internships with industry. Previously, students were not allowed both to be paid for an internship and to receive academic credit; the new policy allows both. According to the AME Undergraduate Advisor, this new policy appears to have generated significant interest in internships, and the number of students participating has increased.

Assessment Process

The assessment process has been implemented and the loop has been closed, but further improvement is required. The elements of the process and their timing are shown in Figure 3.2.a. The constituents are defined, as are the vehicles used for assessment. The assessment structure is
comprehensive and appropriate. All features of the structure are implemented with one exception: the Undergraduate Advisory Committee, which will be created in fall semester 2004. All vehicles/assessment tools are in place and implemented, and all constituents are participating in the assessment process as described.

The assessment process needs to be further strengthened by closing the loop in a more complete, focused, and timely manner. In particular, while assessment data are being collected in a timely manner, the data have not been reviewed as rigorously and systematically as described in the assessment structure (Figure 3.2.a). While the Undergraduate Studies Committee has been active and successful in improving the undergraduate curriculum, it has not been charged with the primary responsibility of implementing the assessment structure shown in Figure 3.2.a. This committee will henceforth be charged with this responsibility. In particular, all assessment data will be directed to the Undergraduate Studies Committee, which will analyze the data, interpret them, and recommend actions to the faculty at least once per year.

The Undergraduate Studies Committee will work with the discipline-specific faculty subgroups to review the individual course assessment data and bring their findings to the faculty annually at either a faculty retreat or a faculty meeting focused on individual course assessment issues. The committee will also be responsible for integrating the individual course assessments into a curriculum assessment that will be brought to the faculty for review each year. Using the assessment data collected from all constituent contributions, the Undergraduate Studies Committee will work with the ABET Committee every three years to review the overall curriculum, the Learning Outcomes, and the Educational Objectives and bring these to the faculty for review and approval.
AME Plan to Maintain and Upgrade Physical Labs, Computer Labs, and Machine Shop

The department has an Undergraduate Laboratory Committee and a Computer Committee, as well as technicians and Information Technology staff involved in maintaining and upgrading physical and computer laboratories. The Computer Committee has been involved in department-level specification of hardware and software infrastructure needs. The Undergraduate Laboratory Committee has defined current needs in the three major AME teaching laboratories (AME 300, AME 400, and AME 401) and discussed a plan to upgrade the physical laboratories. Specific needs have been identified. Discretionary funds have been allocated to support Mr. Lou Willis to help in the AME 401 teaching laboratory.

Due to state budget cuts in the past several years, upgrading laboratory equipment and computers has been very challenging. Although tuition has been raised twice in the past two years, none of that money has been available at the department level for laboratory upgrades or other purposes. Consequently, the Undergraduate Laboratory Committee, the faculty, and Department Head have discussed alternative plans for upgrading teaching laboratories and the AME Machine Shop. This shop is used for teaching purposes, as well as research support (AME 413a/b). A proposal has been submitted to Raytheon requesting machine equipment and tools. A plan to implement a differential fee for engineering students has been developed at the college level. Due to recent tuition increases, the Arizona Board of Regents would not consider such a plan this past year. Efforts will continue to implement such a plan. If implemented in its current form, the level of support generated for AME would be approximately $125,000 to $150,000 per year. In the meantime, when approved by the University Fees Committee, modest course fees (