IEEE TRANSACTIONS ON EDUCATION, VOL. 43, NO. 2, MAY 2000


A New Course Evaluation Process

Kevin Scoles, Member, IEEE, Nihat Bilgutay, Senior Member, IEEE, and Jerene Good

Abstract—This paper traces the evolution of the course evaluation process in Drexel University’s College of Engineering as we adapt to the Accreditation Board of Engineering and Technology 2000 accreditation requirements. We discuss the goals of the change, the development of the format used, the evaluation procedures, and the successes and problems encountered. This process has also been supported by the Gateway Engineering Education Coalition.

Index Terms—Course evaluation.

I. INTRODUCTION

HISTORICALLY, course evaluations in the College of Engineering at Drexel University have been done on a voluntary basis, with no formal process for dissemination of the outcomes, nor any formal feedback loop for continuous improvement of the curriculum. The aim of this initiative is to develop a formal process that is implemented regularly and uniformly across the college, covering every course each term, with results documented, analyzed, and disseminated to students and faculty, ultimately resulting in a continuous quality improvement (CQI) process. Two key factors encourage the development of this process: evaluation of the new Drexel engineering curriculum, and the Accreditation Board of Engineering and Technology (ABET) 2000 criteria, which require outcomes-based assessment for accreditation.

With the recent updating of our upper-level undergraduate programs, we have seen a complete restructuring of the five-year, cooperative-education-based engineering curricula over the past decade [1]. The Drexel Engineering Curriculum (tDEC), which evolved directly from the E4 program [2]–[4], comprises the first two years of the new curriculum and is common to all engineering disciplines, with the exception of two discipline-specific courses. Subsequently, the individual departments restructured their upper-division curricula to build on this common engineering core by modernizing the disciplinary content and further strengthening the key elements introduced in the first two years (i.e., experiential learning, teamwork, a project- and design-oriented curricular focus, extensive use of computers, infusion of educational technology, etc.). The new Electrical and Computer Engineering curriculum, ECE21, is a representative example of this effort [5].

Most departments in the college have been doing some form of classroom assessment. The Chemical Engineering Department had been doing course evaluations for more than 20 years.

This work was supported in part by the Education and Centers Division of the Engineering Directorate of the National Science Foundation under Awards EEC-9109794 and EEC-9727413. The authors are with the College of Engineering, Drexel University, Philadelphia, PA 19104 USA. Publisher Item Identifier S 0018-9359(00)04482-4.

Their process is managed by a student honor society. The Electrical and Computer Engineering (ECE) Department had been doing evaluations voluntarily and on an as-needed basis (i.e., as required by the tenure and promotion processes). In the past year, the ECE Department has moved to a mandatory and uniform course evaluation process and has piloted the new Web-based course evaluation in the College of Engineering.

The course evaluation completed by students is just one piece of the college’s CQI plan. Other elements of the plan include:
• evaluation of the program objectives of each department and program (i.e., tDEC);
• course evaluation by students;
• course evaluation by faculty;
• senior exit interviews;
• alumni surveys;
• co-op employer feedback.

The goals of the new course evaluation process are:
• to adapt our course evaluation tools to an outcomes-oriented style, while retaining the course/faculty review content of the previous evaluation form;
• to phase in implementation over the 1998–1999 academic year, with 100% coverage of college courses in Fall 1999–2000.

This paper gives a historical background on how the course assessment process evolved, describes how it is delivered to students, and explains how the information will be used.

II. THE DEVELOPMENT PROCESS

A. Summer 1997–1998

The authors began developing a uniform format for assessment across the college in the Summer quarter of 1997–1998, with support from the Gateway Engineering Education Coalition. The starting point was a reevaluation of the questions then used on the Electrical and Computer Engineering Department course evaluation form. That instrument began with 17 multiple-choice questions (responses as letter grades A, B, C, D) that assessed the quality of the lectures, the student–faculty and student–teaching assistant (TA) relationships, and other topics, and ended with an area for written comments about the strengths and weaknesses of the course and the instructor. The evaluation was administered on a single photocopied 8.5 in × 11 in sheet and delivered to the instructor and department head for evaluation, without interpretation or calculation of averages. Evaluation results were not shared with students.

Assessment tools from other schools were also studied. One from the Cooper Union, a Gateway Coalition partner, was especially interesting [6]. On this form, students assessed a set of “core skills” that followed the spirit of the ABET 2000 Educational Criteria but were not a one-to-one match.


TABLE I THE COOPER UNION CORE TECHNICAL STATEMENT FORMAT

TABLE II DREXEL FORMAT FOR A TYPICAL CORE TECHNICAL STATEMENT FROM A THIRD-YEAR ELECTRONIC DEVICES COURSE, FALL 1998–1999

Students also did a self-assessment on a set of technical competencies, for which the instructor customized the questions for each course according to the course objectives. The questions were introduced as “During this course I was provided with the opportunity to learn and practice the following technical skills:” and were rated on a scale of 1 to 5, where 1 meant not at all and 5 meant to a very great extent (Table I).

It was determined that our goal of developing a new outcomes-oriented evaluation tool could be met by placing more emphasis on questions that determine how successful a course is in delivering the needed technical material and less emphasis on “popularity contest” questions. This was done by reducing the number of questions about the student–faculty and student–TA relationships and replacing them with questions that evaluate whether the technical objectives of the course had been met. We also wanted an evaluation procedure that would produce a more quantitative set of results in a simple way.

A three-part structure was used on the trial evaluation form. The first part extended the Cooper Union concept of evaluating the technical core of the course one step further: at the end of the term, students would retrospectively rate their competency in the core technical objectives of the course both before taking the course and after having taken it (Table II). This approach measures the students’ view of how well they learned the core technical objectives; the difference between the “before” and “after” responses is one measure of how much the students feel they have learned the core objectives of a specific course.
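To make the first part of the form concrete, the fragment below shows one way the retrospective “before” and “after” ratings for a single core technical statement could be reduced to a gain score. It is a minimal illustrative sketch, not the analysis actually used by the college; the response tallies and names are hypothetical.

```python
# Minimal sketch (not the college's actual analysis): summarizing one core
# technical statement from the retrospective before/after ratings described
# above. The response tallies below are hypothetical.

def mean_rating(counts):
    """Mean of a 1-5 rating, given counts[r] = number of students choosing r."""
    total = sum(counts.values())
    return sum(r * n for r, n in counts.items()) / total

# Hypothetical tallies for one statement, keyed by rating 1 ("not at all")
# through 5 ("to a very great extent").
before = {1: 14, 2: 9, 3: 5, 4: 2, 5: 0}
after = {1: 0, 2: 2, 3: 7, 4: 13, 5: 8}

gain = mean_rating(after) - mean_rating(before)
print(f"before = {mean_rating(before):.2f}, after = {mean_rating(after):.2f}, "
      f"gain = {gain:.2f}")
```

A positive gain, like the right-shifted distribution in Fig. 1, would suggest that students feel the objective was met.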

In the second part of the form, students would assess how their course related to ABET-defined assessment criteria. The wording in this section was taken directly from the ABET Engineering Criteria 2000, Criterion 3 [7], commonly known as items A through K. While no single course was expected to cover all of the criteria, all 11 were presented to the students for rating; criteria that did not pertain to a given course were simply ignored.

The third part of the form is the student assessment of the course and the instructor(s). It is presented using multiple-choice (responses A through E) and open-ended questions. The final list of ten questions was developed from a compilation of questions that had been used in the various departments of the college for several years. The collection was edited until it was felt that the essential points were covered with the minimum number of questions.

B. Fall 1998–1999

In the Fall quarter, the college created an Assessment Committee to look at the broader application of assessment techniques, to ensure their uniform, college-wide implementation, and to document the outcomes and resulting actions. The committee consists of the department head and a faculty member from each department; the director and a faculty member from tDEC, which manages the first two years of the engineering curriculum; and the college’s assessment officer.

The evaluation tool was pilot tested on five courses from the Electrical and Computer Engineering Department. The course evaluations were done during class time in the last week of the quarter, using a set of photocopied questions and a mark-sense form for recording answers to the multiple-choice questions. Responses to the open-ended questions were written on the question forms or on the backs of the mark-sense forms.


Fig. 1. Student responses to the core technical statement in Table II.

Analysis and display of the responses to the core technical questions (Fig. 1) show one way that outcomes can be assessed. The right-shift in the response distribution indicates that this outcome, in the students’ opinion, has been achieved.

C. Winter 1998–1999

In the Winter quarter, the prototype evaluation form was used in all 25 undergraduate courses in the ECE Department and in a number of courses in other departments of the college (six civil and architectural, two chemical, four materials, four mechanical, and one tDEC). While the results provided by the evaluation form were good, the paper-based evaluation required a large amount of labor and expense to produce, distribute, and collect the evaluation forms and to collect, score, and distribute the results. Scoring was done by Drexel’s Information Resources and Technology (IRT) group. While IRT provided rapid turnaround of results, the reporting format could not be customized, and data analysis was difficult.

D. Spring 1998–1999

The labor overhead and expense of the paper-based evaluation process prompted an interest in using the Internet for course evaluation, and the existing evaluation tool was converted to a Web-based form. Rather than developing our own Web-based evaluation process, the Assessment Committee obtained the software being used by Polytechnic University, another Gateway Coalition partner. With some modifications to the Web pages and the CGI (Perl) code, the Polytechnic survey method was adapted to our question and response format. Faculty submitted their course objective questions by e-mail, on a template that was provided; up to six questions were accepted. After the questions were inserted into a form, the resulting Web pages were uploaded to the Web host.

Evaluation questions for 31 engineering courses from four departments, representing 1252 student enrollments, were available for a period beginning two weeks before the end of the quarter and ending the week after final exams. A student would log into the system with their last name and social security number and was then presented with a list of courses for which they were registered and for which evaluation questions were available.
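The production system used CGI (Perl) scripts adapted from Polytechnic University’s software; the fragment below is only a schematic sketch of the login, course-list, and response-recording flow just described, written here in Python with hypothetical table and column names. Consistent with the privacy approach discussed later in the Discussion section, the sketch stores responses without any student identifier.

```python
# Schematic sketch only (the actual system used Perl CGI scripts adapted from
# Polytechnic University's software); table and column names are hypothetical.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE registration (last_name TEXT, ssn TEXT, course_id TEXT);
CREATE TABLE evaluation   (course_id TEXT, open_flag INTEGER);
CREATE TABLE response     (course_id TEXT, question_id TEXT, answer TEXT);
""")

def courses_for_student(last_name, ssn):
    """Return the student's registered courses that have an open evaluation.
    The SSN is used only for this lookup, as described in the text."""
    rows = db.execute(
        """SELECT r.course_id FROM registration r
           JOIN evaluation e ON e.course_id = r.course_id AND e.open_flag = 1
           WHERE r.last_name = ? AND r.ssn = ?""",
        (last_name, ssn)).fetchall()
    return [course_id for (course_id,) in rows]

def record_response(course_id, question_id, answer):
    """Store a multiple-choice or typed answer; note that no student
    identifier is stored with the response."""
    db.execute("INSERT INTO response VALUES (?, ?, ?)",
               (course_id, question_id, answer))
    db.commit()
```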

Multiple-choice and typed responses went directly into a database, and evaluation reports could be generated for faculty review from the stored data.

The Web-based survey format was apparently successful: students navigated the Web interface and the survey instrument with few skips and no logically inconsistent response patterns. The response rate, however, was very low at about 10%; the sample of responses is probably not representative of all students in the college and is almost certainly nonrandom. For the core technical questions, tailored to each course’s content, students completing evaluations perceived substantial enhancement in self-rated competency. Overall course and instructor ratings were favorable on average, although a large proportion of respondents left the overall instructor rating unanswered. According to student respondents, four survey items reflecting ABET 2000 criteria were emphasized to the greatest extent: the ability to apply knowledge of math, science, and engineering; the ability to design a system, component, or process; the ability to identify and solve engineering problems; and the ability to use engineering tools. Two ABET 2000 criteria were rated as least emphasized: the ability to function on teams and the understanding of professional and ethical responsibility. One benefit of allowing students to complete the survey at their own pace was the depth of thought and detail given to the open-ended questions compared with the in-class evaluation.

E. Summer 1998–1999

This summer, an improved and simplified version of the form will be used to encourage higher participation rates. An example of the format used for the core technical questions and the course and instructor assessment is shown in Table III. The ABET-related questions will be removed from the individual course surveys and will be provided as a separate survey to gauge the overall achievement of the ABET criteria by class. The ABET survey will be repeated each year to obtain feedback from each co-op cycle as well as from each class. The results can then be used to do both longitudinal and lateral assessment.
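As an illustration of what longitudinal and lateral analysis of the separate ABET survey might look like, the fragment below averages each criterion’s rating by class year so that results can be compared across years and across criteria within a year. It is a hypothetical sketch, not the committee’s actual procedure; the records, criterion labels, and 1–5 scale shown are assumptions.

```python
# Hypothetical sketch: average each ABET criterion rating by class year so
# results can be compared across years (longitudinal) and across criteria
# within a year (lateral). Data, labels, and scale are illustrative only.
from collections import defaultdict

# (class_year, criterion, rating on an assumed 1-5 scale)
records = [
    (2000, "a: apply math/science/engineering", 4),
    (2000, "d: function on teams", 2),
    (2001, "a: apply math/science/engineering", 5),
    (2001, "d: function on teams", 3),
]

sums = defaultdict(lambda: [0, 0])          # (year, criterion) -> [total, count]
for year, criterion, rating in records:
    sums[(year, criterion)][0] += rating
    sums[(year, criterion)][1] += 1

for (year, criterion), (total, count) in sorted(sums.items()):
    print(f"{year}  {criterion:40s}  mean = {total / count:.2f}")
```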


TABLE III (a) WEB FORMAT FOR THE CORE TECHNICAL STATEMENTS FOR A THIRD-YEAR ELECTROMAGNETICS COURSE, SUMMER 1998–1999

To compare the participation rates for in-class versus Web-based course evaluation, the MEM Department will conduct evaluations of half of its courses on paper and half on the Web. This term, incentives (e.g., a random drawing among student respondents for university bookstore credit or an electronic device) will also be used to increase participation. The elected student government officer for engineering, the student dean of engineering, will join the Assessment Committee this quarter both to provide student representation and to encourage higher student participation.

III. DISCUSSION

A. Successes

Data are being generated that can be used to improve the delivery of our courses and to better meet the college’s educational objectives. In the process of developing the new evaluation form, faculty throughout the college have been involved in formalizing their educational objectives and making them more apparent to the students. The evaluation forms for all of the College of Engineering courses that have been reviewed to date can be found on the Web.1 Because the evaluation is Web-based and can be accessed at any time, it is given in a manner that is convenient to students and does not take classroom time.

B. Problems

1) Response Rate:

a) Culture change: Drexel students are accustomed to doing paper-based evaluations, so a Web-based process is a major culture change.

1See http://www.ece.drexel.edu/eval.


TABLE III (Continued.) (b) WEB FORMAT FOR THE COURSE AND INSTRUCTOR ASSESSMENT, SUMMER 1998–1999

The goal of the assessment revision should be to make this process as natural as checking e-mail. This can be done through better form design, by raising student awareness of the process through announcements and publicity, and by making students partners in the CQI process by sharing the evaluation results. Faculty buy-in and support are crucial to the success of the new process.

b) Privacy: Several students commented that they would not complete a Web-based survey in which their social security number was used, because they feared retribution. These students received individual responses. Through an announcement on the evaluation login page, it was made clear that the social security number was used only to retrieve the list of courses the student was taking and was not linked to the responses.

It may take some time for student trust to develop. Our dean has charged the Assessment Committee with producing a course evaluation response rate of 80%. If this cannot be achieved through the Web, other methods will be considered, including a return to paper-based evaluations.

2) Distribution of Results: The discussion of how much, if any, of the evaluation results should be shared, with which constituencies (dean, department head, faculty, students, administration, etc.), and through what means is a sensitive one at Drexel and elsewhere [8]. Both public and private institutions are publishing evaluation results; some restrict them to on-campus distribution, while others can be read on the Internet.


Fig. 2. Simplified flowchart of the course assessment CQI process in the College of Engineering.

Drexel did publish evaluation scores in the 1970s; why the practice stopped is not known. It is difficult to see how student participation can be improved without making students full partners in the process. The College of Engineering faculty has agreed to share the averaged numerical results of the course assessments, but not the students’ comments, beginning with the Fall 1999–2000 quarter. Results will be posted to a Web page available only through the university intranet. This policy has also been adopted by Polytechnic University [9].

3) Specification of the Course Objective Questions: The course objective questions and statements come from a very diverse faculty and cover a wide variety of course subjects and types. Developing faculty skills in survey design and in wording the course objective questions will be an ongoing task.

C. Using the Survey Results

Fig. 2 shows a somewhat simplified view of how the student assessment of the course and faculty and the faculty assessment of the course and the students are combined and fed into the CQI loop. Evaluation of the assessment outcomes may lead to the conclusion that changes are needed in curriculum design, teaching methods, laboratory equipment, the advising system, or some other aspect of the program. In these cases, an action plan will be documented and put into place.

IV. SUMMARY AND CONCLUSIONS

A new course evaluation process has been established in the College of Engineering at Drexel University to meet the curricular and ABET 2000 outcomes assessment needs. This is the first time a formal, uniform, and regular course evaluation process is being implemented in the College of Engineering at Drexel. The process has been successfully implemented on the Web, and a CQI feedback loop has been initiated. Three of the major accomplishments in the assessment process are evaluation of progress toward meeting course objectives with questions specified by the teaching faculty, Web delivery of the assessment tool to students, and faculty evaluation of the course and students. While any one of these techniques may not be unique to Drexel University, the combination provides a unique and useful process for continuously improving our academic programs. We believe that sharing the evaluations on the Web will, after a short period of time, build trust between faculty and students, and that seeing the results of the CQI process in action will lead to higher survey response rates [10]. Several key challenges remain, including expanding the process to include all engineering courses, increasing the student response rate, achieving full faculty support for the process, and working out the details of Web dissemination of the results to all interested constituencies. The aim of the Assessment Committee is to continue this process and to meet all the major challenges by the end of the 1999–2000 academic year. However, we are cognizant of the fact that the process will require continuous monitoring and improvement, as dictated by established CQI principles.

ACKNOWLEDGMENT

The authors would like to thank the Gateway Engineering Education Coalition for their support; the members of the Drexel University College of Engineering Assessment Committee for their hard work; V. Adams, ECE Systems Administrator; and Dr. J. Ingham of Polytechnic University for the course evaluation software and for discussions on common evaluation problems.

REFERENCES

[1] N. Bilgutay and R. Mutharasan, “The Drexel engineering curriculum: From E4 experiment to gateway institutionalization,” in Proc. Engineering Foundation Conf.: Realizing the New Paradigm for Engineering Education, E. W. Ernst and I. C. Peden, Eds., Baltimore, MD, June 3–6, 1998, pp. 66–75.
[2] E. Fromm and R. Quinn, “An experiment to enhance the educational experience of engineering students,” Eng. Educ., pp. 424–429, Apr. 1989.
[3] E. Fromm and R. Quinn, “An enhanced educational experience for engineering students,” in Proc. Engineering Foundation Conf., M. E. VanValkenberg, Ed., Aug. 1989, pp. 15–30.
[4] R. Quinn, “Drexel’s E4 program: A different professional experience for engineering students and faculty,” J. Eng. Educ., vol. 82, no. 4, Oct. 1993.
[5] K. Scoles and N. Bilgutay, “ECE 21: A new curriculum in electrical and computer engineering,” in Proc. 29th ASEE/IEEE Frontiers in Education Conf., San Juan, PR, Nov. 1999, pp. 12b5-10–12b5-14.


[6] Evaluation Form for EID 101 Engineering Design and Problem Solving, The Cooper Union, Fall 1997.
[7] The Latest Version of the ABET Educational Criteria 2000, Criteria for Accrediting Programs of Engineering in the United States [Online]. Available: http://www.abet.org/eac/eac.htm
[8] R. E. Haskell, “Academic freedom, tenure, and student evaluations of faculty: Galloping polls in the 21st century,” Educ. Policy Anal. Archives, vol. 5, no. 6, Feb. 12, 1997.
[9] A. Polevoy and J. Ingham, “Data warehousing: A tool for facilitating assessment,” in Proc. 29th ASEE/IEEE Frontiers in Education Conf., San Juan, PR, Nov. 1999, pp. 11b1-7–11b1-11.
[10] S. S. Cook, “Improving the quality of student ratings of instruction: A look at two strategies,” Res. Higher Educ., vol. 30, no. 1, pp. 31–45, 1989.

Kevin Scoles (M’86) received the B.S. degree from Union College, Schenectady, NY, in 1977 and the Ph.D. degree from Dartmouth College, Hanover, NH, in 1982, both in physics.

He joined the faculty of the Electrical and Computer Engineering Department, Drexel University, Philadelphia, PA, in 1982 and is currently an Assistant Professor and Associate Department Head for Undergraduate Affairs. His teaching assignments involve courses dealing with solid-state physics aspects of semiconductors and devices; analog, digital, and mixed-signal circuit design and analysis; and departmental laboratory courses integrating electrical and computer engineering concepts. He is performing experimental research involving the design, fabrication, and testing of biomedical hybrid circuits.

Dr. Scoles received the ASEE Dow Outstanding Young Faculty Award (Middle-Atlantic Region) in 1987 and the Martin Kaplan Distinguished Faculty Award of the Drexel ECE Department in 1998.


Nihat Bilgutay (S’71–M’78–SM’89) received the B.S.E.E. degree from Bradley University, Peoria, IL, in 1973 and the M.S.E.E. and Ph.D. degrees from Purdue University, West Lafayette, IN, in 1975 and 1981, respectively.

He joined Drexel University in 1981, where he is currently a Professor and Head of the Electrical and Computer Engineering Department. He has taught a wide array of graduate and undergraduate courses in systems and circuits, communications, and signal processing. He is also active in the national effort to restructure and reform engineering education. He has played a leadership role in the Gateway Engineering Education Coalition sponsored by the National Science Foundation. He is currently a member of the Governing Board and serves as the Gateway Institutional Activities Leader at Drexel University. He also served as the PI for the NSF-funded Graduate Engineering Education Fellowship Program for Women and Minorities at Drexel University from 1992 to 1999.

Dr. Bilgutay is a member of ASNT, ASEE, Tau Beta Pi, Eta Kappa Nu, and Sigma Xi.

Jerene Good was born in California. She received the B.A. degree from Callison College, University of the Pacific, Stockton, CA, and the M.A. degree in peace science from the University of Pennsylvania, Philadelphia. She studied in Bangalore, Karnataka, India, and Merida, Yucatan, Mexico, as part of her undergraduate experience.

She is Director of Evaluation and Assessment for the College of Engineering at Drexel University, Philadelphia. She has 20 years of quantitative social science and survey research experience, including five years as a university director of institutional research. Her work in higher education includes statistical and data consulting for faculty and students at Princeton University and at Villanova University. She has also served as a lower- and middle-school librarian and as a sixth-grade substitute teacher.