Paper ID #14971

Measuring Student Response to Instructional Practices (StRIP) in Traditional and Active Classrooms

Mr. Kevin A. Nguyen, University of Texas, Austin
Kevin Nguyen is currently a Ph.D. student in the Science, Technology, Engineering, and Mathematics (STEM) Education department at the University of Texas at Austin. He has a B.S. and an M.Eng. in Environmental Engineering, both from Texas Tech University. As an engineering education researcher, he has worked on projects regarding self-reflection, teamwork, active learning, and participatory science communities.

Dr. Maura J. Borrego, University of Texas, Austin
Maura Borrego is Associate Professor of Mechanical Engineering and Curriculum & Instruction at the University of Texas at Austin. She previously served as a Program Director at the National Science Foundation and as an associate dean and director of interdisciplinary graduate programs. Her research awards include the U.S. Presidential Early Career Award for Scientists and Engineers (PECASE), a National Science Foundation CAREER award, and two outstanding publication awards from the American Educational Research Association for her journal articles. Dr. Borrego is Deputy Editor for the Journal of Engineering Education and serves on the board of the American Society for Engineering Education as Chair of Professional Interest Council IV. All of Dr. Borrego's degrees are in Materials Science and Engineering. Her M.S. and Ph.D. are from Stanford University, and her B.S. is from the University of Wisconsin-Madison.

Dr. Cynthia Finelli, University of Michigan
Dr. Cynthia Finelli is Associate Professor of Electrical Engineering & Computer Science, Research Associate Professor of Education, and Founding Director of the Center for Research on Learning and Teaching in Engineering at the University of Michigan. Her research areas include student resistance to active learning, faculty adoption of evidence-based teaching practices, and institutional change. She is a fellow of the American Society for Engineering Education, an Associate Editor of the IEEE Transactions on Education, and past chair of the Educational Research and Methods Division of ASEE.

Mr. Prateek Shekhar, University of Texas, Austin
Prateek Shekhar is a doctoral candidate in the Department of Mechanical Engineering at The University of Texas at Austin. His research is focused on understanding students' and faculty's reactions to the adoption of active learning based teaching methods in engineering classrooms. He holds an M.S. in Electrical Engineering from the University of Southern California and a B.S. in Electronics and Communication Engineering from India.

Mr. Robert Matthew DeMonbrun, University of Michigan
Matt DeMonbrun is a Ph.D. candidate at the Center for the Study of Higher and Postsecondary Education (CSHPE) in the School of Education at the University of Michigan. His research interests include college student development theory and teaching practices and how they relate to student learning outcomes in engineering education.

Dr. Charles Henderson, Western Michigan University
Charles Henderson, Ph.D., is a Professor at Western Michigan University (WMU), with a joint appointment between the Physics Department and the WMU Mallinson Institute for Science Education. He is the co-founder and co-director of the WMU Center for Research on Instructional Change in Postsecondary Education (CRICPE). His research program focuses on understanding and promoting instructional change in higher education, with an emphasis on improving undergraduate STEM instruction (see http://homepages.wmich.edu/~chenders/ for details). In spring 2010, he was a Fulbright Scholar with the Finnish Institute for Educational Research at the University of Jyväskylä, Finland. Dr. Henderson was a member of the National Research Council Committee on Undergraduate Physics Education Research and Implementation and is the senior editor for the journal Physical Review Special Topics - Physics Education Research.

Dr. Michael J. Prince, Bucknell University
Dr. Michael Prince is a professor of chemical engineering at Bucknell University and co-director of the National Effective Teaching Institute. His research examines a range of engineering education topics, including how to assess and repair student misconceptions and how to increase the adoption of research-based instructional strategies by college instructors and corporate trainers. He is actively engaged in presenting workshops on instructional design to both academic and corporate instructors.

Dr. Cindy Waters, North Carolina A&T State University
Dr. Cynthia Waters is an assistant professor in Mechanical Engineering who specializes in porous metals for biological and transportation applications, and in engineering education. Dr. Waters' research expertise is in the creation and characterization of metallic foams and porous metals for applications ranging from space exploration to biomedical implants. These metals display a high density to strength ratio and improved ability for energy absorption, which leads to usefulness in many applications. Dr. Waters is also known for her engineering education efforts. She has past and current NSF funding across several facets of engineering education, including: assessment studies of classroom material science pedagogical implementations; Just in Time Teaching with web-based tools in material science; case studies in material science and various engineering disciplines; and engineering faculty barriers to adopting evidence-based (or nontraditional) teaching methods. She has been invited to speak at conferences (MRS, MS&T, and ASEE) worldwide on the topic of materials science education. She serves as the College of Engineering liaison to ASEE, advises the Society of Women Engineers student chapter, and leads the students in developing and implementing yearly outreach events for the K-8 female community. She is the author of many peer-reviewed conference proceedings and journal papers in the areas of both porous metals and engineering education.

© American Society for Engineering Education, 2016

Measuring Student Response to Instructional Practices (StRIP) in Traditional and Active Classrooms

Abstract

Although some engineering instructors have adopted active learning in their teaching, many have not because they anticipate experiencing student resistance to it. This research paper explores how students resist active learning. As part of ongoing research, we have developed a survey instrument, the Student Response to Instructional Practices (StRIP) Survey, to measure undergraduate engineering students' resistance to active learning. While our other publications describe the instrument development procedures in more detail, the purpose of this paper is to understand how students in traditional and active learning classrooms respond differently to the survey questions. At a large public institution in the Southwestern United States, we studied three undergraduate engineering courses. One course in mechanical engineering served as the traditional course (n = 67 students), while the two other courses, in mechanical (n = 31) and electrical engineering (n = 53), incorporated active learning (the mechanical course used primarily individual problem solving, while the electrical course used primarily group problem solving). Using quantitative methods, we conducted a Kruskal-Wallis test to determine if there were statistically significant differences between student responses in the three courses. Post-hoc testing determined which courses were statistically significantly different from each other for each survey item. The StRIP Survey successfully differentiated the two active learning courses (individual-level and group-level active learning). The StRIP Survey data also suggested that, on average, students respond positively to in-class activities and do not appear resistant. The StRIP Survey will continue to be used in more active learning classrooms, and future work will model student resistance.

Introduction, Rationale, and Research Questions

Over the last several decades, researchers and curriculum developers in undergraduate STEM education have made enormous strides in developing knowledge about effective teaching and learning practices. For example, a recent meta-analysis of 225 studies found that, within all STEM disciplines, active learning instructional strategies consistently increase student learning and decrease failure rates [1]. Yet the presence of this body of knowledge has not translated into widespread change in teaching practices. Student resistance—and the fear of student resistance—are among the biggest barriers that instructors face when implementing active learning [2-4].

Student resistance, however, is just one possible affective response that students can have to active learning. Student reactions to teaching methods are not always negative, and the literature on student affective responses to nontraditional teaching methods is largely inconclusive [5-11]. The goal of our overarching research project is to explore the various student responses to active learning instructional practices and to contribute to an area of research that has not been well documented. In order to properly explore student responses, we developed the Student Response to Instructional Practices (StRIP) Survey to collect empirical data specifically focused on student resistance/responses. More traditional instructional practices are also included in the StRIP Survey to cover the breadth of instructional practices and their effects on student responses.

This paper reports one aspect of our overarching project and explores the preliminary results of administering the StRIP Survey in traditional and active courses. Although the StRIP Survey was developed to explore classrooms that incorporated different types of nontraditional teaching methods, the research team was also interested in the results of using the StRIP Survey in a more traditional classroom. This research paper adds to our work on validating the StRIP Survey, as it compares student responses in traditional and active learning undergraduate engineering classrooms; the data presented are drawn from a subset of the data collected to pilot test the StRIP Survey. By administering the survey in both traditional and active courses, we will better understand how accurately students use the instrument to report the type of instruction they experience, how well the instrument captures student reactions to episodes of active learning instruction, and how effectively the StRIP Survey characterizes different types of instruction. The research questions are then:

1. How does the Student Response to Instructional Practices (StRIP) Survey differentiate between traditional and various types of active learning courses?
2. What statistically significant conclusions can we make about the differences between the traditional and various types of active learning courses using the StRIP Survey?

The remainder of the paper starts with a brief literature review of student resistance with definitions of important terms. Second, an overview of the StRIP Survey development process is provided, followed by a description of the quantitative methods used to explore the research questions, an account of the research population, and an in-depth view of the StRIP Survey. Statistical significance results are then presented, followed by a discussion of the interpretation and implications of the results. Survey and study limitations are noted. Finally, the major conclusions of the paper are presented with directions and suggestions for future work.

A Review on Student Resistance

Prior research on faculty decisions about their teaching practices [12-17] has identified a number of instructor-reported barriers to the use of nontraditional teaching methods, including: (a) concerns about student resistance, (b) questions about the efficacy of the techniques, (c) concerns about preparation time, and (d) concerns about ability to cover the syllabus. This project focuses on student resistance as the barrier in most need of additional research. While faculty concerns about the efficacy of active learning methods and other nontraditional teaching methods are a legitimate barrier, this efficacy has been documented exhaustively [1, 18-19] and requires significantly less additional research. Similarly, faculty concerns about both preparation and class time have been addressed convincingly in the literature [20-23], so student resistance is an important remaining barrier to address.

Student resistance describes the negative student response or pushback to new instructional methods. According to Weimer, this student resistance takes a number of forms [24]. Passive, non-verbal resistance is characterized by an overwhelming lack of enthusiasm and may manifest as a failure to respond to instruction or refusal to participate in class activities. Students may also show resistance through partial compliance, symptoms of which include doing activities quickly in an attempt to get them over with and asking questions focused on what the instructor wants the student to do. Finally, students may show open resistance, characterized by emotional complaints, arguments, and objections.

Freeman et al. define active learning as instruction that "engages students in the process of learning through activities and/or discussion in class, as opposed to passively listening to an expert. It emphasizes higher-order thinking and often involves group work" [1]. Prior research has confirmed faculty concerns that students do not always appreciate active learning teaching methods [16]. Weimer ascribes student resistance to the fact that active learning methods generally require more work on the part of the student, cause anxiety about students' ability to succeed in a new environment, and occasionally set expectations that students are not yet prepared to meet [24]. In addition, students may exhibit resistance because they judge (incorrectly) that active learning methods lead to decreased learning. In two different studies, students perceived that they did not learn as much from group discussions [25] and problem-based learning [26] as they would have from traditional lectures, despite assessment evidence to the contrary.

Active learning instruction is typically presented in contrast to traditional instruction, which is characterized as passive, lecture-based, and instructor-focused [1, 27-29]. Traditional instruction is generally expected by students and does not push students out of their comfort zone of being passive observers during class [30]. Thus, much less student resistance should be expected in a traditional class than in an active learning class.

StRIP Survey Development

The development process for the StRIP Survey involved six different phases, three of which are discussed in this paper: item generation, validity testing, and piloting of our protocol. During the initial development process of our instrument, we conducted an extensive literature review in the areas of nontraditional instructional practices (e.g., active learning), student resistance/response, and strategies to reduce negative student response, yielding an initial survey and observation protocol for exploring the effects of different types of classroom activities on student behaviors [31].

During the validation phase, our classroom observation protocol allowed us to qualitatively explore our initial hypotheses surrounding student response to instructional practices and to determine if our survey effectively and accurately captured the phenomena we observed in the classroom. Thus, the classroom observation protocol was only used in the initial stages of our project, after which the survey instrument was designed to capture information about students' responses to instructional practices in multiple types of classrooms at various institutions. We also subjected our protocol to expert review from three prominent researchers in the fields of active learning, student resistance, instrument design, and psychometrics, and we conducted cognitive interviews with student focus groups to assess the accuracy of the responses to our instrument items. Early qualitative results from the observation protocol indicated that there are variations in student engagement levels with the type of active learning (group versus individual) implemented in the classroom [31].

Following our initial survey validation process, we piloted our protocol during the fall 2014 and spring 2015 terms. Our pilot testing included eight courses at four different institutions. Our pilot instrument asked students to describe several aspects of the instruction they experienced (to allow for documentation of the nature of the class overall) as well as their reactions to episodes of active learning instruction. The StRIP Survey was administered three times throughout the semester, but this research paper only reports on the final survey given at the end of the semester, because it includes the responses about the actual active learning experienced by students and their reactions to it. More work will be done to understand the effects of administering the StRIP Survey three different times throughout a course. For the pilot study, the research team chose both active learning and traditional instruction courses (i.e., mainly lecture-style and instructor-focused practices). The inclusion of courses with both traditional and nontraditional instructional practices was useful for two reasons. First, since our instrument was designed to capture student response (i.e., positive, negative, and neutral response), rather than simply student resistance, to instructional practices, it was important to have both types of courses included in our pilot testing so that we could conduct exploratory and confirmatory factor analyses on our constructs of student response. Second, including both types of courses would allow us to further examine the differences in students' behaviors across the different types of courses, which is the focus of this paper.

Population, Survey Instrument, and Quantitative Methods

The pilot study presented in this paper was performed at a large public institution in the southwestern United States during the spring 2015 semester. The results from just one institution are analyzed to minimize variation effects from different institutional settings. Three courses were sampled: a mechanical engineering dynamics class (n = 67), a mechanical engineering programming and computer methods course (n = 31), and an electrical engineering introduction to computing course (n = 53). The n-values represent the total number of students who completed the StRIP Survey for that course. The dynamics course served as the "traditional" lecture course. The mechanical engineering programming and computer methods course was one of the active courses, and this course was unique in that the active learning strategies used included a high amount of "individual" problem solving. The electrical engineering introduction to computing course also served as one of the active courses, and this course contained a high amount of "group" problem solving. For the remainder of the paper, these courses are referred to as "traditional," "individual," and "group." Preliminary qualitative observations of the courses verified that the courses fit these characteristics, and the course instructors also confirmed that these characteristics reflected their type of teaching. The three courses sampled are summarized in Table 1. The traditional and individual courses were predominantly taken by second-year students, while the group course was predominantly taken by first-year students (Table 1). All three courses contained predominantly men; the overall percentage breakdown is provided in Table 1.

Table 1: Summary of Engineering Courses Sampled

| Department - Course Title | Key Identifier | n | 1st Year | 2nd Year | 3rd Year | 4th Year | 5th Year | Men | Women |
|---|---|---|---|---|---|---|---|---|---|
| Mechanical Engineering Dynamics | "Traditional" | 67 | 1% | 76% | 19% | 4% | 2% | 79% | 21% |
| Mechanical Engineering Programming and Computer Methods | "Individual" | 31 | 3% | 78% | 17% | 2% | 3% | 85% | 15% |
| Electrical Engineering Introduction to Computing | "Group" | 53 | 67% | 19% | 12% | 2% | 0% | 78% | 22% |

The StRIP Survey contained multiple sections and took an average of fifteen minutes to complete. The first section focused on students' responses to instructional practices or in-class activities (Figure 1). The second section focused on how the students perceived the administration of in-class activities by the instructor (Figure 2). The third section directly asked students to rate their overall satisfaction with the course and the instructor (Figure 3). The fourth and fifth sections asked students what final grade they expected in the course and the number of courses they had already taken with in-class activities (Figure 4). The sixth section asked students to report the frequency of in-class activities and to indicate which in-class activities would comprise their ideal course (Figure 5).

Figure 1: Responses to Instructional Practices Section

Figure 2: Administration of In-Class Activities by the Instructor Section

Figure 3: Overall Course and Instructor Evaluation Section

Figure 4: Final Grade Expectations and Prior In-Class Activities Section

Students responded using paper versions of the StRIP Survey, and we tabulated the results in Microsoft Excel. Missing answers were coded as "0" and removed from statistical analysis. Answers were predominantly coded on a numerical scale from 1 to 5 for most of the survey sections. For section 4, a numerical score for the expected grade was calculated with the scheme shown in Table 2. Also for section 4, the number of prior courses with in-class activities was coded on a scale from 1 to 5, as shown in Table 3. Quantitative data analysis was performed in R version 3.2.2. In order to analyze categorical Likert data, non-parametric tests were used. We used the Kruskal-Wallis test to determine if there was a difference in distribution between any of the three courses and a post-hoc Nemenyi test with Tukey distribution to determine which courses were statistically significantly different. For the majority of the StRIP Survey results, only statistically significantly different items are presented to compare the three courses.

Figure 5: Actual and Ideal Instructional Practices Section

Table 2: Expected Letter Grade and Numerical Code Key

| Letter Grade | Numerical Code |
|---|---|
| A+ | 13 |
| A | 12 |
| A- | 11 |
| B+ | 10 |
| B | 9 |
| B- | 8 |
| C+ | 7 |
| C | 6 |
| C- | 5 |
| D+ | 4 |
| D | 3 |
| D- | 2 |
| F | 1 |

Table 3: Amount of Prior Courses with In-Class Activities Code Key

| Amount of Prior Courses | Numerical Code |
|---|---|
| Every one of my college courses | 5 |
| Almost all of my college courses | 4 |
| About half of my college courses | 3 |
| A few of my college courses | 2 |
| None of my college courses | 1 |
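To make this coding concrete, the sketch below shows how the survey responses might be recoded in R, the environment the authors report using (version 3.2.2). It is a minimal illustration under assumed names: the data file and the columns item_score, expected_grade, and prior_courses are hypothetical, not details reported in the paper.

```r
# Minimal sketch of the response coding described above; the file name and
# column names are assumptions, not the authors' actual script.
strip <- read.csv("strip_survey.csv", stringsAsFactors = FALSE)

# Missing answers were coded as "0" and removed from statistical analysis
strip$item_score[strip$item_score == 0] <- NA

# Table 2: expected letter grade mapped to the numerical codes 1 (F) through 13 (A+)
grade_levels <- c("F", "D-", "D", "D+", "C-", "C", "C+", "B-", "B", "B+", "A-", "A", "A+")
strip$grade_code <- match(strip$expected_grade, grade_levels)   # e.g., "B+" becomes 10

# Table 3: amount of prior courses with in-class activities mapped to codes 1 through 5
prior_levels <- c("None of my college courses", "A few of my college courses",
                  "About half of my college courses", "Almost all of my college courses",
                  "Every one of my college courses")
strip$prior_code <- match(strip$prior_courses, prior_levels)    # e.g., "About half ..." becomes 3
```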

Results of Administered Survey

To allow a more coherent narrative, results are presented here in a different order than they appear in the StRIP Survey.

Amount of Prior Classes Taken with In-Class Activities

To explore a possible alternative explanation of our results, we asked students how many of their prior classes included in-class activities or active learning components (Figure 4). There was no statistically significant difference between the three courses using the Kruskal-Wallis test (Χ2 = 2.64, df = 2, p = 0.2675). All three groups had a mean score of about 3 with a standard deviation of 1, indicating that students had encountered active learning in "about half of my college courses." Thus, students in all three courses had roughly the same amount of prior active learning experience.
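As a hedged illustration of the comparison just reported (and of the course comparisons in the tables that follow), the R sketch below runs a Kruskal-Wallis test and a Nemenyi post-hoc test with the Tukey distribution on simulated placeholder data. The PMCMR package is an assumption; the paper names the tests but not the package used to compute them.

```r
# Sketch of the course comparison; simulated placeholder data, not the study data.
library(PMCMR)  # assumed implementation of the Kruskal-Nemenyi post-hoc test

set.seed(1)
course <- factor(rep(c("Traditional", "Individual", "Group"), times = c(67, 31, 53)))
item   <- sample(1:5, length(course), replace = TRUE)   # one 1-5 Likert-type StRIP item

# Omnibus test: do the three courses differ in their distributions of responses?
kw <- kruskal.test(item ~ course)
print(kw)   # reports the chi-squared statistic, df = 2, and p-value, as in the tables below

# If the omnibus test is significant, locate the pairwise differences between courses
if (kw$p.value < 0.05) {
  print(posthoc.kruskal.nemenyi.test(item, course, dist = "Tukey"))
  # A course whose pairwise p-values against both other courses fall below 0.05 is
  # reported as the "post-hoc identifier" for that item (e.g., "Group" in Table 4).
}
```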

Different In-Class Instructional Practices

Table 4: Actual In-Class Activities, Statistically Significant Items Only (10 of 21 Total Items)

| In-Class Activity Item | Group Mean (SD) | Traditional Mean (SD) | Individual Mean (SD) | p-value | Χ2 | DF | Post-Hoc Identifier |
|---|---|---|---|---|---|---|---|
| Solve problems in a group during class. | 4.15 (0.69) | 2.64 (1.13) | 3.48 (0.81) | 0.000 | 53.35 | 2 | All |
| Work in assigned groups to complete homework or other projects. | 3.17 (1.19) | 2.25 (1.23) | 2.20 (1.24) | 0.000 | 17.85 | 2 | Group |
| Watch the instructor demonstrate how to solve problems. | 3.83 (1.03) | 4.30 (0.84) | 4.35 (0.98) | 0.006 | 10.31 | 2 | Group |
| Discuss concepts with classmates during class. | 3.85 (0.93) | 3.23 (0.92) | 3.35 (1.08) | 0.000 | 15.43 | 2 | Group |
| Be graded based on the performance of my group. | 3.13 (1.02) | 1.80 (1.14) | 1.81 (1.14) | 0.000 | 39.22 | 2 | Group |
| Solve problems that have more than one correct answer. | 3.96 (0.92) | 2.76 (1.28) | 3.32 (1.11) | 0.000 | 27.18 | 2 | Group |
| Solve problems individually during class. | 2.60 (0.95) | 2.53 (1.01) | 3.45 (1.06) | 0.000 | 15.92 | 2 | Individual |
| Do hands-on group activities during class. | 3.53 (1.17) | 1.92 (1.09) | 2.87 (1.26) | 0.000 | 42.34 | 2 | Traditional |
| Be graded on my class participation. | 3.81 (1.06) | 1.56 (1.04) | 3.06 (1.39) | 0.000 | 66.85 | 2 | Traditional |
| Preview concepts before class by reading, watching videos, etc. | 2.62 (1.15) | 2.14 (1.15) | 2.13 (1.23) | 0.040 | 6.45 | 2 | Weak |

We asked students to report how frequently in-class instructional activities occurred in the classroom under study (Figure 5). Table 4 presents the ten survey items for which student responses were statistically significantly different (Kruskal-Wallis test) across the three class types and provides an identifier (post-hoc Nemenyi test) to indicate where the difference occurred. For the activity "solve problems in a group during class," there is a statistically significant difference between the classes (p < 0.001, Χ2 = 53.35, df = 2), and the difference appears for all courses when compared pairwise (post-hoc Nemenyi test; p < 0.05). As illustrated in the table, eight of the other activities are significantly associated with the group, individual, or traditional course, and the last item, "preview concepts before class by reading, watching videos, etc.," shows no statistically significant differences between any two courses. Thus, these analyses verify that the StRIP Survey was able to correctly characterize the individual and group problem solving courses.

Surprisingly, the traditional course did not pick up the post-hoc identifier for the more traditional instructional practice items, such as "listen to the instructor lecture during class," "get most of the information needed to solve the homework directly from the instructor," and "watch the instructor demonstrate how to solve problems." The "listen to the instructor lecture during class" and "get most of the information needed to solve the homework directly from the instructor" items do not appear in Table 4 because they were not statistically significantly different between the three courses (p > 0.05), which means these practices occurred to a similar extent in all three courses. Overall, the group and individual problem solving courses appear to be correctly identified, but the traditional course could have picked up more items reflecting its more lecture-based nature. However, at least some amount of regular lecturing is to be expected in any type of engineering course.

Differences in Students' Ideal Course

Students were also asked to describe their ideal course (Figure 5). Only three items demonstrated statistically significant differences ("Be graded on my class participation," "Take initiative for identifying what I need to know," and "Solve problems that have more than one correct answer."), but none of them showed pairwise differences. Thus, this section is not helpful for our current analysis.

Evaluation of the Course and Instructor

Students were asked to evaluate both the course and their instructor (Figure 3). There were statistically significant differences between the three courses (Table 5). Table 5 provides the results of the two survey items. Overall, students in all three courses evaluated both the course and the instructor highly.

Table 5: Evaluation of Course and Instructor Items

| Evaluation Item | Group Mean (SD) | Traditional Mean (SD) | Individual Mean (SD) | p-value | Χ2 | DF |
|---|---|---|---|---|---|---|
| Overall, this was an excellent course. | 4.25 (0.78) | 4.03 (0.94) | 4.48 (0.81) | 0.032 | 6.894 | 2 |
| Overall, the instructor was an excellent teacher. | 4.55 (0.64) | 4.39 (0.80) | 4.81 (0.60) | 0.010 | 9.264 | 2 |

Expected Grades

One way to describe each of the courses was to ask students, "What final grade do you expect to receive in this course?" (Figure 4). The Kruskal-Wallis test indicated a statistically significant difference between the groups (Χ2 = 6.894, df = 2, p = 0.032). Table 6 presents the average expected grade by course.

Table 6: Students' Expected Grades

| Item | Group Mean (SD) | Traditional Mean (SD) | Individual Mean (SD) | p-value | Χ2 | DF |
|---|---|---|---|---|---|---|
| What final grade do you expect to receive in this course? | 10.15 [B+] (1.94) | 9.31 [B] (2.21) | 11.18 [A-] (1.21) | 0.032 | 6.894 | 2 |

Administration of In-Class Activities

In order to determine how in-class activities were administered in each course, students were asked to rate the administration of the in-class activities by the instructor (Figure 2). There were a few statistically significantly different items, and these items are listed in Table 7. However, none of these items showed significant pairwise differences.

Table 7: In-Class Activity Administration, Statistically Significant Items Only (5 of 8 Total Items)

| In-Class Activity Administration Item | Group Mean (SD) | Traditional Mean (SD) | Individual Mean (SD) | p-value | Χ2 | DF |
|---|---|---|---|---|---|---|
| Clearly explained what I was expected to do for the activity | 4.08 (0.76) | 4.33 (0.79) | 4.65 (0.55) | 0.001 | 13.325 | 2 |
| Clearly explained the purpose of the activity. | 4.00 (0.92) | 4.12 (0.93) | 4.52 (0.93) | 0.008 | 9.653 | 2 |
| Walked around the room to assist me or my group with the activity, if needed | 4.38 (0.88) | 3.69 (1.14) | 4.58 (0.72) | 0.000 | 21.659 | 2 |
| Encouraged students to engage with the activity through his/her demeanor | 4.23 (0.85) | 3.99 (1.02) | 4.58 (0.92) | 0.003 | 11.749 | 2 |
| Gave me an appropriate amount of time to engage with the activity. | 3.57 (0.84) | 3.78 (1.06) | 4.42 (0.96) | 0.000 | 18.943 | 2 |

Student Response to In-Class Activities

Finally, results from asking students how they responded to in-class activities are provided (Figure 1). Table 8 provides the statistically significant items for students' responses to in-class activities, categorized by whether the response described is positive or negative. Although these items showed overall differences between the three courses, no post-hoc identifiers were found linking any one of them to a single course, as no course was statistically significantly (p < 0.05) different from both of the other two courses. Overall, it appears that students responded positively to in-class activities, as mean scores are higher for positive-type items and lower for negative-type items. Table 9 provides the remaining response-to-in-class-activity items, which were not statistically significantly different between the three courses. Table 9 is provided to show that the general trend of positive reactions to in-class activities also carries through to the remaining items.

Table 8: Response to In-Class Activities, Statistically Significant Items Only (7 of 15 Total Items)

| Response to In-Class Activity Item | Group Mean (SD) | Traditional Mean (SD) | Individual Mean (SD) | p-value | Χ2 | DF | Response Type |
|---|---|---|---|---|---|---|---|
| I pretended but did not actually participate | 1.75 (0.85) | 2.16 (1.05) | 1.62 (0.86) | 0.018 | 8.009 | 2 | Negative |
| I talked with classmates about other topics besides the activity | 3.17 (1.45) | 2.81 (1.16) | 2.35 (1.25) | 0.029 | 7.077 | 2 | Negative |
| I felt the effort it took to do the activity was worthwhile | 3.83 (0.87) | 3.31 (0.89) | 3.71 (1.04) | 0.002 | 12.131 | 2 | Positive |
| I participated actively (or attempted to) | 4.06 (0.72) | 3.47 (1.07) | 3.58 (1.31) | 0.011 | 9.073 | 2 | Positive |
| I felt the instructor had my best interests in mind | 4.45 (0.72) | 4.18 (0.78) | 4.52 (0.77) | 0.028 | 7.136 | 2 | Positive |

Table 9: Response to In-Class Activities, Non-Statistically Significant Items Only (8 of 15 Total Items)

| Response to In-Class Activity Item | Group Mean (SD) | Traditional Mean (SD) | Individual Mean (SD) | p-value | Χ2 | DF | Response Type |
|---|---|---|---|---|---|---|---|
| I disliked the activity and voiced my objections. | 1.40 (0.72) | 1.46 (1.36) | 1.26 (0.73) | 0.358 | 2.054 | 2 | Negative |
| I focused on doing specifically what the instructor asked, rather than on mastering the concepts. | 3.17 (0.99) | 2.88 (0.96) | 3.06 (1.00) | 0.410 | 1.785 | 2 | Negative |
| I rushed through the activity, giving minimal effort | 2.31 (0.92) | 2.58 (0.94) | 2.29 (0.74) | 0.153 | 3.756 | 2 | Negative |
| I distracted my peers during the activity | 1.75 (0.87) | 1.71 (0.76) | 1.42 (0.72) | 0.101 | 4.577 | 2 | Negative |
| I felt positively towards the instructor/class | 4.19 (0.71) | 3.90 (0.92) | 4.29 (0.90) | 0.056 | 5.775 | 2 | Positive |
| I tried my hardest to do a good job | 4.04 (0.85) | 3.88 (1.04) | 4.10 (0.87) | 0.604 | 1.009 | 2 | Positive |
| I felt the time used for the activity was beneficial | 3.98 (0.91) | 3.85 (0.80) | 4.00 (1.00) | 0.406 | 1.804 | 2 | Positive |
| I enjoyed the activity | 3.60 (0.93) | 3.19 (1.09) | 3.39 (1.09) | 0.141 | 3.923 | 2 | Positive |

Discussion and Interpretation of Results

The StRIP Survey was successful in identifying the group and individual problem solving active learning classrooms. The group problem solving course was identified by five StRIP Survey in-class activity items, the most telling of which were "solve problems in a group during class," "work in assigned groups to complete homework or other projects," "discuss concepts with classmates during class," and "be graded based on the performance of my group" (Table 4). The individual problem solving course was identified by only one item: "solve problems individually during class" (Table 4). These results support the ability of the StRIP Survey to distinguish between different types of active learning classrooms, as the group problem solving course reported a higher average occurrence of group-related items, and the individual problem solving course reported a higher average occurrence of individual problem solving.

Building on this work, further analysis will combine these in-class instructional StRIP Survey items into factors such as group activities, student-led activities, passive instruction, and active instruction.

However, the StRIP Survey was not strongly able to differentiate the traditional course from the other two courses in terms of actual instructional practices. Although this may appear to be a shortcoming, the StRIP Survey was not developed for the purpose of being administered in traditional classrooms; rather, it was developed to explore student resistance to active learning in undergraduate engineering classrooms. Since the traditional course did not appear very traditional in the StRIP Survey results, we looked for additional validation that the traditional course was actually lecture-based in nature. When we asked the instructor of the traditional course about the course at the end of the semester, he re-stated that the course is mainly lecture-based, with example problems related to his own research. The instructor also made sure he learned the names of his students and encouraged them to do well. It is not surprising, then, that the students responded well to and evaluated his course highly (Table 5). In any case, it appears that the traditional course in this data set is an example of a particularly good lecturer, may not be generalizable to typical traditional courses, and did not provoke negative student responses, if any. It is also possible that our careful item wording of "when the instructor asked you to…" caused students to focus their responses on the few instances in the traditional course when the instructor used in-class activities.

The data reveal some interesting patterns about how students react to class activities in these different learning environments. In this data set, students responded positively to in-class activities in all three courses. This is an important finding for faculty concerned that students will view active learning as a departure from the instructor's teaching responsibility. Some instructors are concerned that students feel they are paying (tuition) for the instructor's expertise, not to learn from other students or to engage in learning activities they could do just as easily outside of class [24]. None of these concerns were supported by the current data set. One might note that the traditional course implemented few, if any, in-class activities and still received very positive results. It may be the case that the traditional course instructor was an excellent instructor and students responded well regardless of what he did, but these preliminary results still suggest that students do not respond negatively to, or resist, in-class activities as much as faculty members and instructors have originally thought.

Other sections of the StRIP Survey provided additional information that can be related back to student responses and the administered instructional practices. Information such as expected grades, ideal instructional practices, the number of prior classes taken with in-class activities, and how in-class activities were administered by the instructor can be correlated with how students responded to instructional practices. These correlations were not tested in this preliminary analysis, but this information will be useful in further detailed statistical analysis. Although some information can be gleaned from these sections, for example that students in the group and individual problem solving courses expected higher grades on average than students in the traditional course (Table 6), none of these sections provided compelling significant differences between the three courses when pairwise testing was used.

In terms of drawing conclusions about the three courses, the strongest differences lie in actual instructional practices. Although there are statistically significant differences between the three courses in other sections of the StRIP Survey, these differences are not strong enough to conclude that one course is much different from the other two in terms of student reactions, instructor strategies, or student characteristics. Thus, this research paper does not claim that traditional and active courses are better or worse than each other; rather, it suggests that the StRIP Survey can be used in active learning courses to differentiate types of in-class instructional practices, and it provides support for the lack of student resistance to in-class activities.

Survey Limitations

Several limitations should be acknowledged. Generalizability of the results is limited due to the relatively small sample size, the low number of courses and engineering disciplines, and the single institution setting. Although some frameworks exist for characterizing undergraduate STEM instruction, theories and instruments are not particularly well developed to support claims about the types of instruction we studied (traditional, group active, and individual active) and how representative they are of engineering instruction nationally. Similarly, the sample is not large enough to understand the effects of varying instructor rank/experience level and other course characteristics, including laboratory and recitation sections. Nonetheless, we reported results for three different courses and explored alternative explanations, which lays the foundation for future work.

The StRIP Survey is still in iterative development to refine factors describing instructor strategies, student participation, and other outcomes. Here, we analyzed and reported results for individual survey items, which might overestimate differences between the courses. More work will be done at the factor and construct level in order to explore the different modalities of student resistance and response to active learning. Collapsing multiple survey items into a factor will allow more reliable and valid statistical conclusions, and we are currently exploring four types of student responses to active learning: participation, value, emotion, and evaluation. We focused our discussion on statistically significant differences, which may ignore other important differences. Finally, we analyzed only the post-survey data; more information may be gleaned from analyzing survey data from earlier in the semester (which is ongoing).

Conclusion of Study and Future Work

Understanding and reducing student resistance and other negative responses to in-class active learning was the rationale for our project, and this pilot study allowed us to explore how the StRIP Survey performs in traditional and active classrooms. Our research data suggest that the StRIP Survey can differentiate types of active learning courses (e.g., those using mostly individual or group-based activities), but it cannot easily identify traditional lecture classes. However, since our instrument is designed for studying active learning courses, this is not critical. Our research also suggests that students responded positively to in-class activities, whether they were implemented in an active or a traditional course. This is an important finding to add to the literature on student resistance, since it indicates that students are not as resistant as previously believed. In addition, these are important findings for faculty concerned that students will view active learning as shirking teaching responsibility.

Testing the differences between traditional and active learning courses using the StRIP Survey was important for validating our future results and for understanding how students answer the StRIP Survey. The power and utility of the StRIP Survey lies in its ability to be used in active learning courses and to describe the types of responses students have to in-class instructional practices. We will continue to administer the StRIP Survey in active learning classrooms to better gauge the different types of responses in other active learning classrooms. In the next phase of our study, we plan to collect data from introductory engineering courses across 20 institutions in the US. These 20 courses will provide a diverse set of classrooms differing in several aspects, including but not limited to institution type, implemented active learning techniques, and class size. Further data analysis will use factor analysis, correlations, and regression modeling to tease out what factors influence a student's response to an in-class activity, and this type of analysis will combine courses together to try to accurately model student responses.

Acknowledgements

This project is funded by the U.S. National Science Foundation through grant numbers 1347417, 1347482, 1347580, 1347718, and 1500309. The opinions are those of the authors and do not necessarily represent those of the National Science Foundation. The authors would like to thank the instructors and students who agreed to be part of the pilot study, as well as the project advisory board members.

References

1. Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410-8415.
2. Finelli, C. J., Richardson, K. M., & Daly, S. R. (2013). Factors that influence faculty motivation to adopt effective teaching practices in engineering. Paper presented at the 2013 ASEE Annual Conference & Exposition, Atlanta, GA.
3. Froyd, J. E., Borrego, M., Cutler, S., Henderson, C., & Prince, M. (2013). Estimates of use of research-based instructional strategies in core electrical or computer engineering courses. IEEE Transactions on Education, 56(4), 393-399.
4. Henderson, C., & Dancy, M. H. (2007). Barriers to the use of research-based instructional strategies: The influence of both individual and situational characteristics. Physical Review Special Topics - Physics Education Research, 3(2), 020102.
5. Bacon, D. R., Stewart, K. A., & Silver, W. S. (1999). Lessons from the best and worst student team experiences: How a teacher can make the difference. Journal of Management Education, 23(5), 467-488.
6. Felder, R. M., & Brent, R. (2009). Active learning: An introduction. ASQ Higher Education Brief, 2(4), 15.
7. Goodwin, L., Miller, J. E., & Cheetham, R. D. (1991). Teaching freshmen to think: Does active learning work? BioScience, 719-722.
8. Hall, S. R., Waitz, I., Brodeu, D. R., Soderholm, D. H., & Nasr, R. (2002). Adoption of active learning in a lecture-based engineering class. Paper presented at the 32nd Annual Frontiers in Education Conference (FIE 2002).
9. Kvam, P. H. (2000). The effect of active learning methods on student retention in engineering statistics. The American Statistician, 54(2), 136-140.
10. Rangachari, P. (1991). Design of a problem-based undergraduate course in pharmacology: Implications for the teaching of physiology. The American Journal of Physiology, 260(6 Pt 3), S14-21.
11. Wilke, R. R. (2003). The effect of active learning on student characteristics in a human physiology course for nonmajors. Advances in Physiology Education, 27(4), 207-223.
12. Dancy, M. H., Henderson, C., Rebello, N. S., Engelhardt, P. V., & Singh, C. (2012). Experiences of new faculty implementing research-based instructional strategies. Paper presented at the AIP Conference Proceedings, American Institute of Physics.
13. Dancy, M. H., Henderson, C., & Turpen, C. (accepted). How faculty learn about and implement research-based instructional strategies: The case of Peer Instruction.
14. Finelli, C. J., Richardson, K., & Daly, S. (2013). Factors that influence faculty motivation of effective teaching practices in engineering. Paper presented at the Annual Conference of the American Society for Engineering Education (ASEE).
15. Froyd, J. E., Borrego, M., Cutler, S., Henderson, C., & Prince, M. (2013). Estimates of use of research-based instructional strategies in core electrical or computer engineering courses. IEEE Transactions on Education, 56(4), 393-399.
16. Henderson, C., & Dancy, M. H. (2007). Barriers to the use of research-based instructional strategies: The influence of both individual and situational characteristics. Physical Review Special Topics - Physics Education Research, 3(2), 020102.
17. Prince, M., Borrego, M., Henderson, C., Cutler, S., & Froyd, J. (2013). Use of research-based instructional strategies in core chemical engineering courses. Chemical Engineering Education, 47(1), 27-37.
18. Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education, 93, 223-232.
19. Prince, M. J., & Felder, R. M. (2006). Inductive teaching and learning methods: Definitions, comparisons, and research bases. Journal of Engineering Education, 95(2), 123.
20. Felder, R. M. (1992). How about a quick one? Chemical Engineering Education, 26(1), 18-19.
21. Felder, R. M. (1994). Any questions? Chemical Engineering Education, 28(3), 174-175.
22. Felder, R. M., & Brent, R. (1999). FAQs-2. Chemical Engineering Education, 33(4), 276-277.
23. Felder, R. M., & Brent, R. (2009). Active learning: An introduction. ASQ Higher Education Brief, 2(4).
24. Weimer, M. (2002). Learner-centered teaching: Five key changes to practice. John Wiley & Sons.
25. Lake, D. (2001). Student performance and perceptions of a lecture-based course compared with the same course utilizing group discussion. Physical Therapy, 81, 896-902.
26. Yadav, A., Lundeberg, M., Subedi, S., & Bunting, C. (2011). Problem-based learning in an undergraduate electrical engineering course. Journal of Engineering Education, 100(2), 253-280.
27. Hake, R. R. (1998). Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses. American Journal of Physics, 66(1), 64-74.
28. Handelsman, J., Ebert-May, D., Beichner, R., Bruns, P., Chang, A., DeHaan, R., ... Tilghman, S. M. (2004). Scientific teaching. Science, 304(5670), 521-522.
29. Meltzer, D. E., & Thornton, R. K. (2012). Resource Letter ALIP-1: Active-learning instruction in physics. American Journal of Physics, 80(6), 478-496.
30. Gaffney, J. D., Gaffney, A. L. H., & Beichner, R. J. (2010). Do they see it coming? Using expectancy violation to gauge the success of pedagogical reforms. Physical Review Special Topics - Physics Education Research, 6(1), 010102.
31. Shekhar, P., Demonbrun, M., Borrego, M., Finelli, C., Prince, M., Henderson, C., & Waters, C. (2015). Development of an observation protocol to study undergraduate engineering student resistance to active learning. International Journal of Engineering Education, 31(2), 597-609.
23. Felder, R. M., & Brent, R. (2009). Active Learning: An Introduction. ASQ Higher Education Brief, 2(4). Retrieved from 24. Weimer, M. (2002). Learner-centered teaching: Five key changes to practice: John Wiley & Sons. 25. Lake, D. (2001). Student performance and perceptions of a lecture-based course compared with the same course utilizing group discussion. Physical Therapy, 81, 896-902. 26. Yadav, A., Lundeberg, M., Subedi, S., & Bunting, C. (2011). Problem-based learning in an undergraduate electrical engineering course. Journal of Engineering Education, 100(2), 253-280. 27. Hake, R. R. (1998). Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses. American journal of Physics, 66(1), 64-74. 28. Handelsman, J., Ebert-May, D., Beichner, R., Bruns, P., Chang, A., DeHaan, R., . . . Tilghman, S. M. (2004). Scientific teaching. Science, 304(5670), 521-522. 29. Meltzer, D. E., & Thornton, R. K. (2012). Resource letter ALIP–1: active-learning instruction in physics. American journal of Physics, 80(6), 478-496. 30. Gaffney, J. D., Gaffney, A. L. H., & Beichner, R. J. (2010). Do they see it coming? Using expectancy violation to gauge the success of pedagogical reforms. Physical Review Special Topics-Physics Education Research, 6(1), 010102. 31. Shekhar, P., Demonbrun, M., Borrego, M., Finelli, C., Prince, M., Henderson, C., & Waters, C. (2015). Development of an observation protocol to study undergraduate engineering student resistance to active learning. International Journal of Engineering Education, 31(2), 597-609.