Evaluating Blended and Flipped Instruction in Numerical Methods at Multiple Engineering Schools


Renee Clark (1), Autar Kaw (2), Yingyan Lou (3), Andrew Scott (4), Mary Besterfield-Sacre (5)

(1) Department of Industrial Engineering, University of Pittsburgh, Pittsburgh, PA 15260, USA
(2) Department of Mechanical Engineering, University of South Florida, Tampa, FL 33620, USA
(3) School of Sustainable Engineering and the Built Environment, Arizona State University, Tempe, AZ 85281, USA
(4) Department of Electrical Engineering, Alabama A&M University, Normal, AL 35762, USA
(5) Department of Industrial Engineering, University of Pittsburgh, Pittsburgh, PA 15260, USA

(Received 18 January 2017; Accepted 3 October 2017)

With the literature calling for comparisons among technology-enhanced or active-learning pedagogies, a blended versus flipped instructional comparison was made for numerical methods coursework using three engineering schools with diverse student demographics. This study contributes to needed comparisons of enhanced instructional approaches in STEM and presents a rigorous and adaptable methodology for doing so. Our flipped classroom consisted mostly of in-class active learning, with micro-lectures as needed, and technology used both in and out of class, including for expected pre-class review of new content. Our blended classroom consisted mostly of lecture with some in-class active learning, and technology utilized both in and out of class. However, students were not expected to review new content before class. We compared blended vs. flipped instruction based upon multiple-choice and free-response questions on the final exam as well as the perceived classroom environment. This was done for students as a whole as well as for under-represented minorities (URMs), females, community college transfers, and Pell Grant recipients. Students provided feedback via focus groups and surveys. Upon combining data from the schools, the blended instruction was associated with slightly greater achievement on the multiple-choice questions across various demographics, but the differences were not statistically significant, and the effects were small. Our free-response final exam and classroom environment data aligned, with blended instruction showing more promise at two schools. The students identified demanding expectations with flipped instruction but pointed to benefits, such as enhanced learning or learning processes, preparation, and engagement. These results aligned with our focus group and instructor interview data. Thus, in general, it may be possible to use either instructional approach with the expectation of similar outcomes in final exam scores or the perceived classroom environment, keeping in mind that the students qualitatively identified benefits with flipped instruction. Nonetheless, there were some large differences for the schools individually, suggesting further research with different demographics.

INTRODUCTION

It can be difficult to engage students using traditional lecture; however, many educators have proposed (and research has shown) that engaged and involved students learn more and are better prepared (Novak et al., 1999; Astin, 1985; Pascarella & Terenzini, 2005; Kuh et al., 2005). Recently, educators have characterized the teaching of STEM courses using only traditional lecture as an ineffective and inferior approach (Mazur, 2009; Freeman et al., 2014; Wieman, 2014).
In addition, educators have begun calling for comparisons of active or enhanced learning methods, as opposed to using traditional lecture as the control or comparison group, given the advantages of active learning (Freeman et al., 2014; Wieman, 2014; Weimer, 2016 March 9).

When students are passive during lecture, they retain less (Novak et al., 1999). A review of research in the 1990s showed that the most effective practices require student involvement and participation, although the authors cautioned against dismissing lecture completely (Pascarella & Terenzini, 2005). Other recent studies have shown that active or interactive learners achieve significantly better results (compared to passive learners) in problem-solving, time to mastery, conceptual understanding, and exam performance (Chi, 2009; Hake, 1998; Freeman et al., 2014). Other educators have stressed that true learning occurs with "doing" and that classroom discussion leads to greater learning gains and engagement (Prince, 2004; Bonwell & Eison, 1991; Howard).

Blended learning can provide more engaging experiences by integrating technology and/or replacing some aspects of face-to-face teaching with online learning, often maintaining a traditional class format (i.e., mostly lecture) nonetheless (Garrison & Vaughan, 2008; Bourne et al., 2005; Dziuban et al., 2006). These online experiences may include simulations, labs, tutorials, and assessments (Garrison & Vaughan, 2008). Technology use often creates an active environment (Carr et al., 2015). Blended learning has the objective of "using the web for what it does best, and using class time for what it does best" (Osguthorpe & Graham, 2003, p. 227). It represents the convergence of historically separate models: face-to-face and computer-supported models that accommodate interaction (Graham, 2006). The flipped classroom, however, uses class time for active learning and interactions, with students watching lecture videos beforehand (Bergmann & Sams, 2012). Students apply concepts during class, and instructors serve as consultants (Velegol et al., 2015). Blended learning and the flipped classroom are closely related to Just-in-Time Teaching, which uses web resources for preparation and adjusts lectures to outcomes on pre-class assignments (Novak et al., 1999).

Our blended classroom consisted mostly of lecture with some group-based, in-class active learning, and technology utilized both in and out of class. This technology consisted of clickers, a continuously available discussion board, and online quizzes, videos, and textbook content. Students were not expected to review new content prior to class.

Our flipped classroom consisted mostly of in-class active learning with peer and instructor interaction and micro-lectures as needed, with the same technology (as mentioned previously) used in and out of class. However, students in the flipped classroom were expected to review new content before class via the videos or online readings. Our flipped and blended classrooms therefore combined elements of the Connectivism, Cognitive Apprenticeship, and Social Development learning theories (Siemens, 2005; Collins et al., 1989; Vygotsky, 1978). Connectivism takes into account technology and networks and the connections they enable. In our classrooms, students had digital resources and the Piazza discussion board through which they connected (Piazza, 2015). In a Cognitive Apprenticeship, students learn skills through expert guidance, as in a skilled-trades apprenticeship (Collins et al., 1989). This scaffolding is possible in the flipped classroom as the instructor circulates to assist with problem-solving. Vygotsky's Social Development Theory highlights the social, interactive, and cooperative nature of learning, another feature of our active-learning classrooms (Vygotsky, 1978).

In a preliminary study with one university, our research showed that the final exam results favored some degree of flipped instruction (either fully or semi-flipped) relative to blended instruction for numerical methods; however, trends in the classroom environment favored the blended approach (Clark et al., 2016a). The classroom environment measurement included dimensions such as student cohesiveness, student participation in class, and student interaction with the instructor, among others. Further, the second author previously compared four teaching methods, including blended and flipped, for one numerical methods topic (Kaw & Hess, 2007). Here, the flipped and blended methods had the highest final exam scores, respectively, although instructional value was rated highest by students for the blended method. Our present research aims to add to these findings and increase generalizability using two additional diverse schools. Our research is one of the few such STEM studies we are aware of.

An NSF grant enabled this research at three U.S. universities between 2014 and 2016 (Kaw et al., 2013). These universities differ, thereby adding to the generalizability. Based on the Carnegie Classification, all three are public. The University of South Florida (USF) and Arizona State University (ASU) are classified as "highest research activity" doctoral universities, with about 42,000 students at USF and 80,000 at ASU. Alabama A&M University (AAMU) is classified as a Master's college/university and an HBCU (Historically Black College/University) with about 5,000 students (Carnegie Classification, 2016). In investigating blended versus flipped instruction, our research questions were as follows:

1. Are there achievement differences when using blended versus flipped instruction for numerical methods coursework at various undergraduate institutions, and are differences evident for underrepresented minorities, females, community college transfers, and Pell Grant recipients?
2. Do students' perceptions of the classroom environment differ when using blended versus flipped instruction for numerical methods coursework at various undergraduate institutions?
3. What do students perceive as benefits and drawbacks of a numerical methods flipped classroom?

Our goal was to develop recommended practices for teaching numerical methods and other STEM courses using active, technology-enhanced approaches to potentially optimize how STEM is taught. In the following sections, we review the literature on STEM blended and flipped classrooms. We discuss our course delivery, data collection, and statistical analysis methods, followed by a comparison of the final exam results for flipped versus blended instruction for all students and various demographic groups. We provide a comparison of the methods in terms of the classroom environment and present students' perceptions of the flipped classroom.

LITERATURE REVIEW

Background on Blended and Flipped Instruction

Blended learning was featured in an instructional redesign program by the Pew Charitable Trusts (Twigg, 2003; Garrison & Kanuka, 2004). The program challenged higher education to redesign its instruction using technology, including computer-based assessments, online discussion groups and learning communities, and online tutorials. Blended learning has been advocated or implemented in the engineering disciplines represented in this study (i.e., mechanical, civil, and electrical engineering), in which online experiments, labs, simulations, and even entire programs for non-traditional students have been implemented (Cortizo et al., 2010; Restivo et al., 2009; Henning et al., 2007; Hu & Zhang, 2010; Dollár & Steif, 2009; Mendez & Gonzalez, 2010; Sell et al., 2012; Bohmer et al., 2013). Blended learning has also been implemented in courses that are foundational to numerical methods, including programming, using online automatic-feedback self-practice tools (El-Zein et al., 2009).

With flipped instruction, a recent survey of almost 1,100 faculty members showed that their top motivations for using flipped instruction were to increase student engagement (79%) and improve learning (76%) (Bart, 2015). In another recent survey, 200 instructors indicated they teach in a flipped mode because it increases interaction with students, promotes flexibility, and increases student engagement (Herreid & Schiller, 2013). This is in agreement with other sources that describe flipped instruction as increasing interaction and collaboration (Bergmann & Sams, 2012; Rosenberg, 2013). The flipped classroom has been implemented previously in a numerical methods course, in which it was compared to traditional instruction (Bishop & Verleger, 2013; Bishop, 2013). However, since active learning is gaining recognition, studies that compare active or enhanced approaches, such as ours, should be undertaken, as was done recently in a biology course (Jensen et al., 2015). The flipped classroom has been implemented in other courses for mechanical, electrical, and civil engineers, who comprised our study. Mechanical engineering courses included design, statics and mechanics, and electronics instrumentation (Dollár & Steif, 2009; Steif & Dollár, 2012; Cavalli et al., 2014; Connor et al., 2014; Papadopoulos & Roman, 2010). Electrical engineering courses included signal processing and electromagnetics (Van Veen, 2013; Furse, 2011). In civil engineering, flipped courses in structural design and engineering economic analysis have been offered (Gross & Musselman, 2015; Lavelle et al., 2015). The flipped classroom has also been implemented in math and programming courses that serve as pre-requisites for numerical methods (McGivney-Burelle & Xue, 2013; Talbert, 2014; Love et al., 2014; Souza & Rodrigues, 2015; Lape et al., 2014).

Blended and Flipped Classrooms: Results from the Literature

In comparisons of blended and traditional learning, blended learning has exhibited success. In the first round of the Pew redesign projects, five of the ten projects reported improved outcomes, while four reported equivalent achievement (Twigg, 2003). A multiple-semester comparison of face-to-face, fully online, and blended instruction showed blended to have the highest success (i.e., percent earning at least a grade of "C") (Cavanagh, 2011).

Comparisons of flipped and traditional instruction in mechanical, electrical, and civil engineering courses have shown mixed results, as has our study. For example, on a final statics concept assessment, the flipped sections scored statistically higher than the traditional sections (Papadopoulos & Roman, 2010). However, in a mechanics of materials course, there was no significant difference on a common final between the flipped and traditional sections (Thomas & Philpot, 2012). Further, while 82% in a traditional numerical methods course at North Dakota earned a C or better, just 72% did so in the flipped section (Cavalli et al., 2014). In another numerical methods course, the test gains were statistically equivalent between flipped and traditional sections (Bishop, 2013). Further examples of the mixed nature of comparisons of flipped and traditional instruction in electrical, civil, and foundational engineering (e.g., programming) courses can be found in the literature (Van Veen, 2013; Furse, 2011; Gross & Musselman, 2015; Lavelle et al., 2015; Velegol et al., 2015; Souza & Rodrigues, 2015; Lape et al., 2014).

Interestingly, in the recent 1,100-member faculty survey, only one-half (55%) saw evidence of improved learning (Bart, 2015), which coincides with the mixed results discussed in the literature.

Student perceptions of the flipped classroom have likewise been mixed in the literature, as noted previously and as seen in our study (Bishop & Verleger, 2013). Only about half (54%) of the North Dakota students preferred the flipped format (Cavalli et al., 2014). Similarly, in a flipped electronics instrumentation course, only 56% had a preference for video versus traditional lectures (Connor et al., 2014). However, in the flipped signal processing course, fewer than 10% indicated a preference for the traditional lecture by the end (Van Veen, 2013). In the structural design course, there has been increasing preference for the flipped format with each semester (Gross & Musselman, 2015). However, in the engineering economy course, survey results have indicated an increasing dislike of the flipped structure over successive semesters (Lavelle et al., 2015).

In contrast, students have generally had positive perceptions of blended learning in engineering. In a course that used a remote experiment, the students rated "deeper learning of previous knowledge" at 5.6 and "e-learning contribution for better learning quality" at 5.7 on the seven-point scale (Restivo et al., 2009). With a remote lab in a microcontrollers/robotics course, students could repeat experiments anywhere and anytime (i.e., 81% agreed) and felt more at ease than in a classical experimental environment (i.e., 66% agreed) (Sell et al., 2012).
In the introductory programming course, satisfaction with the course rose 23% after implementation of the self-practice tool (El-Zein et al., 2009).

METHODOLOGY

Data from eight sections of the numerical methods course, which were taught over a period of two years at three institutions, were collected. Four were flipped sections, and four were blended sections. ASU and AAMU each conducted one blended and one flipped section over approximately a one-year period, and USF conducted two flipped and two blended sections over an approximately two-year period. The numerical methods course is taken primarily by the following engineering disciplines at each school: mechanical (USF), chemical and civil/environmental (ASU), and electrical/computer (AAMU). It covers numerical methods for differentiation, nonlinear equations, simultaneous linear equations, interpolation, regression, integration, and ordinary differential equations.

To compare our methods, a comprehensive assessment plan consisting of direct and indirect measures was applied. We used scores from common final exams to directly compare achievement for students as a whole as well as for URMs, females, community college transfers, and Pell Grant recipients. The student's GPA, based on self-reported grades from the pre-requisite courses, was used as a covariate, or control variable, in the analysis. The free-response questions differed among the schools due to the varying majors: in order to test higher-order skills, the instructors had to cater to physical applications within each discipline. The research team member serving as the assessment analyst also conducted pre- and post-flip interviews with the instructors. In addition, the students were indirectly assessed for their perceptions of the benefits and drawbacks of flipped instruction using classroom environment and evaluation surveys and focus groups. We will first discuss the methods used to develop and deliver the courses.

Course Delivery Methods and Student Participants

The delivery of the course was kept very similar across the institutions. Table 1 provides a description of the implementation at USF; the implementations at ASU and AAMU were very similar, with any notable differences explained below. The blended version involved in-class lecture and clicker quizzes to assess concepts. This coincides with the supplemental blended model, which retains the structure of the traditional class but adds technology (Twigg, 2003). After class, there were online auto-graded quizzes, problem sets, and programming projects. The Piazza online discussion board was available continuously for quick feedback from the instructor, TA, and students.

In the flipped version, students prepared before class with videos or readings, online auto-graded quizzes, and an essay response about the most difficult or interesting concepts. At ASU, students also completed auto-graded coding practice examples using MATLAB's Cody Coursework before class. In addition, at all three schools, Piazza was used in the flipped classroom, and during class, clickers and micro-lectures based on the pre-class quiz and essay were employed. Also, students worked on exercises or problems with their peers, and the instructor was available for support. After class, students took online, auto-graded quizzes and completed programming projects and possibly problem sets.

Table 1. Comparison of Blended & Flipped Delivery Methods in this Study

Pre-class
- Blended: Study pre-requisite material via videos for one-half of the course topics. Continuous access to open courseware & Piazza discussion board.
- Flipped: Study topic via textbook or video lectures. Continuous access to open courseware & Piazza discussion board. Automatically graded quiz (due 3 hours before class). Essay question on most difficult or interesting concept from the topic (due 3 hours before class).

In-class
- Blended: Clicker quiz in half of class sessions to gauge conceptual understanding (no/low stakes); fewer questions presented vs. in the flipped class. Mostly lecture with active learning components (e.g., two-way questioning, clickers, short exercises with peer interaction); some graded.
- Flipped: Clicker quiz in every class session to gauge conceptual understanding (no/low stakes). Micro-lectures based on pre-class quiz and responses to essay question. Short exercises or outline-the-solution problems with peer interaction and instructor help; some graded.

Post-class
- Blended: Automatically graded quizzes (due before next class). Problem set of 6 questions; not graded. Graded programming projects analyzing experimental data. Some in-class exercises assigned as homework; some graded.
- Flipped: Automatically graded quizzes (due before next class). Problem set of 6 questions; not graded. Graded programming projects analyzing experimental data. Some in-class exercises assigned as homework; some graded.

The videos were created during previous NSF-funded open courseware development known as Holistic Numerical Methods (Kaw et al., 2012; Kaw & Yalcin, 2012; Owens et al., 2012; Kaw & Garapati, 2011; Kaw et al., 2004). The videos can be accessed at http://mathforcollege.com/nm/videos/index.html (HNM, 2015). Finally, the instructors' goals in flipping their courses are shown in Table 2.

In total, there were 273 students enrolled in the blended sections and 233 enrolled in the flipped sections, for 506 total students between 2014 and 2016. The percentages of enrolled students for whom we had both final exam and demographic data to perform our analysis were as follows: for the flipped classes, 75%, 78%, and 85% at USF, ASU, and AAMU, respectively; and for the blended classes, 73%, 93%, and 72%, respectively. In total, there were 215 students in the blended classes and 180 students in the flipped classes for whom we had both final exam and demographic data for analysis. Our sample covered sophomores through seniors in multiple engineering disciplines, with approximately 21% female. Additional demographic characteristics can be determined based on the sample sizes in Table 9.

Table 2. Instructor Goals with Flipping

USF
- Promote higher-order Bloom's skills, metacognitive skills, and responsibility for learning

ASU
- Improve learning, in particular, programming confidence and skills
- Conduct hands-on activities with questions and answers in a low-stress environment
- Introduce in-class group work on formulation of problems

AAMU
- Improve programming skills, with attention to detail and real-world implementation
- Introduce in-class project and hands-on work

Assessment of Learning

Direct assessment of learning based upon the final exam was used to investigate our first research question comparing achievement with blended versus flipped instruction. The final exam contained 14 multiple-choice questions that were identical across the schools and instructional methods. The multiple-choice questions tested the lower-level skills in Bloom's taxonomy (Wiggins & McTighe, 2005).
In addition, there were four open-ended, free-response questions that remained the same from semester to semester for each school, although they varied among the schools. These were intended to measure the higher-level skills.

Using the multiple-choice and free-response results, we compared the methods using an analysis of covariance (ANCOVA), with the pre-requisite GPA as the covariate, or control variable. This was done for each school as well as for the combined data. We analyzed the data in a stratified fashion, comparing the methods for those demographic segments of interest. For example, we were interested in questions such as: "For females, which method is associated with the best outcomes?" Given this granularity, the sample sizes were sometimes small, reducing power to detect statistically significant results (Ellis, 2010). Given the small samples for some of our comparisons, we also ran the non-parametric version of ANCOVA, known as Quade's test (Quade, 1967; Lawson, 1983). The p-values based on the parametric and non-parametric analyses were generally in agreement, and examining both served to corroborate the results. Nonetheless, we defaulted to the non-parametric result with small sample sizes. These analyses were conducted using SPSS 21. The pre-requisite GPA was based on self-reported grades from calculus 1/2/3, ordinary differential equations, introductory programming, physics 1, and/or linear algebra, depending on the school. The calculus courses covered differential, integral, and 3D vector calculus as well as series and sequences.

Because of the large number of statistical tests for each set of data, we applied Bonferroni's correction (Perneger, 1998; Bland & Altman, 1995). When a large number of tests are conducted, some will, unfortunately, result in p < 0.05 just by chance (McDonald, 2014). With Bonferroni's correction, the α-level for each individual test is set at 0.05/m, where m is the number of tests run. Alternatively, the observed p-value can be adjusted by multiplying it by the number of tests run and comparing this to α = 0.05, as was done in this study. This correction has the disadvantage that the interpretation of a result is dependent on the number of other tests run (Perneger, 1998). We present this information so the reader will be informed when interpreting our results. We also calculated effect sizes based on Cohen's d (Sullivan & Feinn, 2012; Kotrlik et al., 2011). The effect size is a measure of practical or substantive significance. As discussed in the articles above, the p-value and the effect size should both be reported in order to depict the complete picture. A prominent publication manual also advises including both the p-value and the effect size (American Psychological Association, 2010). We used Cohen's thresholds to identify small, medium, and large effect sizes, respectively: d = 0.20, d = 0.50, and d = 0.80 (Cohen, 1987; Salkind, 2010). For adjusted means, we calculated adjusted effect sizes (Huck, 2012). SPSS adjusts the means using the mean of the covariate (Norusis, 2005).
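To make these analysis steps concrete, the sketch below shows how an ANCOVA-style comparison with a GPA covariate, a Bonferroni adjustment, and a Cohen's d effect size could be computed. It is a minimal illustration only, not the authors' SPSS 21 procedure, and it omits Quade's non-parametric test and the adjusted effect sizes; the data frame and column names (score, method, prereq_gpa) are hypothetical.

```python
# Minimal sketch (not the authors' SPSS 21 workflow): ANCOVA of final-exam
# score on instructional method with pre-requisite GPA as the covariate,
# Bonferroni adjustment of p-values, and Cohen's d on the raw group means.
# The data and column names (score, method, prereq_gpa) are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.multitest import multipletests

def ancova_p(df):
    """p-value for the blended-vs-flipped term, controlling for prereq GPA."""
    model = smf.ols("score ~ C(method) + prereq_gpa", data=df).fit()
    return model.pvalues["C(method)[T.flipped]"]

def cohens_d(a, b):
    """Cohen's d using a pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * np.var(a, ddof=1) + (nb - 1) * np.var(b, ddof=1)) / (na + nb - 2)
    return (np.mean(a) - np.mean(b)) / np.sqrt(pooled_var)

# Hypothetical data: one row per student.
df = pd.DataFrame({
    "score":      [72, 85, 64, 90, 78, 81, 69, 88, 75, 83],
    "prereq_gpa": [3.0, 3.6, 2.8, 3.9, 3.2, 3.4, 2.9, 3.8, 3.1, 3.5],
    "method":     ["blended"] * 5 + ["flipped"] * 5,
})

raw_p = [ancova_p(df)]                         # the study ran m such tests per data set
adj_p = multipletests(raw_p, alpha=0.05, method="bonferroni")[1]
d = cohens_d(df.loc[df.method == "blended", "score"],
             df.loc[df.method == "flipped", "score"])
print(f"Bonferroni-adjusted p = {adj_p[0]:.3f}, Cohen's d = {d:.2f}")
```

Multiplying each p-value by the number of tests m, as above, is equivalent to comparing the raw p-value to 0.05/m, and d ≈ 0.2, 0.5, and 0.8 mark the small, medium, and large thresholds described in the text.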

To directly assess achievement in a stratified manner, we developed a demographics survey to be used in conjunction with the final exam. It consisted of questions regarding gender, race/ethnicity, Pell Grant status, transfer status, and grades in pre-requisite courses, which were used to calculate a pre-requisite GPA to serve as a control variable. The students were asked to provide a personal code when completing this survey, which allowed us to match each student's final exam performance with his/her demographic characteristics. The demographic segments of particular interest within our research were the following:

1. Underrepresented minority (URM): {yes, no}
2. Pell Grant recipient: {yes, no}
3. Transfer status: {admitted to engineering as freshmen, transferred to engineering from a community college with an Associate's degree, other transfer students}
4. Gender: {male, female}

The underrepresented minority students consisted of Hispanic, American Indian, Black/African American, or Hawaiian/Pacific Islander students. The "other" transfer students consisted of internal transfers to the engineering school, community college transfers without Associate's degrees, and transfers from external four-year programs. The Pell Grant Program provides need-based grants to low-income undergraduates (Federal Pell Grant Program, 2015).

Classroom Environment Survey

We used the College and University Classroom Environment Inventory (CUCEI) to investigate our second research question about perceptions of the learning environment with blended versus flipped instruction (Fraser & Treagust, 1986). This reliable inventory evaluates seven psychosocial dimensions of the classroom, as shown in Table 3, and has been used previously in flipped classroom research (Strayer, 2012; Clark et al., 2014a). Several of the dimensions are typical goals of the flipped classroom, including student cohesiveness, individualization, innovation, involvement, and personalization. There are seven questions per dimension on a 1 to 5 scale, with 5 being most desirable. An average score for the dimension was calculated for each student, which was used to test for differences by dimension. Specifically, we ran an independent-samples t-test for each dimension. In the case of one school, the sample sizes were small, so we also ran the non-parametric Mann-Whitney test (Norusis, 2005). We distributed the CUCEI during the last week of class and collected the data anonymously to enable the most comprehensive and honest responses.

Table 3. CUCEI Dimensions

Student Cohesiveness: Students know & help one another
Individualization: Students can make decisions; treated individually or differentially
Innovation: New or unusual class activities or techniques
Involvement: Students participate actively in class
Personalization: Student interaction w/ instructor
Satisfaction: Enjoyment of classes
Task Orientation: Organization of class activities
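As a concrete illustration of the scoring and testing just described, the sketch below averages each student's seven item ratings for a single dimension and then compares the blended and flipped groups with an independent-samples t-test and, as a small-sample check, the Mann-Whitney test. It is a hypothetical sketch with made-up ratings and column names, not the CUCEI instrument or the authors' exact analysis.

```python
# Minimal sketch (hypothetical ratings, not the CUCEI instrument itself):
# average the seven 1-5 items of one dimension per student, then compare
# blended vs. flipped with a t-test and a Mann-Whitney U test.
import pandas as pd
from scipy import stats

# One row per student; involvement_1..involvement_7 are 1-5 item ratings.
survey = pd.DataFrame({
    "method": ["blended"] * 4 + ["flipped"] * 4,
    **{f"involvement_{i}": [3, 4, 5, 4, 4, 5, 5, 3] for i in range(1, 8)},
})

item_cols = [f"involvement_{i}" for i in range(1, 8)]
survey["involvement"] = survey[item_cols].mean(axis=1)  # per-student dimension score

blended = survey.loc[survey.method == "blended", "involvement"]
flipped = survey.loc[survey.method == "flipped", "involvement"]

t_stat, p_t = stats.ttest_ind(blended, flipped)           # parametric comparison
u_stat, p_u = stats.mannwhitneyu(blended, flipped,
                                 alternative="two-sided")  # non-parametric check
print(f"t-test p = {p_t:.3f}; Mann-Whitney p = {p_u:.3f}")
```

In practice, this comparison would be repeated once for each of the seven dimensions in Table 3.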
Flipped Classroom Evaluation Survey and Student Focus Groups

A flipped classroom evaluation survey and student focus groups were used to investigate our third research question about the benefits and drawbacks of flipped instruction. We employed many of the questions used by Zappe, Leicht, and colleagues, who used perception surveys in a flipped engineering course (Zappe et al., 2009; Leicht et al., 2012). In addition, we expanded upon their questions given our specific interests. A complete copy of our survey can be found in an earlier publication (Clark et al., 2016a). As with the CUCEI, we distributed the evaluation survey during the last week of class and collected the data anonymously. We also asked two open-ended questions on
