An Analysis of the Relation Between Student Usage and Course Outcomes


An analysis of the relation between student usage and course outcomes for MyLab Math and MyLab Foundational Skills

Pearson Global Product Organization
Efficacy & Research
Impact Evaluation

Contents

Executive summary
  Product descriptions
  Intended outcomes
  Research focus and research questions
  Key findings
  Recommendations
  Next steps
Introduction
  Overview of foundational research
  Description of courseware products
  Intended outcomes
  The present study
Method
  DUA and data transfer procedures
  Data linking procedures
  The Pearson usage data
  Analytical samples
Results
  Analysis of courseware usage and performance
  Predicting course outcomes based on courseware usage and performance
Conclusion
  Courseware usage and performance trends by institution
  Discussion
References
Appendix A: Data audit findings
Appendix B: Data processing details
  Data files
  Data cleaning
  Data merging
Appendix C: Model descriptives of courseware usage and performance
Appendix D: Correlation and variance inflation factors
Appendix E: Logistic regression accuracy tables
Appendix F: Full model tables

Executive summary

Product descriptions

MyLab Math [1] (2014-15) is an online tutorial and assessment tool for teaching and learning mathematics. It is designed to provide engaging experiences and personalized learning for each student so that all students can succeed. The homework, quizzes and tests include immediate feedback when students enter answers, which research indicates strengthens the learning process (Bangert-Drowns, Kulik, Kulik, & Morgan, 1991; Hattie, 2009; Hattie & Timperley, 2007; Sadler, 1989). MyLab Math automatically tracks students' results and includes item analysis to track class-wide progress on specific learning objectives. QuizMe was also part of the MyLab Math Study Plan, which for some students provided a faster path through the course. For example, if students passed the QuizMe knowledge checks, they could skip some of the Study Plan's practice exercises.

MyLab Foundational Skills [2] (2014-15) is an online mastery and competency-based resource. It is used for assessing and remediating college and career readiness in reading, writing, mathematics, study skills, and digital literacy. MyLab Foundational Skills does this by first applying a diagnostic assessment to identify students' strengths and weaknesses. By engaging in homework, quizzes and tests, students are able to master skills at their own level, working at their own pace.

The versions of MyLab Math and MyLab Foundational Skills analyzed in this study also had adaptive learning resources provided by an outside vendor, which could be activated to support personalized learning; however, as the outside vendor's usage data was not interpretable for this study, the efficacy of these adaptive features of MyLab Math and MyLab Foundational Skills was not analyzed.

Intended outcomes

MyLab Math: one of the greatest challenges that colleges in the United States face is that many students enter unprepared to complete college-level mathematics courses.
Most colleges have a sequence of developmental mathematics courses that start with basic arithmetic and then go on to pre-algebra, elementary algebra, and finally intermediate algebra, all of which a student must complete and pass before enrolling in a credit-bearing mathematics course. MyLab Math is designed to provide students with a positive, personalized learning experience that will help them develop a beneficial mind-set in math, so that they can achieve the prerequisite math skills that will enable them to successfully complete credit-bearing mathematics courses.

MyLab Foundational Skills: each course offers comprehensive content, including assessment, instruction, practice and post-assessment. These may be used as is, or customized to the specific objectives of a program. MyLab Foundational Skills uses diagnostic assessment to generate a personalized learning path that supports curriculum and skills mastery. The adaptive learning path allows students to learn at a level and pace that is aligned with their individual needs, with the ultimate goal of an improved learning experience and higher achievement for better overall outcomes.

[1] When this study was carried out, MyLab Math was known as MyMathLab. For consistency, we refer to the product by its current name throughout this report.
[2] When this study was carried out, MyLab Foundational Skills was known as MyFoundationsLab. For consistency, we refer to the product by its current name throughout this report.

Research focus and research questions

This study aimed to deepen understanding of the relation between courseware activity usage and course outcomes. Such research has the potential to provide educators and courseware developers better evidence on how to optimize implementation of courseware technologies. This technical report presents findings from an analysis of courseware usage data from these two Pearson products, MyLab Math and MyLab Foundational Skills, in two contrasting college settings (a 4-year and a 2-year institution) and two different subject areas (mathematics and English language arts).

To develop this report, researchers from SRI International leveraged course outcome data collected from 2014 through 2015 from two college campuses that participated in the Adaptive Learning Market Acceleration Project (ALMAP), sponsored by the Bill & Melinda Gates Foundation.

They also gathered new Pearson data, which included metrics of courseware usage (e.g., hours per task and attempts per task) and performance (e.g., scores per task, and learning objectives attempted and mastered). These metrics were gathered for homework, test and quiz courseware activities. In the case of MyLab Math only, QuizMe activity data was also collected. QuizMe activities are knowledge checkpoints that, if passed, permit students to skip assigned practice exercises in their Study Plans. Mastery of objectives in MyLab Math was based on aggregated data from the Study Plan, specifically both the practice and QuizMe activities. Mastery of objectives in MyLab Foundational Skills was based on aggregated data from homework, quizzes, and tests at a percentage threshold (e.g., 70% correct or 80% correct) set by the instructor.

For both institutions and both Pearson products, this report presents descriptive statistics and inferential statistical models that used the courseware usage and performance data to predict course grades and course completion (i.e., passing a course).
The models controlled for student background characteristics commonly used in education research, including gender, ethnicity, Pell status, enrollment status (full time or part time), and measures of student prior achievement or a proxy, when available. Since analyses revealed a high correlation among the courseware activity variables, analysts consulted with Pearson to select the variables of interest to include in the models reported here.

The study addressed the following research questions:

1. What were the trends in the students' use of and performance in the courseware?
2. Controlling for student demographic and prior achievement variables, is student courseware use and performance associated with course outcomes?

Key findings

The results of the present study were as follows:

Courseware usage and performance trends by institution

During each of two 17-week academic periods at Arizona State University (ASU), each student spent an average total of 32 hours in the MyLab Math courseware. Most activity was spent in QuizMe, the automated quiz activity in the MyLab Math Study Plan. QuizMe checks knowledge either (1) after students engage in practice activities, or (2) before students engage in practice, to document competency and permit skipping redundant practice activities. SRI did not have any usage data from Study Plan practice activities, which are distinct from homework activities. Nearly two-thirds of ASU students did not attempt homework activities in MyLab Math. On average, ASU students made about 80 attempts over the full course in QuizMe. Across the three primary activity types for which SRI had usage and performance data for ASU students — quizzes, QuizMe quizzes, and tests — the average

performance score was 69%. On average, students attempted 52 learning objectives and mastered 51 of them, based on data from the Study Plan's QuizMe quizzes.

Over each of Rio Salado's three 13-week academic periods, each student spent an average total of 18.8 hours in MyLab Foundational Skills courseware. On average, Rio Salado students made about 103.6 attempts at homework assignments over the full course. On average, Rio Salado students made about 32.89 test attempts per academic term. The average Rio Salado student's score across the three primary activity types — homework, quizzes, and tests — was 91%. On average, students attempted 229 learning objectives, but mastered only about 124 of them, based on data aggregated across homework, quizzes, and tests.

Predicting course outcomes from usage data

Controlling for the selected student background characteristics, several courseware usage and performance variables significantly predicted the two course outcomes at each institution: course grades and completion. In exploring the usage data, however, SRI discovered many high correlations and multicollinearity issues that prevented full use of all activity types and usage metrics (e.g., hours and attempts) in our predictive models (for details, see Appendix D). For ASU, we were unable to include both hours and attempts because of multicollinearity issues. In consultation with Pearson, we chose to use attempts as our preferred usage variable. Also, in the case of ASU, we included all activity types except homework because too few students used those activities in the courseware. For Rio Salado, the activity types were so highly correlated that we were able to include only one activity type.
In consultation with Pearson, we chose to use homework as our preferred activity variable.

MyLab Math at ASU:

Course grades: the model showed that three usage and performance trends — increased attempts on quizzes and tests, increased average scores on quizzes and tests, and increased learning objectives mastered — were associated with significantly higher course grades, controlling for the selected student-level background characteristics. However, having a greater number of MyLab Math QuizMe attempts and achieving better scores in QuizMe quizzes in the Study Plan were associated with statistically significant lower course grades, when controlling for student-level background characteristics. (See Table 1 for a visual summary.)

Course completion (i.e., passing the course): the models showed that three usage and performance trends — increased attempts in quizzes and tests, increased average scores on quizzes and tests, and increased learning objectives mastered — were associated with increased likelihood of students completing a course, when controlling for the selected student-level background characteristics. In addition, female students who made more test attempts, scored higher on the tests, or mastered more objectives were more likely to pass their courses than males. Male students who made more quiz attempts were more likely to pass their courses than females. Finally, one negative association was found between performance in the courseware and course completion: increases in average QuizMe scores were negatively associated with completing the course. (See Table 2 for a visual summary.)
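The collinearity screening that preceded these models, which led us to drop overlapping usage metrics (e.g., hours when attempts were included), can be illustrated with variance inflation factors (VIFs). The sketch below uses synthetic data and hypothetical metric names, not the study's data; a VIF well above the conventional 5-10 range signals that a predictor is nearly a linear combination of the others.

```python
# Variance inflation factor (VIF) screen for collinear usage metrics.
# Synthetic data; "attempts", "hours", and "score" are illustrative names.
import numpy as np

def vif(X):
    """VIF for each column of X: 1 / (1 - R^2), where R^2 comes from
    regressing that column on the remaining columns plus an intercept."""
    n, p = X.shape
    out = []
    for j in range(p):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(n), others])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2 = 1 - resid.var() / y.var()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

rng = np.random.default_rng(0)
attempts = rng.poisson(80, size=500).astype(float)
hours = 0.4 * attempts + rng.normal(0, 1.0, size=500)  # nearly collinear with attempts
score = rng.uniform(40, 100, size=500)                 # unrelated to the other two
X = np.column_stack([attempts, hours, score])
print(vif(X))  # attempts and hours inflated; score stays near 1
```

A screen like this motivates keeping only one of a collinear pair (here, attempts rather than hours) before fitting the predictive models.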

Table 1: relation of MyLab Math activity attempts, scores, and learning objectives mastered to ASU course grades

  Number of attempts: Quiz Positive; QuizMe Negative; Test Positive
  Score: Quiz Positive; QuizMe Negative; Test Positive
  Number of objectives mastered: Positive (single measure, based on Study Plan data)

Positive association: higher values for factor linked significantly with higher course grades.
Negative association: higher values for factor linked significantly with lower course grades.
No significant association: factor unrelated to course grade.

Table 2: relation of MyLab Math activity attempts, scores, and learning objectives mastered to ASU course completion (passing)

  Number of attempts: Quiz Positive; QuizMe No significant association; Test Positive
  Score: Quiz Positive; QuizMe Negative; Test Positive
  Number of objectives mastered: Positive (single measure, based on Study Plan data)

Positive association: higher values for factor linked significantly with higher probability of passing the course.
Negative association: higher values for factor linked significantly with lower probability of passing the course.
No significant association: factor unrelated to probability of passing the course.

MyLab Foundational Skills at Rio Salado:

Course grades: both a higher number of homework attempts and a higher number of learning objectives mastered, based on aggregate data from homework, quizzes and tests, were associated with statistically significant higher course grades, when controlling for the selected student-level background characteristics. There was a negative association between attempting learning objectives and course grades (Table 3).

Course completion: the model results with course completion as the outcome variable mirror those for course grades. Making more homework attempts and mastering more learning objectives within the courseware were associated with a higher likelihood of completing courses, after controlling for the

selected student-level background characteristics. However, we also found that making more attempts to master learning objectives was associated with a lower likelihood of completing the course (Table 4).

Table 3: relation of MyLab Foundational Skills homework time spent, attempts, scores, and learning objectives attempted/mastered to Rio Salado course grades

  Type of assignment: Homework
  Time spent: No significant association
  Number of attempts: Positive
  Score: No significant association
  Number of objectives attempted: Negative
  Number of objectives mastered: Positive

Positive association: higher values for factor linked significantly with higher course grades.
Negative association: higher values for factor linked significantly with lower course grades.
No significant association: factor unrelated to course grade.
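The course-completion results summarized above come from logistic regressions of a pass/fail indicator on courseware usage metrics plus student background controls. A minimal sketch of that model form, using synthetic data and hypothetical variable names rather than the study's actual data:

```python
# Logistic regression of a pass/fail outcome on a usage metric plus a
# background control, fit by Newton-Raphson. Synthetic, illustrative data.
import numpy as np

def fit_logistic(X, y, iters=25):
    """Returns coefficients for [intercept, *features] via Newton-Raphson."""
    A = np.column_stack([np.ones(len(y)), X])
    beta = np.zeros(A.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-A @ beta))     # predicted pass probabilities
        W = p * (1 - p)                     # observation weights
        H = A.T @ (A * W[:, None])          # Hessian of the log-likelihood
        g = A.T @ (y - p)                   # gradient
        beta += np.linalg.solve(H, g)
    return beta

rng = np.random.default_rng(1)
n = 2000
hw_attempts = rng.poisson(100, n).astype(float)   # hypothetical usage metric
full_time = rng.integers(0, 2, n).astype(float)   # hypothetical enrollment control
true_logit = -3.0 + 0.03 * hw_attempts + 0.5 * full_time
passed = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(float)

beta = fit_logistic(np.column_stack([hw_attempts, full_time]), passed)
print(beta)  # the attempts coefficient is recovered as positive
```

In the study's actual models the control set was larger (gender, ethnicity, Pell status, enrollment status, prior achievement), but the model form is the same.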

Table 4: relation of MyLab Foundational Skills homework time spent, attempts, scores, and learning objectives attempted/mastered to Rio Salado course completion (passing)

  Type of assignment: Homework
  Time spent: No significant association
  Number of attempts: Positive
  Score: No significant association
  Number of objectives attempted: Negative
  Number of objectives mastered: Positive

Positive association: higher values for factor linked significantly with higher probability of passing the course.
Negative association: higher values for factor linked significantly with lower probability of passing the course.
No significant association: factor unrelated to probability of passing the course.

Recommendations

We provide separate recommendations for each courseware product.

Consistent with past studies of MyLab Math, the findings suggest that quiz and test scores in the courseware are related to higher grades in a college-level algebra class and a higher probability of passing the course. The study also found that most ASU students were not engaged with the courseware's homework assignments, but were instead using the Study Plan tool that features practice and QuizMe quizzes. The data indicates that students, on average, achieved mastery of nearly all the courseware learning objectives based on data from QuizMe quizzes, a trend associated with positive course grades and passing the course.

The results show negative relations for both QuizMe attempts and QuizMe scores with course grades. Without more detailed usage data focused on behaviors associated with productive persistence, this outcome cannot be interpreted definitively. However, we speculate that it likely stems from student efforts to "game" the Pearson courseware rather than engage in productively persistent learning activity. For example, a high average number of QuizMe attempts combined with low course outcomes is consistent with students who skip practice activities, and instead repeatedly take guesses at QuizMe quizzes until they achieve a passing score.
This behavior drives up the number of QuizMe attempts. However, students have not necessarily learned the material, which shows in lower course grades. On the other hand, obtaining higher QuizMe scores and low course outcomes may stem from two types of behavior: one consistent with gaming the system and one consistent with productive persistence. Students attempting to game the system may pursue a limited number of learning objectives and achieve high average scores on those few QuizMe quizzes, but have not covered enough material to do well in the course. However, more persistent students may attempt a higher number of

learning objectives and achieve a lower average QuizMe score, but have covered sufficient material to do well in the course.

With respect to MyLab Foundational Skills, the findings suggest that more homework practice in the courseware and more courseware learning objectives mastered are both associated with higher grades in two developmental writing courses and a higher probability of passing these courses. However, the study also found that Rio Salado students attempted nearly twice as many learning objectives as they mastered. Without more data on how the courseware was implemented in these online classes, we cannot interpret these findings definitively.

Next steps

There were several limitations to this study. In the models, we attempted to control for any bias that could be introduced by students' background characteristics and prior skill level by including measures of those characteristics common to educational research (e.g., gender, ethnicity, Pell grant status, full- or part-time enrollment status) and incoming skill level. Despite these controls, these measures probably did not capture all the possible confounding factors that might influence use and course outcomes, such as student motivation, family support, and prior learning experiences with technology. As a result, while the results of these analyses can help indicate whether a relation between use and learning outcomes exists, they cannot be used to establish with certainty whether product use caused better student learning outcomes. There are multiple plausible explanations for any of the reported associations.
Thus, the findings associated with these analyses should be treated as exploratory and positive associations as promising, but not definitive, evidence of a causal connection between product use and improved learning and skill development.

In addition, the samples at each campus in this study were smaller than those in the original ALMAP study because, for these analyses, researchers needed to match students in the ALMAP sample with their Pearson courseware usage data. For a variety of reasons explained in detail in the report, we could not match data in many cases in the two data sets. Thus, the original ASU sample of 2,475 was reduced to an analytical sample of 1,570, and the original Rio Salado sample of 964 was reduced to 327 students. The resulting student samples varied demographically across the two institutions. ASU students were evenly split between men (46%) and women (54%), were mostly White and Asian (61%), were full-time students (95%), and less than a third relied on Pell grant financial aid. In contrast, Rio Salado students were mostly women (63%), were more representative of diverse races/ethnicities (48% White or Asian; 43% other populations), were enrolled mostly part time (73%), and more than half relied on federal Pell grant assistance.

Other limitations were that not all instructors participated in the ALMAP surveys, and those surveys did not focus specifically on elements of the MyLab Pearson products. However, some of those instructor survey items did shed light on specific courseware implementation challenges. For example, ASU instructors noted that students "rushed through" the courseware content and focused on "getting the points", rather than learning deeply. The Rio Salado instructors said that they had difficulty importing grades from MyLab Foundational Skills into their online grading system, inserting customized writing assignments into the courseware, and providing feedback to students.
They also described their students as not being "savvy" to the system and failing to find required writing assignments in it. All faculty respondents at both campuses noted that they could track individual and class progress using the two courseware products.

Overall, the findings indicate that future studies exploring MyLab courseware usage data would be enhanced by collection of class implementation details about (1) the specific MyLab courseware activities that instructors assign, (2) their methods of integrating the courseware scores into class grading systems, and (3) the assumptions that both students and instructors make about how to use the MyLab courseware to support

learning. Of particular interest is building an understanding of how instructors guide students to engage in the courseware activities, specifically homework, quizzes, and tests, and, in the case of MyLab Math, understanding the trade-offs of replacing engagement in these three activities with a Study Plan that emphasizes practice activities and QuizMe quizzes. Future studies also should include usage data from all features of the system, including practice activities in the Study Plan.

This study provides some further information on how to control for variations in students' baseline knowledge. In a past internal MyLab Math study (Pearson Education, 2016), analysts used prior term grades as a baseline knowledge measure with a subset of students. Using this as the prior achievement variable, Pearson analysts found that the number of courseware learning objectives mastered failed to predict passing a course. However, when the current study used college entrance examination scores as a baseline knowledge measure (as part of the ALMAP study), it found that the number of learning objectives mastered in MyLab Math courseware not only predicted passing the course, but also predicted higher course grades. These contrasting results raise questions about analyses that use prior GPA as opposed to standardized test scores as proxies for prior achievement. A third option for establishing prior knowledge used in the ALMAP study was found to be most precise: using an assessment of prior knowledge of the academic content relevant to a particular course. The study also provides some support for the theory advanced in the prior internal MyLab Math report that homework and quizzes can help students master the course material.
In the current study of MyLab Math, higher attempts (i.e., more practice) with quiz items (and test items) significantly predicted higher course grades and passing the course.

For MyLab Foundational Skills, this study indicated a negative relation between learning objectives attempted and both course completion and grades. Further, there was a wide gap between the number of learning objectives attempted by the Rio Salado students and those they mastered within the courseware. We also should note that Rio Salado students were receiving Study Plan guidance from the outside vendor's adaptive algorithm, but how and when they were using those recommendations was not interpretable from the algorithm data available for this analysis. Without more algorithm usage data and classroom implementation data — such as how instructors incorporated courseware scores toward course grades, or how students were responding to recommendations to pursue specific learning objectives — it is difficult to interpret these findings. It is unclear in the case of MyLab Foundational Skills whether students were exploring extra learning objectives out of curiosity or because they failed to understand how to navigate through the courseware and how to respond to the outside vendor's adaptive algorithm recommendations.

One high-level take-away from this study is that practicing with content until one gets individual homework problems, quiz items, and test items correct appears to lead to positive course outcomes. The study also raises questions about the value of alternative uses of the courseware, such as attempting a larger number of learning objectives than one intends to master (in the case of Rio Salado) or relying on the Study Plan's QuizMe quizzes without engaging in practice activities (in the case of ASU). These alternative practices did not appear to yield positive impacts on course performance.
However, we cannot say for certain whether either of these conclusions is accurate without usage data from the Study Plan's practice activities and more information about how, and whether, faculty members integrated the courseware scores for learning objectives mastered into their course grades. Also, the study could not determine to what extent these alternative practices occurred as students responded to recommendations from the outside vendor's adaptive algorithms, as the vendor's usage data was insufficient for interpretation.
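The record linking described above, which produced the analytical samples (e.g., reducing ASU from 2,475 to 1,570 students), is in essence an inner join of ALMAP outcome records with Pearson usage records on a student identifier, with unmatched records dropped. A minimal sketch, using hypothetical IDs and field names rather than the study's actual data:

```python
# Inner join of two record sets keyed by student ID; unmatched records
# on either side fall out of the analytical sample. Hypothetical data.
almap = {                      # student_id -> course outcome record
    "s01": {"grade": 3.0, "passed": True},
    "s02": {"grade": 1.0, "passed": False},
    "s03": {"grade": 4.0, "passed": True},
}
usage = {                      # student_id -> courseware usage record
    "s01": {"hw_attempts": 120, "quiz_score": 0.74},
    "s03": {"hw_attempts": 95, "quiz_score": 0.81},
    "s04": {"hw_attempts": 60, "quiz_score": 0.55},  # no outcome record
}

# Analytical sample: students present in both sources, fields merged.
matched = {sid: {**almap[sid], **usage[sid]} for sid in almap.keys() & usage.keys()}
# Attrition: outcome records with no matching usage record.
unmatched_outcomes = sorted(almap.keys() - usage.keys())
print(len(matched), unmatched_outcomes)
```

Counting the unmatched records on each side, as the last line does, is what lets a report quantify sample attrition from the linking step.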

Introduction

To support student success, U.S. institutions of higher education are increasingly using courseware technologies to help students study. Such technologies include electronic textbooks that feature automatically graded homework assignments, quizzes, and practice tests. Designers of interactive electronic textbooks intend to engage students in these activities to help them achieve content mastery. However, initial research indicates that both students and faculty members engage in different degrees of courseware usage and different methods of integrating courseware scores into class gradebooks, which leads to different impacts on student course outcomes.

Understanding the relationship between courseware activity usage and course outcomes has the potential to provide educators and courseware developers better evidence on how to optimize implementation of courseware technologies. However, few public reports have drawn on the expanding data trove from students who are using these courseware products at college campuses throughout the United States. To address this research gap, this technical report offers findings from an analysis of courseware usage data from two different Pearson products: MyLab Math and MyLab Foundational Skills. This report presents results from statistical models that used courseware usage and performance data to predict course grades and course completion in two institutions of higher education.

To develop this report, researchers from SRI International leveraged course outcome data collected from 2014 through 2015 from two college campuses that participated in the Adaptive Learning Market Acceleration Project (ALMAP), sponsored by the Bill & Melinda Gates Foundation. The ALMAP study focused on adaptive learning courseware — a specific kind of online product that uses computer algorithms to parse learning analytic data so as to guide students as they study.
The ALMAP study aggregated findings from adaptive courseware evaluations conducted by 14 higher education institutions. It provided an initial review of the relative efficacy of nine adaptive courseware products as they were integrated into 23 developmental and general education courses over two to three academic terms. ALMAP researchers gathered quasi-experimental evidence on course outcomes, cost data, and both instructors' and students' experiences of the courseware (Yarnall, Means, & Wetzel, 2017). However, one notable gap in the original ALMAP study was the lack of access to, and analysis of, the courseware-generated data on student product usage and performance.

In 2016, to deepen understanding of how its own products were used in the ALMAP courses, Pearson Education hired SRI International, the research institute that conducted the original ALMAP study, to examine how student usage of and performance in MyLab Math and MyLab Foundational Skills related to course outcomes.

Overview of foundational research

This section summarizes the education research that informed the design of each of the two products discussed in this report.

MyLab Foundational Skills is an instructional program based on providing students with a learner-centered environment that builds and supports developmental progression through the course. The version of MyLab Foundational Skills studied in this report featured personalized and adaptive learning paths provided through the services of an outside vendor.

The design of MyLab Foundational Skills is aligned with several areas of education research in the learning sciences — diverse, transdisciplinary fields that seek to understand how humans learn. Using

insights distilled from the learning sciences, a number of learning design principles have been developed that guide the creation of our products. MyLab Foundational Skills demonstrates a number of these learning design principles, as follows.

Adaptivity

Successful instruction must help students quickly establish a foundation of knowledge and skills, as well as provide opportunities and support for developing more advanced mastery levels. Adaptive learning technologies, such as MyLab Foundational Skills, are one promising approach that research has explored to address this. As students gain proficiency, the learning opportunities can transition from being highly scaffolded and knowledge focused, to more open ended and focused on conceptual understanding and adaptation of knowledge, following research on the "expertise-reversal effect" (Kalyuga, Ayres, Chandler, & Sweller, 2003). The adaptive functionality of MyLab Foundational Skills provides specific and immediate feedback, so students can build confidence and proficiency in their skills. Subsequent items are then selected based on students' performances on previous items.

Scaffolding and fading

Research has found that novices learn and process information in fundamentally different ways than those with more background knowledge (Chi, Feltovich, & Glaser, 1981). Specifically, novices require more support because they do not have a body of relevant knowledge an
