
DOCUMENT RESUME

ED 390 919                                                          TM 024 326

AUTHOR:       Gearhart, Maryl; Herman, Joan L.
TITLE:        Portfolio Assessment: Whose Work Is It? Issues in the Use of Classroom Assignments for Accountability. Evaluation Comment.
INSTITUTION:  California Univ., Los Angeles. Center for the Study of Evaluation.; National Center for Research on Evaluation, Standards, and Student Testing, Los Angeles, CA.
SPONS AGENCY: Office of Educational Research and Improvement (ED), Washington, DC.
PUB DATE:     95
PUB TYPE:     (142)
EDRS PRICE:   MF01/PC02 Plus Postage.
DESCRIPTORS:  *Accountability; Cooperation; Educational Assessment; Educational Change; Inferences; *Performance Based Assessment; *Portfolio Assessment; Portfolios (Background Materials); State Programs; *Student Evaluation; Teacher Student Relationship; Teaching Methods; *Testing Programs; Test Use; *Validity
IDENTIFIERS:  *Large Scale Programs; Feasibility

ABSTRACT
This article examines one of the challenges to the integration of classroom and large-scale portfolio assessment, a challenge posed by the use of a student's classroom portfolio for large-scale assessment of his or her individual competencies. When work is composed with the support of peers, teachers, and parents, whose work is being judged? The validity of inferences drawn from the assessment can be compromised unless the question can be answered. Experience with portfolio assessment in Vermont and in another study conducted by the Center for Research on Evaluation, Standards, and Student Testing (California) suggests that the quality of student work reflects not only a student's competence but also the amount and quality of support received from others. Procedures that highlight a student's contribution to the work must be developed, complicated though this will be. Nevertheless, large-scale portfolio assessment programs appear to carry significant benefits for instructional reform. (Contains 5 tables and 73 references.)

PORTFOLIO ASSESSMENT: WHOSE WORK IS IT?
ISSUES IN THE USE OF CLASSROOM ASSIGNMENTS FOR ACCOUNTABILITY

Maryl Gearhart & Joan L. Herman
Center for the Study of Evaluation
National Center for Research on Evaluation, Standards, & Student Testing
University of California, Los Angeles

EVALUATION COMMENT                                                  Winter 1995

Portfolio Assessment: Whose Work Is It?
Issues in the Use of Classroom Assignments for Accountability

Maryl Gearhart and Joan L. Herman
Center for the Study of Evaluation
National Center for Research on Evaluation, Standards, and Student Testing
University of California, Los Angeles

To many engaged in educational reform, portfolio assessment captures a vision of assessment integrated with instruction. Concerned about the equity and validity of large-scale assessment, portfolio advocates argue that students' classroom work and their reflections on that work provide a richer and truer picture of students' competencies than do traditional or other on-demand assessments. Concerned about the impact of testing on teaching, advocates point out that, as displays of the products of instruction, portfolios challenge teachers and students to focus on meaningful outcomes. Furthermore, portfolio assessment practices support the assessment of long-term projects over time, encourage student-initiated revision, and provide a context for presentation, guidance, and critique. Given such an ambitious agenda for assessment, instruction, and accountability, it is no surprise that what is meant by "portfolio" or "portfolio assessment" varies markedly in practice and purpose.2

Shared by most large-scale assessment projects, however, is a commitment to bridge the worlds of public accountability and classroom practice. The goal is to give students, teachers, and policy makers authentic roles in the assessment of students at all levels of an accountability system and to provide data that are appropriate and useful at each level. The portfolio spans one level of decision making to the next, providing detailed evidence at the classroom level of the process and outcomes of student performance to guide instruction and learning, and then supporting more abridged inferences at the large-scale level about the quality of performance and schooling. Integrated with instruction and targeted on high standards for student performance, the portfolio is the bridge that supports reform of classroom practices on the one side and accountability on the other. The vision is enticing, but will it work? Can classroom work be utilized for large-scale, high-stakes assessment?

In this article, we examine one of the challenges to the integration of classroom and large-scale portfolio assessment, a challenge posed by the use of a student's classroom portfolio for large-scale assessment of his or her individual competencies.3 When raters working outside the classroom context4 are asked to make judgments about an individual student based on a portfolio of work composed with the support of peers, teachers, and parents, whose work is being judged? We argue that certain answers to this question could threaten the validity of inferences that can be drawn about individual performance from portfolios constructed in the social complexities of classroom life. Thus the investigation of possible answers to the "whose work" question becomes an essential component of the study of the validity of portfolio assessment.

Concerns regarding the validity of individual student scores are already emerging in the fledgling technical literature on portfolio assessment. Consider, for example, findings from two CRESST efforts to provide evidence of validity. In both of these studies, patterns of relationships among on-demand assessments and portfolio assessments raise questions about the validity of test scores.

Koretz and his RAND colleagues have been evaluating Vermont's statewide portfolio assessment program since 1990.5 The Vermont program targets writing and mathematics at Grades 4 and 8 and includes three components for each subject area: year-long student portfolios, "best pieces" drawn from the portfolios, and state-sponsored "uniform" tests which are standardized but not necessarily multiple-choice. Patterns of relationships between the results of portfolio assessment and uniform tests in both subjects were problematic (Koretz, Klein, McCaffrey, & Stecher, 1993). While recognizing that portfolios and standard assessments may well emphasize different aspects of a subject domain, the researchers expected correlations between the two types of assessments within a subject to be stronger than those across subject areas. Instead, they found essentially the same level of correlation within and across subject areas: For example, writing portfolio scores correlated moderately with the standard measure of writing and with the portfolio and standard measures of mathematics.

Gearhart and Herman have conducted two technical studies of the ratability of classroom writing portfolios. In an initial study, the researchers found no relationship between scores for writing portfolios and for standard writing assessments: Two-thirds of the students classified as competent based on the portfolio score were not so classified on the basis of the standard assessment. Similarly, there was only a weak relationship between contrasting procedures for portfolio scoring: Half the students classified as competent on the basis of the single portfolio score were not so classified when scores for individual pieces were averaged, though correlations between the two kinds of portfolio scores were moderately high (in the .6 range) (Gearhart, Herman, Baker, & Whittaker, 1992, 1993; Herman, Gearhart, & Baker, 1993). In a subsequent comparative study of two writing rubrics, the researchers found a positive relationship between portfolio scores and standard writing assessments for only one of the rubrics (Gearhart, Novak, & Herman, in press).

Granted, these researchers were hampered in their quest for validation by the paucity of technically sound, performance-based criterion measures to which portfolio scores could be compared. Nevertheless, within the constraints set by the current state of the art in performance assessment, findings like those we have illustrated do raise questions about the validity of portfolio scores as measures of individual performance. What factors may have contributed to these weak relationships between portfolio scores and on-demand assessments? No doubt there are many, and each will require further investigation.

Consider just two that focus on measurement design: The portfolio and on-demand assessments may have tapped different domains of performance within a subject area, and the on-demand and portfolio tasks may have differed in difficulty.6 The factor that we consider in this paper arises from the classroom context of portfolio assessment. As summed up by a Vermont teacher after rating portfolios for several days: "Whose work is this anyway?"

We begin our discussion by examining the ways in which the nature of classroom work may undermine the validity of "individual" portfolio scores. We illustrate with CRESST data from both an evaluation of a statewide assessment program and a laboratory study of the scorability of elementary writing portfolios. We conclude with a discussion of the implications of the "whose work" issue for portfolio assessment policy and practice.

Whose Work Is It? An Issue for the Validity of Large-Scale Portfolio Assessment

The "whose work is it?" question arises because individual student portfolios are constructed in a social context. Portfolios contain the products of classroom instruction, and good classroom instruction according to current pedagogical and curriculum reforms involves an engaged community of practitioners in a supportive learning process (Camp, 1993; Duschl & Gitomer, 1991; Wolf, D. P., 1989; Wolf, Bixby, Glenn, & Gardner, 1991; Wolf & Gearhart, 1993a, 1993b). Exemplary instructional practice, in short, supports student performance. Central to the National Writing Project, for example, is a core instructional model which features multiple stages: prewriting, precomposing, writing, sharing, revising, editing, and evaluation. Each of these stages stands for instructional activities that engage a student with resources and with others: related readings, classroom discussions, field trips, idea webs, small group collaboration, outlining, peer review, review and feedback. The socially contexted character of student writing is seen both as a scaffold for students' writing process and a replication of what "real" writing entails, in that writing is often a very social endeavor. Consider as well what is regarded as exemplary portfolio assessment practice. A "portfolio culture" is viewed as "replacing ... the entire envelope of assessment with extended, iterative processes, agreeing that we are

interested in what students produce when they are given access to models, criticism, and the option to revise" (Wolf, D. P., 1993, p. 221). Assessment opportunities are available at multiple classroom moments: in the course of the work that may be added to a portfolio, in the construction of the portfolio, and in a presentation of the portfolio, making collaboration, assessment, and revision continual processes within the classroom.

These visions of an engaged community of learners and reviewers have implications for the validity of classroom portfolios for large-scale assessment purposes: The more developed the community, the more engaged others will be in the work tagged with an individual student's name. While the locus of authorship may shift outward from the individual student to the community of writers, the shift is unlikely to be systematic: Others' contributions to students' work are likely to vary across assignments, students, and classrooms. An irony emerges that when the student's work is more her own, that work may index practices and curriculum that lack certain key features of current reforms.

How is a rater unfamiliar with a student or the classroom context to assign an individual student a score for a portfolio collection that includes assisted or collaborative work? Research by Webb (1993) suggests that an individual's performance in the context of group activity may or may not represent his or her capability. Her finding, for example, that low-ability students had higher scores on the basis of group work than on individual work suggests that a rater's score for a portfolio may overestimate student performance because it constitutes a rating of efforts that were assisted. Alternatively, the rater who is aware that work is assisted may adjust downward the individual's score, again biasing the rating.

Whose Work Is It? Data From CRESST Studies

While questions regarding the roles of authorship and assisted performance in large-scale portfolio assessment have been raised (Condon & Hamp-Lyons, 1991; Gitomer, personal communication, September, 1994; Herman et al., 1993; Koretz, McCaffrey, Klein, Bell, & Stecher, 1993; Koretz, Stecher, & Deibert, 1992; Koretz, personal communication, September, 1994; Stecher & Hamilton, 1994), they have been neither directly investigated nor widely discussed. As we discuss next, however, preliminary results from the CRESST Vermont studies (Koretz, Stecher, Klein, & McCaffrey, in press) and the laboratory-based studies of portfolio ratability (Gearhart et al., 1992; Gearhart, Herman, Novak, Wolf, & Abedi, 1994; Herman et al., 1993) add some empirical basis for concern. These studies suggest substantial variability in instructional support for students' work, variability which may well compromise the meaning and comparability of scores, within as well as between classrooms and schools.

Vermont

While the RAND evaluation addresses three broad issues (the actual implementation of the program in schools and classrooms, the program's diverse effects, and the quality of the information yielded by the assessment), of interest here are results from a survey distributed to all fourth- and eighth-grade math teachers during the second year (1992-93) of Vermont's statewide implementation.7 Results are based on the responses of approximately 52% of the mathematics teachers at Grade 4 (N = 382) and 41% at Grade 8 (N = 137) (p. 6).

Teachers' responses to a number of questions indicated substantial variation in how mathematics portfolios were implemented across classrooms, and consequently substantial variation in how much help and support students received in putting their "best face forward" for the portfolio assessment. Teachers' reported policies on revising best pieces are a first case in point: Although more teachers encouraged revision of most best pieces (57%), many teachers departed from this pattern by either requiring revision (19%), simply permitting it (19%), or generally prohibiting it (5%). Similarly, the amount of time students spent revising varied widely. The average time in revision was 30-40 minutes, but in 17% of classrooms students did not revise at all, and in another 15% of classrooms students took more than one full class period to revise a best piece. Provision of time and support for revision clearly represents an aid to performance, and thus students who are not encouraged to revise their best pieces may well be at a disadvantage relative to those students who are provided greater opportunities to revise.

There also was considerable variation in teachers' policies regarding who was permitted to assist students in revising their best pieces (Table 1). One in four teachers did not report assisting their own students in revisions, and a similar proportion did not report permitting students to help each other. Seventy percent of fourth-grade teachers and 39% of eighth-grade teachers forbade parental or other outside assistance. Further complicating these findings regarding classroom variation, roughly 10% of teachers reported that their policies regarding assistance varied for different students within their classrooms. Teachers' policies also differed with respect to acknowledgment of outside help. Only about 20% of teachers required their students to acknowledge or describe the assistance they received, and, therefore, the raters of most students' portfolios would not know who contributed to the entries or the nature of their assistance.

[Table 1. Assistance Allowed by Teachers on Best Pieces (Percentage of Teachers). Columns: assistance allowed on no, some, most, or all best pieces, or rules differ for individual students; rows: source of assistance (teacher; parents or others outside school) at Grades 4 and 8. Grade-level difference significant at the 5% level (p < .05). Cell values are not recoverable from this scan.]

Finally, the Vermont teachers reported substantially different degrees of influence on students' choices of "best pieces" for their portfolios (Table 2): Some teachers reported playing an equal role with their students in making portfolio selections, while others reported no role at all. Certainly the type and quality of the work that becomes part of a student's portfolio can be influenced by who selects the pieces for inclusion. In particular, since teachers presumably have a better understanding of the scoring criteria than do students, the portfolios of students whose choices were assisted by their teachers may be more likely to show students' capabilities.

Table 2
Who Selects Best Pieces? (Percentage of Teachers)

Who selects best pieces?                 Grade 4   Grade 8
Students on their own                       21        30
Students with limited teacher input         55        57
Students and teachers have equal role       18         8
Teacher with limited student input           5         3
Teacher                                      1         1

Note. Grade-level difference significant at the 5% level (p < .05).

Thus the RAND/CRESST study found sizable variations among classrooms in factors such as the amount of revision that was permitted and the extent to which teachers limited assistance from others. These implementation findings may help to explain the weak patterns of relationships between portfolio scores and on-demand assessments described earlier. If some teachers provide (directly or through other adults or students) more help than others, comparisons among the portfolio scores of their students would be clouded by the contributions that others make to a given student's portfolio. Because such factors enter only into portfolio scores and not into scores on a standardized, on-demand assessment, they would tend to weaken the relationships between portfolio scores and scores from on-demand assessments.
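The attenuation argument above can be illustrated with a small simulation. This is a hypothetical sketch with made-up score distributions, not CRESST or Vermont data; `pearson_r` and all variable names are helpers invented here for illustration. The idea: if variable, unrecorded help enters portfolio scores but not on-demand scores, the correlation between the two measures drops even though both still reflect the same underlying competence.

```python
# Hypothetical sketch (not CRESST data) of the attenuation argument:
# variable help from teachers, peers, or parents adds variance to the
# portfolio score that is unrelated to individual competence, weakening
# the portfolio / on-demand correlation.
import math
import random
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

random.seed(7)
n = 500
competence = [random.gauss(0, 1) for _ in range(n)]

# On-demand score: competence plus ordinary measurement error.
on_demand = [c + random.gauss(0, 0.5) for c in competence]

# Portfolio score with no outside help: same structure as the on-demand test.
portfolio_solo = [c + random.gauss(0, 0.5) for c in competence]

# Portfolio score with help that varies by classroom and student:
# the amount of (unobserved) support ranges from none to substantial.
portfolio_helped = [c + random.gauss(0, 0.5) + random.uniform(0, 3)
                    for c in competence]

r_solo = pearson_r(portfolio_solo, on_demand)
r_helped = pearson_r(portfolio_helped, on_demand)
print(f"r without help: {r_solo:.2f}, r with variable help: {r_helped:.2f}")
```

One design choice worth noting: here the help is drawn independently of competence, whereas the Gearhart et al. finding discussed later suggests support was *greater* for lower-ability students, which would distort individual scores further still.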

CRESST Laboratory Studies of the Scorability of Writing Portfolios

To document the contributions of others to the writing contained within students' writing portfolios, Gearhart et al. (1993) asked teachers to rate the level of their instructional support for their writing assignments. Data were collected in the spring of 1991 from nine teachers spanning Grades 1 to 6. Each teacher was asked to designate two students at each of three levels of writing competency (high, medium, and low), to collect complete portfolios of all of their work, and for each writing assignment to document the instructional support provided during the composing and editing phases.

Ratings were keyed to the same dimensions used at that time to assess students' writing progress (Baker, Gearhart, & Herman, 1991): Content/Organization (topic/subtopics or theme, and their structure and development); Style (elements of text like descriptive language, word choice, sentence choice, tone, mood, voice, and audience); and Mechanics (spelling, grammar, punctuation, and other conventions). The scale points were defined along a continuum from 0 (no support) to 3 (teacher has specified the requirement in detail). Teachers also were asked to rate each assignment in terms of Copied Work (the extent to which the student's work appeared to be copied from peers or from direct modeling by a teacher or parent) and to estimate the time the child spent on the assignment in hours or fractional parts of hours.

The dataset consisted of spring 1991 ratings of 228 assignments from a total of 54 students. The number of assignments per student ranged from 1 to 21, with a modal number of 3. (One teacher returned 14-21 assignments per target student, compared with 1-5 for the remaining eight teachers.) Across all assignments, teachers reported providing generally low to moderate levels of support to their target students, but their reported support differed substantially among students' competency levels: Teachers were far more likely to report providing higher levels of support to their "low" students than to their more able students (Table 3), a finding that raises concerns about the differential meaning of scores that may be assigned to students' portfolios.

[Table 3. Percentage of Teachers Reporting Greater Support, by Writing Dimension and Student Ability Level. A "greater" level of support was defined as ratings of 2 or 3, where 2 indicated some guidelines and feedback, and 3 represented detailed guidelines and feedback. Cell values are not recoverable from this scan.]

The patterns of teachers' reported support differed across the three writing dimensions, reflecting, it seems, variations in curriculum. In Table 4, we see that teachers' experience with portfolio assessment was related to their patterns of instructional support. The three teachers who had been using portfolios in their classrooms for over a year tended to report providing higher levels of support than did the six teachers who had just begun experimenting with portfolio assessment, and we believe that the more experienced teachers' engagement with a writing process approach contributed to their greater involvement with students' assignments (and/or to their greater perceptions of involvement). Table 5 hints at the ways that teachers' reported levels of support may be related to grade level as well as portfolio experience. While these data are purely an illustration from a very small dataset, we see here two second-grade teachers providing quite different levels of assistance with style vs. mechanics, and two fifth-grade teachers differing more in levels of support for style. Furthermore, the second- and fifth-grade teachers with a year of portfolio experience reported emphases on different writing dimensions: the second-grade teacher more concerned with mechanics, the fifth-grade teacher more concerned with style.

[Table 4. Comparison of Teachers With Little vs. One Year of Experience With Portfolios: Percentage of Assignments Given Greater Support, by Writing Process Dimension. Across the three dimensions, teachers with little experience (n = 6) reported greater support on 54%, 41%, and 36% of assignments, respectively; teachers with one year of experience (n = 3), on 92%, 82%, and 74%. A "greater" level of support was defined as ratings of 2 or 3, where 2 indicated some guidelines and feedback, and 3 represented detailed guidelines and feedback.]

Thus teachers in the Gearhart et al. (1993) study reported variations in instructional practices that were likely to have impacted differentially the quality of student work in the portfolios. As in Vermont, these implementation findings may help to explain the weak relationships between students' portfolio scores and their standard writing assessments.

Reflections and Recommendations

Teacher self-report data from two CRESST studies have produced evidence of variation in how portfolio work is produced and supported. We acknowledge the flaws of the preliminary self-report data that we have presented and fully recognize that further research is needed: studies that employ larger sample sizes and multiple methodologies to verify the variety of support provided to students and the impact of such support on assessed performance. But if findings like these can be substantiated in more systematic research, they suggest that the quality of student work reflects not only a student's

competence but also the amount and quality of support received from others. Thus, whose work is the classroom work contained in a student's portfolio? From the preliminary evidence presented here, it seems it may depend on students' competence and a range of variable circumstances: teachers' methods of instruction, the nature of their assignments, peer and other resources available in the classroom, and home support.

[Table 5. Illustrative Comparison of Selected Teachers: Percentage of Assignments Given Greater Support, by Writing Process Dimension (Focus/organization, Style, Mechanics), for second- and fifth-grade teachers with little vs. one year of experience with portfolios. A "greater" level of support was defined as ratings of 2 or 3, where 2 indicated some guidelines and feedback, and 3 represented detailed guidelines and feedback. Cell values are not reliably recoverable from this scan.]

What meaning, then, can a large-scale portfolio assessment program ascribe to student work contained in portfolio collections? A professional, whether a writer, scientist, or educational researcher, who is accustomed to others' input may respond to this question with philosophical reflection or an identity crisis. Indeed, whose work is this article, for example? In what ways does it reflect the writing and research competencies of either of its authors? We value our own opportunities for collaborative work as much as we value the efforts to engage students in authentic communities in the classroom. But, from a measurement perspective, the validity of inferences about student competence based solely on portfolio work appears suspect. While this is not a grave concern for classroom assessment, where teachers and students can judge performances with knowledge of their context, the problem is troubling indeed for large-scale assessment purposes, where comparability of data is an issue. Under what circumstances, then, can portfolio assessments be used to rank or make serious decisions about students, teachers, schools, or districts?

The question requires attention to (a) the purposes of portfolio assessment, (b) the integrated design of portfolio contents, rubric contents, rating procedures, and uses of the results, and (c) a recognition of possible conflicts between the measurement and instructional aims of portfolio assessment. The apparent inverse relationship between support and students' ability level in the Gearhart et al. (1993) study is a telling example in this regard. Certainly if low-performing students are to achieve high standards, it is likely they will need an enriched instructional process to give them the capability for transferable performance: ample models, coaching and mentoring, and multiple opportunities for practice, feedback, and revision. But if the work that emerges from this same instructional process is used to assess students' individual performance, then there will be problems of comparability of scores across students. Can we bridge this apparent gulf between what is required to serve the purposes of classroom instruction and large-scale accountability?

While no easy solutions come to mind, it does appear that any valid assignment of an individual student score to a portfolio for large-scale purposes will require procedures to highlight the student's contribution to the work. Adjustments in either composition of the portfolio

- Restrictions could be imposed on work students produce for their portfolios, controlling who is permitted to provide assistance and under what circumstances. These procedures, largely rejected as violations of instructional freedom, would require certification that the controls on assistance were in place.

- Portfolios could be "seeded" with students' responses to a standard performance-based writing assessment; ratings of these entries might be used to adjust overall portfolio scores, or to raise "red flags" when scores for standard assessments are discrepant with other portfolio material. But this option would bring additional complications.

- Portfolio procedures could incorporate strategies for documenting others' assistance and input.

First, many performance-based assessments of writing and reading currently inc
