Developing Self-Assessment through Project-Based Learning

Well-developed self-assessment skills are not only a vital tool for improving student learning, but also a lifelong tool for growth beyond the school. When students practice peer assessment, they become more skilled at critiquing in general and, by extension, better at self-critique. I believe that project-based learning provides rich opportunities for developing students’ assessment skills not only in the process of refining their projects, but also in the presentations of learning that often take place at the culmination of a project, perhaps before an audience appropriate to the project’s focus that is able to give authentic qualitative feedback.

The Ontario Ministry of Education’s 2010 publication on assessment, Growing Success, states: “The primary purpose of assessment is to improve student learning” (6, 28). Yes! However, we as Christian teachers (indeed, hopefully all teachers) have an expansive vision that is not confined to our classrooms, but extends to students’ lives beyond the school environment, lives where we hope they will continue to flourish as producers, creators, and servants. For this reason, we should remind ourselves that self-assessment is a lifelong practice, whereas peer assessment serves to support it, and teacher assessment is only a temporary scaffolding to help develop these skills.

Compliments or Assessment?

I have a confession, and I feel free to make it only because I suspect it may also be true of many others: I want praise and encouragement, but I need assessment. And this desire for applause rather than constructive feedback is probably why I feel more comfortable giving praise and encouragement to my own children, my students, and my colleagues rather than meaningful, helpful assessment. The former helps us feel better; the latter helps us do better. Both are good, of course, and have their place, but I tend to neglect the latter, which often leaves the former sounding insincere and hollow.

As Ron Berger puts it,

We can’t first build the students’ self-esteem and then focus on their work. It is through their own work that their self-esteem will grow. I don’t believe self-esteem is built from compliments. Students who are struggling or producing lousy work know exactly how poor their performance is—compliments never seem genuine. All the self-esteem activities and praise in the world won’t make them feel like proud students until they do something they can value (65).

Modeling and Teaching Qualitative Assessment

Before continuing to the classroom, allow me to give a quick disclaimer: In speaking of my own classroom experiences, I do not in any way wish to hold myself or my own practice up as an exemplar. My hope is rather that we can learn from each other’s mistakes (of which I have an embarrassingly abundant supply to relate) as well as the successes.

I’ll start with a failure, then. In the third week of this semester, I set up my grade 10 class to do some peer assessment of their first drafts of a small project they were working on. Freshly inspired by reading Ron Berger’s An Ethic of Excellence, I had visions of my students excitedly doing redrafts until they left behind their old habits of mediocrity and produced a work of quality craftsmanship, something that would make them feel enormously proud and deeply satisfied. I thought I had laid out the expectations quite clearly, giving them only two types of feedback options (“I like . . .” and “I suggest . . .”), and also giving them the usual filter of “Be kind; be specific; be helpful.”

The results were terribly disappointing, with the bulk of the comments being kind, but no more specific or helpful than “I love it!” or “Great!” or the more vapid “Good work.” Mind you, there were also a few unkind, equally unspecific and unhelpful comments like “This sucks!” or “This is dumb.” Needless to say, the next drafts showed little improvement on the first, which is not surprising, since little direction for improvement had been given.

When I reflected on this experience, it occurred to me that my students were only doing what they had been taught by years of kind, if somewhat meaningless, feedback on their work. They were using the only assessment vocabulary they knew. It was up to me to give them a new assessment language.

As Anne Davies writes in summarizing research findings from the ’70s to the ’00s, “When evaluative feedback is decreased, and specific, descriptive feedback is increased, students learn more” (Reeves 33). Yet my (admittedly anecdotal) experience with my own classes leads me to believe that most of our students better understand numbers or percentage values ascribed to work and are themselves far more adept at assigning number values to a piece of work than giving specific, qualitative feedback. Perhaps our students are overly familiar with numeric evaluation and that’s why they find it easiest to mimic. I dare say a society raised on the pursuit of higher numbers rather than qualitative excellence is more likely to chase an illusory bubble of numerical growth over real qualitative growth, but let me not stray into the territory of those better qualified to diagnose global economic ills! Returning to education then, I as a teacher need both to model and teach meaningful assessment if I expect my students to assess each other and themselves effectively. How can I do that?

The “Gallery Protocol”

A few weeks after that first disappointing foray into peer assessment with my grade 10 class (which, incidentally, was followed up with some useful class discussion on helpful assessment and some scintillating instruction, I like to think, on how to give more useful feedback), we made a second attempt.

Early in the process of a unit-long project, I launched the class optimistically again into an exercise designed for students to give feedback on each other’s project plans. I directed the students into a “Gallery Protocol” (see the description of a “gallery walk” in McDonald et al., 36), which worked as follows: Students were given five headings under which they were to elaborate briefly on their project plans. Each student (or pair or trio in some cases) wrote their plan onto a large sheet of paper and then posted the sheet on the classroom wall. Once everyone’s plans were up, the students were all given a wad of sticky notes on which they were to write their feedback and then stick it onto the relevant larger sheet on the wall. The brief was similar to the earlier peer assessment exercise (“I like . . .” and “I suggest . . .” were the two types of feedback requested, and the same filter of “Be kind; be specific; be helpful,” was applied), but they were directed to give this feedback only on the material under the last two headings on each paper. These headings were: “Highlights” and “Creative Ideas.”

The quality of the feedback was markedly different from their first experience, so much so that several students remained after the class ended to continue poring over the useful feedback they had received. Some expressed excitement about the ideas they had been given by their peers on the little sticky notes.

What made this experience different from the first? I would venture at least three reasons:

  1. The students had already had some practice at this (even if only once), and were getting better.
  2. We had taken time in class to reflect on the previous assessment experience.
  3. Instead of only asking them to be specific in their comments this time, I had also been explicit about what we were looking for feedback on; I had narrowed the focus. For the first experience, the focus was too general. I had made the mistake of asking them to respond to the whole work rather than a specified aspect or feature of it.

What did I learn from this?

First, all of us (teachers and students) need practice in order to get better at giving good, helpful, meaningful, descriptive feedback. This can happen regularly in small, low-stakes contexts rather than infrequently in high-stakes, high-stress assessment experiences.

Second, we need to set aside and direct time for reflection. We should reflect not only on our work, but also on our assessment and on our reflection itself.

And third, we should not leave too general a target if we are asking for specific feedback. Focusing the attention of the feedback, or narrowing its parameters, helps students to be more specific in their assessment. In this case, the “gallery protocol” was a useful tool for sharpening the focus. (See McDonald et al., The Power of Protocols, for many other protocol ideas.) In our school, protocols have been a powerful self- and peer-assessment tool for us as teachers, and it is both fun and exciting now to see how these can be adapted for use in the classroom to promote student reflection.

From Critical to Critique Thinking

The high school where I serve has explicitly articulated reflection as one of the habits we wish to cultivate in our graduates, and this is compatible with prioritizing self-assessment. Reflection does not mean mere introspection, nor does it refer solely to assessment of one’s own work, though it does include these. Reflection expands to imply metacognition or critical thinking, as well as critique thinking, which I shall explain shortly.

Self-assessment as critical thinking means reflecting not only on what one has learned, but also on how one learns. In project-based learning, the goal is that presentations of learning not only include content, but also (or especially) highlight reflections on one’s learning. In other words, students should be asking themselves and answering at least the following four questions:

  1. What did I learn about this topic, this issue, this person, this problem, etc.?
  2. What did I learn from it/them?
  3. What did I learn about myself, my learning, etc.?
  4. Why is this issue/project important?

Critique thinking is a term I have coined here to describe the kind of reflection needed to sharpen our assessment skills. Critique thinking is absent in vague and vacuous feedback on a student’s work. Critique thinking is present when descriptive, qualitative feedback is clear, specific, and focused on what can be done to improve, refine, or even completely revamp student products. Critique thinking can identify in detail what is excellent in a piece of work, and also what can be improved and how. Students who have developed good critiquing skills can thus assess themselves and understand how to improve, and therefore they do improve.

Presentations of Learning

The processes discussed above fall largely into the categories of assessment for learning and assessment as learning (see Growing Success, chapter 4), but project-based learning also provides opportunity for meaningful assessment of learning, or evaluation (see Growing Success, chapter 5).

Our school continues to toy with venues and audiences for our students to present the products of their projects. The idea is that if student work is produced for the teacher alone, it has little life and meaning for the student beyond the grade the teacher gives. Thus, an important part of project-based learning becomes finding an authentic audience for student products or presentations. When the connection is too contrived, the result is low student investment in the quality of their final products, but when a project has meaning and significance for the student beyond the classroom, the teacher-given grade becomes almost incidental and certainly not the primary motivator for student excellence.

When high school students produce a booklet for grade 1 students, the grade 1 class becomes the authentic audience, and their reaction to the product (their excitement and enjoyment, or boredom and disinterest) is more authentic assessment (see Chen 79) for the students than the number their teacher will put on their work. In my experience, this leads to more reflection on what worked well and what didn’t in student products.

For a community garden project, the most meaningful evaluation for the students as a collective may simply be the yield—how much food they were able to harvest and bring to a local food bank. Along the way, of course, experts, parents, peers, and others may have provided the specific and descriptive feedback that helped them reach their optimum product in terms of yield.

For a project planning and designing a green walk around the school grounds, the presentation of a proposal to the board of directors might be the authentic audience, and approval to go ahead the most significant evaluation. An authentic assessment will likely continue well beyond the life of the project in the classroom, as students see their projects coming to fruition in the years ahead.

Other classes, school assemblies, parents, a local specialist or panel of experts (engineers, environmental consultants, financial investors, builders, artists, landscapers, musicians, or practitioners in whatever field is relevant to the project), local municipalities, blogs and online forums, YouTube, newspapers, or other public communications: any of these can provide authentic audiences for student presentations of learning.

As we educators broaden the spectrum of assessment opportunities for our students beyond the teacher’s red pen, and as we model and lead them in reflection following these experiences, we are setting our students up to be lifelong learners, growers, critical thinkers, creators, producers, and servants. In short, we are preparing them to flourish.

Works Cited

  • Berger, Ron. An Ethic of Excellence: Building a Culture of Craftsmanship with Students. Portsmouth, NH: Heinemann, 2003.
  • Chen, Milton. Education Nation: Six Leading Edges of Innovation in Our Schools. San Francisco, CA: Jossey-Bass, 2010.
  • McDonald, Joseph P., Nancy Mohr, Alan Dichter, and Elizabeth C. McDonald. The Power of Protocols: An Educator’s Guide to Better Practice (2nd Edition). New York: Teachers College Press, 2007.
  • Ontario Ministry of Education. Growing Success: Assessment, Evaluation, and Reporting in Ontario Schools (1st Edition). Toronto: Ministry of Education, 2010.
  • Reeves, Douglas, ed. Ahead of the Curve: The Power of Assessment to Transform Teaching and Learning. Bloomington, IN: Solution Tree, 2007.